Jan 31 09:01:00 crc systemd[1]: Starting Kubernetes Kubelet...
Jan 31 09:01:00 crc restorecon[4692]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 31 09:01:00 crc restorecon[4692]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 31 09:01:00 crc restorecon[4692]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 31 09:01:00 crc restorecon[4692]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 31 09:01:00 crc restorecon[4692]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c377,c642 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 31 09:01:00 crc restorecon[4692]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 31 09:01:00 crc restorecon[4692]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c25 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 31 09:01:00 crc restorecon[4692]: 
/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 31 09:01:00 crc restorecon[4692]: 
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 09:01:00 crc restorecon[4692]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 09:01:00 crc restorecon[4692]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 09:01:00 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 31 09:01:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 
09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 09:01:01 crc 
restorecon[4692]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 09:01:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Jan 31 09:01:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 31 09:01:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c37,c572 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 31 09:01:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 
09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]:
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 09:01:01 crc restorecon[4692]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 31 09:01:01 crc restorecon[4692]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 31 09:01:01 crc restorecon[4692]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Jan 31 09:01:02 crc kubenswrapper[4732]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 31 09:01:02 crc kubenswrapper[4732]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Jan 31 09:01:02 crc kubenswrapper[4732]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 31 09:01:02 crc kubenswrapper[4732]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
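[annotation] The restorecon pass above has finished (its last action is relabeling /var/usrlocal/bin/kubenswrapper to kubelet_exec_t), and the kubelet wrapper has begun logging startup warnings. Every path restorecon left alone is reported with the same "not reset as customized by admin" message, differing only in the path and the SELinux context. A minimal sketch for summarizing that output — the script name is illustrative, and it assumes the journal text is piped in (for example from journalctl -u kubelet --no-pager, if the unit is named kubelet as the "Starting Kubernetes Kubelet" record suggests):

```python
# Tally the "not reset as customized by admin" restorecon records per pod
# directory and per SELinux context. Reads journal text on stdin, e.g.:
#   journalctl -u kubelet --no-pager | python3 tally_restorecon.py
# (script name illustrative; only the record shape quoted above is assumed)
import re
import sys
from collections import Counter

# Shape of the records above:
#   ... restorecon[4692]: <path> not reset as customized by admin to <context>
NOT_RESET = re.compile(
    r"restorecon\[\d+\]: (?P<path>/var/lib/kubelet/pods/(?P<pod>[^/]+)\S*) "
    r"not reset as customized by admin to (?P<ctx>\S+)"
)

per_pod: dict[str, Counter] = {}
# finditer over the whole text, because several records can share one
# physical line in a saved excerpt like this one.
for match in NOT_RESET.finditer(sys.stdin.read()):
    per_pod.setdefault(match["pod"], Counter())[match["ctx"]] += 1

for pod, contexts in sorted(per_pod.items()):
    for ctx, count in contexts.most_common():
        print(f"{pod}  {count:4d} paths  {ctx}")
```

Grouping by pod UID makes the pattern above visible at a glance: each pod's files sit under one MCS label pair (for example s0:c7,c13 for the two catalog pods), which is why the messages repeat so uniformly.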
Jan 31 09:01:02 crc kubenswrapper[4732]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Jan 31 09:01:02 crc kubenswrapper[4732]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.228531 4732 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.231934 4732 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.231955 4732 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.231961 4732 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.231967 4732 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.231973 4732 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.231978 4732 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.231983 4732 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.231988 4732 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.231992 4732 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.232011 4732 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.232016 4732 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.232021 4732 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.232026 4732 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.232032 4732 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
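[annotation] The "Flag ... has been deprecated" records above all share one shape, and most point at the same remedy: set the value in the config file named by the kubelet's --config flag (per the linked kubelet-config-file documentation). A minimal sketch — helper name hypothetical, only the message format quoted above assumed — that extracts the deprecated flag names from such records:

```python
# Pull deprecated kubelet flag names out of records shaped like:
#   "Flag --container-runtime-endpoint has been deprecated, ..."
import re

DEPRECATED = re.compile(r"Flag (--[\w-]+) has been deprecated")

def deprecated_flags(journal_text: str) -> list[str]:
    """Return deprecated flag names in first-seen order, without duplicates."""
    seen: list[str] = []
    for flag in DEPRECATED.findall(journal_text):
        if flag not in seen:
            seen.append(flag)
    return seen

# Example against two of the records quoted above:
sample = (
    "Jan 31 09:01:02 crc kubenswrapper[4732]: Flag --system-reserved has been "
    "deprecated, This parameter should be set via the config file specified by "
    "the Kubelet's --config flag. "
    "Jan 31 09:01:02 crc kubenswrapper[4732]: Flag --pod-infra-container-image "
    "has been deprecated, will be removed in a future release."
)
print(deprecated_flags(sample))  # ['--system-reserved', '--pod-infra-container-image']
```

The order-preserving dedupe keeps the report aligned with the order in which the kubelet printed the warnings, which helps when comparing boots.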
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.232039 4732 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.232044 4732 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.232049 4732 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.232054 4732 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.232059 4732 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.232065 4732 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.232070 4732 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.232074 4732 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.232079 4732 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.232085 4732 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.232090 4732 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.232095 4732 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.232099 4732 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.232104 4732 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.232109 4732 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.232114 4732 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.232119 4732 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.232124 4732 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.232128 4732 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.232133 4732 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.232138 4732 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.232143 4732 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.232148 4732 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.232154 4732 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.232159 4732 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.232163 4732 feature_gate.go:330] unrecognized feature gate: 
DNSNameResolver Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.232169 4732 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.232174 4732 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.232179 4732 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.232184 4732 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.232189 4732 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.232193 4732 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.232208 4732 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.232214 4732 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.232220 4732 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.232225 4732 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.232260 4732 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.232267 4732 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.232273 4732 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.232279 4732 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.232285 4732 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.232290 4732 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.232296 4732 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.232301 4732 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.232306 4732 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.232311 4732 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.232318 4732 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.232326 4732 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.232332 4732 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.232338 4732 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.232343 4732 feature_gate.go:330] unrecognized feature gate: Example Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.232348 4732 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.232354 4732 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.232359 4732 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.232364 4732 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.232372 4732 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.232377 4732 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.232482 4732 flags.go:64] FLAG: --address="0.0.0.0" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.232493 4732 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.232503 4732 flags.go:64] FLAG: --anonymous-auth="true" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.232511 4732 flags.go:64] FLAG: --application-metrics-count-limit="100" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.232518 4732 flags.go:64] FLAG: --authentication-token-webhook="false" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.232524 4732 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.232532 4732 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.232539 4732 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.232545 4732 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.232551 4732 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.232557 4732 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.232564 4732 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.232569 4732 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.232575 4732 flags.go:64] FLAG: --cgroup-root="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.232580 4732 flags.go:64] FLAG: --cgroups-per-qos="true" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.232586 4732 flags.go:64] FLAG: --client-ca-file="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.232592 4732 flags.go:64] FLAG: --cloud-config="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.232597 4732 flags.go:64] FLAG: --cloud-provider="" Jan 31 09:01:02 crc 
kubenswrapper[4732]: I0131 09:01:02.232603 4732 flags.go:64] FLAG: --cluster-dns="[]" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.232610 4732 flags.go:64] FLAG: --cluster-domain="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.232615 4732 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.232621 4732 flags.go:64] FLAG: --config-dir="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.232626 4732 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.232633 4732 flags.go:64] FLAG: --container-log-max-files="5" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.232640 4732 flags.go:64] FLAG: --container-log-max-size="10Mi" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.232646 4732 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.232651 4732 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.232657 4732 flags.go:64] FLAG: --containerd-namespace="k8s.io" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.232683 4732 flags.go:64] FLAG: --contention-profiling="false" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.232690 4732 flags.go:64] FLAG: --cpu-cfs-quota="true" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.232697 4732 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.232703 4732 flags.go:64] FLAG: --cpu-manager-policy="none" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.232709 4732 flags.go:64] FLAG: --cpu-manager-policy-options="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.232716 4732 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.232721 4732 flags.go:64] FLAG: --enable-controller-attach-detach="true" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.232727 4732 flags.go:64] FLAG: --enable-debugging-handlers="true" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.232732 4732 flags.go:64] FLAG: --enable-load-reader="false" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.232738 4732 flags.go:64] FLAG: --enable-server="true" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.232745 4732 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.232752 4732 flags.go:64] FLAG: --event-burst="100" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.232758 4732 flags.go:64] FLAG: --event-qps="50" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.232764 4732 flags.go:64] FLAG: --event-storage-age-limit="default=0" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.232769 4732 flags.go:64] FLAG: --event-storage-event-limit="default=0" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.232775 4732 flags.go:64] FLAG: --eviction-hard="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.232782 4732 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.232787 4732 flags.go:64] FLAG: --eviction-minimum-reclaim="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.232793 4732 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.232798 4732 flags.go:64] FLAG: --eviction-soft="" Jan 31 
09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.232804 4732 flags.go:64] FLAG: --eviction-soft-grace-period="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.232810 4732 flags.go:64] FLAG: --exit-on-lock-contention="false" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.232815 4732 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.232821 4732 flags.go:64] FLAG: --experimental-mounter-path="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.232827 4732 flags.go:64] FLAG: --fail-cgroupv1="false" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.232832 4732 flags.go:64] FLAG: --fail-swap-on="true" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.232838 4732 flags.go:64] FLAG: --feature-gates="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.232845 4732 flags.go:64] FLAG: --file-check-frequency="20s" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.232851 4732 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.232856 4732 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.232862 4732 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.232868 4732 flags.go:64] FLAG: --healthz-port="10248" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.232874 4732 flags.go:64] FLAG: --help="false" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.232880 4732 flags.go:64] FLAG: --hostname-override="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.232885 4732 flags.go:64] FLAG: --housekeeping-interval="10s" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.232891 4732 flags.go:64] FLAG: --http-check-frequency="20s" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.232897 4732 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.232903 4732 flags.go:64] FLAG: --image-credential-provider-config="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.232908 4732 flags.go:64] FLAG: --image-gc-high-threshold="85" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.232914 4732 flags.go:64] FLAG: --image-gc-low-threshold="80" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.232919 4732 flags.go:64] FLAG: --image-service-endpoint="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.232925 4732 flags.go:64] FLAG: --kernel-memcg-notification="false" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.232930 4732 flags.go:64] FLAG: --kube-api-burst="100" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.232936 4732 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.232942 4732 flags.go:64] FLAG: --kube-api-qps="50" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.232947 4732 flags.go:64] FLAG: --kube-reserved="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.232954 4732 flags.go:64] FLAG: --kube-reserved-cgroup="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.232959 4732 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.232965 4732 flags.go:64] FLAG: --kubelet-cgroups="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.232970 4732 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Jan 31 09:01:02 crc 
kubenswrapper[4732]: I0131 09:01:02.232975 4732 flags.go:64] FLAG: --lock-file="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.232981 4732 flags.go:64] FLAG: --log-cadvisor-usage="false" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.232987 4732 flags.go:64] FLAG: --log-flush-frequency="5s" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.232992 4732 flags.go:64] FLAG: --log-json-info-buffer-size="0" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.233003 4732 flags.go:64] FLAG: --log-json-split-stream="false" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.233009 4732 flags.go:64] FLAG: --log-text-info-buffer-size="0" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.233015 4732 flags.go:64] FLAG: --log-text-split-stream="false" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.233020 4732 flags.go:64] FLAG: --logging-format="text" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.233026 4732 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.233032 4732 flags.go:64] FLAG: --make-iptables-util-chains="true" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.233037 4732 flags.go:64] FLAG: --manifest-url="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.233043 4732 flags.go:64] FLAG: --manifest-url-header="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.233050 4732 flags.go:64] FLAG: --max-housekeeping-interval="15s" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.233056 4732 flags.go:64] FLAG: --max-open-files="1000000" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.233063 4732 flags.go:64] FLAG: --max-pods="110" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.233069 4732 flags.go:64] FLAG: --maximum-dead-containers="-1" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.233075 4732 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.233081 4732 flags.go:64] FLAG: --memory-manager-policy="None" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.233088 4732 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.233095 4732 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.233102 4732 flags.go:64] FLAG: --node-ip="192.168.126.11" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.233109 4732 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.233126 4732 flags.go:64] FLAG: --node-status-max-images="50" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.233133 4732 flags.go:64] FLAG: --node-status-update-frequency="10s" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.233140 4732 flags.go:64] FLAG: --oom-score-adj="-999" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.233147 4732 flags.go:64] FLAG: --pod-cidr="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.233154 4732 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.233165 4732 flags.go:64] FLAG: --pod-manifest-path="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.233172 4732 flags.go:64] FLAG: 
--pod-max-pids="-1" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.233179 4732 flags.go:64] FLAG: --pods-per-core="0" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.233185 4732 flags.go:64] FLAG: --port="10250" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.233192 4732 flags.go:64] FLAG: --protect-kernel-defaults="false" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.233200 4732 flags.go:64] FLAG: --provider-id="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.233206 4732 flags.go:64] FLAG: --qos-reserved="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.233211 4732 flags.go:64] FLAG: --read-only-port="10255" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.233217 4732 flags.go:64] FLAG: --register-node="true" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.233225 4732 flags.go:64] FLAG: --register-schedulable="true" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.233231 4732 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.233241 4732 flags.go:64] FLAG: --registry-burst="10" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.233246 4732 flags.go:64] FLAG: --registry-qps="5" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.233251 4732 flags.go:64] FLAG: --reserved-cpus="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.233257 4732 flags.go:64] FLAG: --reserved-memory="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.233264 4732 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.233270 4732 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.233276 4732 flags.go:64] FLAG: --rotate-certificates="false" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.233282 4732 flags.go:64] FLAG: --rotate-server-certificates="false" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.233287 4732 flags.go:64] FLAG: --runonce="false" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.233294 4732 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.233299 4732 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.233306 4732 flags.go:64] FLAG: --seccomp-default="false" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.233312 4732 flags.go:64] FLAG: --serialize-image-pulls="true" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.233317 4732 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.233323 4732 flags.go:64] FLAG: --storage-driver-db="cadvisor" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.233328 4732 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.233334 4732 flags.go:64] FLAG: --storage-driver-password="root" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.233339 4732 flags.go:64] FLAG: --storage-driver-secure="false" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.233345 4732 flags.go:64] FLAG: --storage-driver-table="stats" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.233350 4732 flags.go:64] FLAG: --storage-driver-user="root" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.233356 4732 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Jan 31 09:01:02 crc 
kubenswrapper[4732]: I0131 09:01:02.233362 4732 flags.go:64] FLAG: --sync-frequency="1m0s" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.233367 4732 flags.go:64] FLAG: --system-cgroups="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.233373 4732 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.233381 4732 flags.go:64] FLAG: --system-reserved-cgroup="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.233387 4732 flags.go:64] FLAG: --tls-cert-file="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.233392 4732 flags.go:64] FLAG: --tls-cipher-suites="[]" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.233398 4732 flags.go:64] FLAG: --tls-min-version="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.233404 4732 flags.go:64] FLAG: --tls-private-key-file="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.233409 4732 flags.go:64] FLAG: --topology-manager-policy="none" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.233418 4732 flags.go:64] FLAG: --topology-manager-policy-options="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.233423 4732 flags.go:64] FLAG: --topology-manager-scope="container" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.233429 4732 flags.go:64] FLAG: --v="2" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.233437 4732 flags.go:64] FLAG: --version="false" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.233444 4732 flags.go:64] FLAG: --vmodule="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.233452 4732 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.233457 4732 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.235762 4732 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.235774 4732 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.235780 4732 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.235787 4732 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
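The flags.go:64 entries above dump every kubelet flag with its effective value in the fixed form FLAG: --name="value". A minimal sketch for folding that dump back into a lookup table; the filename kubelet.log and the helper itself are illustrative assumptions, not anything the log names:

    import re

    # Illustrative helper: rebuild the kubelet's effective flag set from the
    # 'flags.go:64] FLAG: --name="value"' entries in this journal excerpt.
    # "kubelet.log" is an assumed save location for the excerpt.
    flag_re = re.compile(r'FLAG: (--[\w.-]+)="(.*)"$')

    flags = {}
    with open("kubelet.log") as f:
        for line in f:
            m = flag_re.search(line.strip())
            if m:
                flags[m.group(1)] = m.group(2)

    print(flags.get("--node-ip"))        # 192.168.126.11
    print(flags.get("--cgroup-driver"))  # cgroupfs (the CRI later reports systemd)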
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.235793 4732 feature_gate.go:330] unrecognized feature gate: SignatureStores
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.235799 4732 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.235809 4732 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.235814 4732 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.235819 4732 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.235824 4732 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.235829 4732 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.235834 4732 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.235839 4732 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.235843 4732 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.235848 4732 feature_gate.go:330] unrecognized feature gate: PinnedImages
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.235853 4732 feature_gate.go:330] unrecognized feature gate: NewOLM
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.235858 4732 feature_gate.go:330] unrecognized feature gate: OVNObservability
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.235863 4732 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.235869 4732 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.235876 4732 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.235881 4732 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.235887 4732 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.235893 4732 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.235899 4732 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.235904 4732 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.235912 4732 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.235917 4732 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.235922 4732 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.235927 4732 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.235933 4732 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.235938 4732 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.235944 4732 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.235949 4732 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.235953 4732 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.235958 4732 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.235963 4732 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.235968 4732 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.235974 4732 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.235982 4732 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.235988 4732 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.235994 4732 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.236001 4732 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.236008 4732 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.236014 4732 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.236021 4732 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.236027 4732 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.236034 4732 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.236041 4732 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.236047 4732 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.236053 4732 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.236059 4732 feature_gate.go:330] unrecognized feature gate: Example
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.236065 4732 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.236072 4732 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.236077 4732 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.236081 4732 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.236086 4732 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.236091 4732 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.236098 4732 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.236103 4732 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.236108 4732 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.236112 4732 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.236117 4732 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.236122 4732 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.236127 4732 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.236131 4732 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.236137 4732 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.236142 4732 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.236147 4732 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
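The unrecognized-gate warnings repeat once per pass over the gate configuration (the 09:01:02.2319xx, .2357xx, .2504xx and .2514xx runs in this excerpt carry the same names), so collapsing them to one count per gate makes triage easier. A sketch under the same assumption that the excerpt is saved as kubelet.log:

    import re
    from collections import Counter

    # Illustrative helper: count each "unrecognized feature gate" name once
    # across all of the repeated warning passes in the excerpt.
    gate_re = re.compile(r"unrecognized feature gate: (\S+)")

    with open("kubelet.log") as f:
        counts = Counter(m.group(1) for line in f for m in gate_re.finditer(line))

    for gate, n in sorted(counts.items()):
        print(f"{gate}: {n}x")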
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.236152 4732 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.236157 4732 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.236163 4732 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.236179 4732 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.250258 4732 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.250314 4732 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.250440 4732 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.250461 4732 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.250469 4732 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.250479 4732 feature_gate.go:330] unrecognized feature gate: NewOLM
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.250487 4732 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.250495 4732 feature_gate.go:330] unrecognized feature gate: SignatureStores
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.250503 4732 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.250510 4732 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.250518 4732 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.250526 4732 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.250534 4732 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.250541 4732 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.250549 4732 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.250556 4732 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.250567 4732 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.250577 4732 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.250586 4732 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.250595 4732 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.250603 4732 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.250613 4732 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.250620 4732 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.250629 4732 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.250639 4732 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.250648 4732 feature_gate.go:330] unrecognized feature gate: Example
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.250656 4732 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.250692 4732 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.250702 4732 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.250712 4732 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.250730 4732 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.250747 4732 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.250759 4732 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.250773 4732 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.250787 4732 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.250799 4732 feature_gate.go:330] unrecognized feature gate: PinnedImages
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.250810 4732 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.250821 4732 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.250832 4732 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.250844 4732 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.250853 4732 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.250863 4732 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.250874 4732 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.250884 4732 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.250894 4732 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.250905 4732 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.250915 4732 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.250925 4732 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.250935 4732 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.250945 4732 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.250955 4732 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.250965 4732 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.250975 4732 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.250986 4732 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.250999 4732 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.251014 4732 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.251025 4732 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.251035 4732 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.251045 4732 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.251056 4732 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.251066 4732 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.251076 4732 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.251086 4732 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.251097 4732 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.251107 4732 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.251115 4732 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.251123 4732 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.251131 4732 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.251139 4732 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.251149 4732 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.251159 4732 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.251167 4732 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.251176 4732 feature_gate.go:330] unrecognized feature gate: OVNObservability
Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.251190 4732 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.251446 4732 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.251461 4732 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.251469 4732 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.251478 4732 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.251488 4732 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.251496 4732 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.251504 4732 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.251512 4732 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.251519 4732 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.251531 4732 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.251538 4732 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.251548 4732 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.251555 4732 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.251563 4732 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.251572 4732 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.251579 4732 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.251587 4732 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.251595 4732 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.251602 4732 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.251610 4732 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.251618 4732 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.251626 4732 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.251633 4732 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.251643 4732 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.251654 4732 feature_gate.go:330] unrecognized feature gate: NewOLM
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.251695 4732 feature_gate.go:330] unrecognized feature gate: PinnedImages
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.251704 4732 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.251712 4732 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.251721 4732 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.251729 4732 feature_gate.go:330] unrecognized feature gate: Example
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.251737 4732 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.251745 4732 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.251753 4732 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.251761 4732 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.251768 4732 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.251777 4732 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.251785 4732 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.251793 4732 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.251800 4732 feature_gate.go:330] unrecognized feature gate: OVNObservability
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.251808 4732 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.251816 4732 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.251824 4732 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.251832 4732 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.251840 4732 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.251847 4732 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.251858 4732 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.251868 4732 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.251876 4732 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.251884 4732 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.251892 4732 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.251900 4732 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.251907 4732 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.251915 4732 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.251922 4732 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.251930 4732 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.251938 4732 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.251946 4732 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.251955 4732 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.251965 4732 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.251975 4732 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.251987 4732 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.251999 4732 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.252009 4732 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.252020 4732 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.252030 4732 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.252040 4732 feature_gate.go:330] unrecognized feature gate: SignatureStores
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.252052 4732 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.252066 4732 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.252075 4732 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.252084 4732 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.252092 4732 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.252105 4732 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.253544 4732 server.go:940] "Client rotation is on, will bootstrap in background"
Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.263063 4732 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary"
Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.264760 4732 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.267712 4732 server.go:997] "Starting client certificate rotation"
Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.267762 4732 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.268171 4732 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2026-01-16 08:04:47.393754687 +0000 UTC
Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.268403 4732 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.310387 4732 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.316412 4732 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Jan 31 09:01:02 crc kubenswrapper[4732]: E0131 09:01:02.317768 4732 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.129.56.231:6443: connect: connection refused" logger="UnhandledError"
Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.363820 4732 log.go:25] "Validated CRI v1 runtime API"
Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.426632 4732 log.go:25] "Validated CRI v1 image API"
Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.428762 4732 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.432828 4732 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-01-31-08-56-48-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.432858 4732 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}]
Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.453183 4732 manager.go:217] Machine: {Timestamp:2026-01-31 09:01:02.448422399 +0000 UTC m=+0.754298623 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654124544 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:5f9a5b0b-6336-4588-8df8-98fcbdc2a984 BootID:2761c117-4c0c-4c53-891e-fc7b8fbd4017 Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:e9:ea:54 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:e9:ea:54 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:a7:20:e5 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:82:7d:97 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:31:34:d3 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:55:c7:46 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:b2:a8:3c:9e:8a:1a Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:ca:78:77:bd:9f:31 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654124544 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10
Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.453477 4732 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
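The feature_gate.go:330 warnings above are emitted for OpenShift-specific gate names that the kubelet's vendored upstream gate registry does not recognize; those are skipped with a warning, and only the recognized gates land in the effective map logged at feature_gate.go:386. The manager.go:217 Machine record then shows the CRC VM shape: 12 vCPUs presented as 12 single-core sockets, 33654124544 bytes of memory (about 31.3 GiB), no swap capacity, and a single 214748364800-byte vda disk. A minimal sketch for checking both against a saved dump of this journal (the file name kubelet.log is an assumption):

    import re

    text = open("kubelet.log").read()

    # Effective feature-gate map, as logged at feature_gate.go:386.
    m = re.search(r"feature gates: \{map\[([^\]]*)\]\}", text)
    gates = dict(pair.split(":", 1) for pair in m.group(1).split())
    print(sorted(gates.items()))

    # Quick arithmetic on the Machine record (values copied from the log).
    print(f"memory ~ {33654124544 / 2**30:.2f} GiB")   # ~31.34 GiB
    print(f"vda    ~ {214748364800 / 2**30:.0f} GiB")  # 200 GiB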
Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.453652 4732 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.454122 4732 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.454304 4732 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.454347 4732 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.454570 4732 topology_manager.go:138] "Creating topology manager with none policy" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.454584 4732 container_manager_linux.go:303] "Creating device plugin manager" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.455590 4732 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.455625 4732 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.456394 4732 state_mem.go:36] "Initialized new in-memory state store" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.456482 4732 server.go:1245] "Using root directory" path="/var/lib/kubelet" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.460093 4732 kubelet.go:418] "Attempting to sync node with API server" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.460132 4732 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" 
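The container_manager_linux.go:272 entry above embeds the full node config as JSON: systemd cgroup driver on cgroup v2, SystemReserved of 200m CPU / 350Mi memory / 350Mi ephemeral-storage, a PodPidsLimit of 4096, and hard eviction thresholds (memory.available < 100Mi, nodefs.available < 10%, nodefs.inodesFree < 5%, imagefs.available < 15%, imagefs.inodesFree < 5%). Because the JSON sits inside a flattened journal line with more text after it, json.JSONDecoder.raw_decode is convenient, since it decodes the first JSON value and ignores the trailing entries. A sketch, assuming the same kubelet.log dump:

    import json

    line = next(l for l in open("kubelet.log") if "nodeConfig=" in l)
    # raw_decode tolerates trailing text after the JSON object.
    cfg, _ = json.JSONDecoder().raw_decode(line.split("nodeConfig=", 1)[1])
    print(json.dumps(cfg["HardEvictionThresholds"], indent=2))
    print(cfg["SystemReserved"])  # {'cpu': '200m', 'ephemeral-storage': '350Mi', 'memory': '350Mi'}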
Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.460170 4732 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.460187 4732 kubelet.go:324] "Adding apiserver pod source" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.460201 4732 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.466792 4732 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.129.56.231:6443: connect: connection refused Jan 31 09:01:02 crc kubenswrapper[4732]: E0131 09:01:02.466886 4732 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.129.56.231:6443: connect: connection refused" logger="UnhandledError" Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.467027 4732 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.231:6443: connect: connection refused Jan 31 09:01:02 crc kubenswrapper[4732]: E0131 09:01:02.467136 4732 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.231:6443: connect: connection refused" logger="UnhandledError" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.469111 4732 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.470072 4732 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
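Every reflector list/watch here, like the earlier CSR post, fails the same way: dial tcp 38.129.56.231:6443: connection refused against api-int.crc.testing. On a single-node CRC instance the kube-apiserver is itself a static pod managed by this same kubelet, so refused connections during the first seconds of startup are expected and clear once the control-plane pods come up; the reflectors and the certificate manager retry with backoff. The event.go:368 failure a few entries below carries the same refused dial, and its event name crc.188fc541379f1830 is the event's FirstTimestamp encoded as hex UnixNano (the upstream event recorder names events <object>.<hex nanoseconds>). A sketch to tally refused endpoints and decode that suffix (dump file name is an assumption):

    import re
    from collections import Counter
    from datetime import datetime, timezone

    # Tally "connection refused" dials per endpoint.
    hits = Counter()
    for line in open("kubelet.log"):
        hits.update(re.findall(r"dial tcp ([\d.]+:\d+): connect: connection refused", line))
    print(hits.most_common())  # expect all of them on 38.129.56.231:6443

    # Decode the hex-UnixNano suffix of the event name crc.188fc541379f1830.
    ns = int("188fc541379f1830", 16)
    print(datetime.fromtimestamp(ns / 1e9, tz=timezone.utc))  # 2026-01-31 09:01:02.476... UTC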
Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.472997 4732 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.474871 4732 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.474901 4732 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.474910 4732 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.474919 4732 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.474934 4732 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.474943 4732 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.474951 4732 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.474964 4732 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.474975 4732 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.474985 4732 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.475001 4732 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.475009 4732 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.475915 4732 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.476645 4732 server.go:1280] "Started kubelet" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.476792 4732 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.231:6443: connect: connection refused Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.478151 4732 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.478142 4732 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Jan 31 09:01:02 crc systemd[1]: Started Kubernetes Kubelet. 
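The plugins.go:603 lines above enumerate the thirteen in-tree volume plugins registered at startup (portworx-volume through csi), after which server.go:1280 reports "Started kubelet" and systemd confirms the unit as started; the csi_plugin.go:884 CSINode lookup fails with the same refused dial as the reflectors. A sketch to list the registered plugins from a dump (file name is an assumption):

    import re

    # Collect pluginName values from the plugins.go:603 registration lines.
    names = re.findall(r'Loaded volume plugin" pluginName="([^"]+)"',
                       open("kubelet.log").read())
    print(len(names), "plugins:", ", ".join(names))  # 13 on this node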
Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.478967 4732 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.479611 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.479678 4732 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.479716 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 10:29:14.647276519 +0000 UTC Jan 31 09:01:02 crc kubenswrapper[4732]: E0131 09:01:02.479865 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.480044 4732 volume_manager.go:287] "The desired_state_of_world populator starts" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.480088 4732 volume_manager.go:289] "Starting Kubelet Volume Manager" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.480130 4732 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.482058 4732 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.129.56.231:6443: connect: connection refused Jan 31 09:01:02 crc kubenswrapper[4732]: E0131 09:01:02.482153 4732 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.129.56.231:6443: connect: connection refused" logger="UnhandledError" Jan 31 09:01:02 crc kubenswrapper[4732]: E0131 09:01:02.482807 4732 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.231:6443: connect: connection refused" interval="200ms" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.483794 4732 factory.go:55] Registering systemd factory Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.483838 4732 factory.go:221] Registration of the systemd container factory successfully Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.490333 4732 factory.go:153] Registering CRI-O factory Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.490419 4732 factory.go:221] Registration of the crio container factory successfully Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.490544 4732 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.490579 4732 factory.go:103] Registering Raw factory Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.490602 4732 manager.go:1196] Started watching for new ooms in manager Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.491300 4732 manager.go:319] Starting recovery of all containers Jan 31 09:01:02 crc kubenswrapper[4732]: 
I0131 09:01:02.491767 4732 server.go:460] "Adding debug handlers to kubelet server" Jan 31 09:01:02 crc kubenswrapper[4732]: E0131 09:01:02.492922 4732 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.129.56.231:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.188fc541379f1830 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-31 09:01:02.47661368 +0000 UTC m=+0.782489884,LastTimestamp:2026-01-31 09:01:02.47661368 +0000 UTC m=+0.782489884,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.503819 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.503888 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.503909 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.503926 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.503942 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.503955 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.503969 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.503985 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: 
I0131 09:01:02.504000 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.504012 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.504059 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.504072 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.504086 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.504105 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.504121 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.504134 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.504147 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.504160 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.504173 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.504187 4732 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.504225 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.504241 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.504263 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.504276 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.504289 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.504302 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.504315 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.504331 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.504343 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.504354 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.504372 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.504419 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.504433 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.504443 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.504455 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.504467 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.504478 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.504488 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.504499 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.504510 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.504521 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.504532 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.504543 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.504554 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.504565 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.504576 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.504589 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.504600 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.504610 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.504621 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.504632 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.504744 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.504763 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.504773 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.504784 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.504793 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.504803 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.504812 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.504824 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.504865 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.504875 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.504885 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.504895 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.504907 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" 
volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.504920 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.504930 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.504940 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.504952 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.504962 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.504977 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.505002 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.505017 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.505030 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.505051 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.505063 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.505074 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.505084 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.505095 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.505105 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.505117 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.505128 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.505139 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.505150 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.505161 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.505176 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.505191 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.505203 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.505217 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.505230 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.505243 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.505254 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.505266 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.505278 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.505292 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.505307 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.505321 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.505343 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.505357 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.505369 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.505385 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.505397 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.505409 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.505421 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.505435 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.505455 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.505470 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.505483 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.505494 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.505505 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.505516 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.505526 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.505538 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.505550 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.505562 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.505572 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.505583 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.505594 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.505604 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.505615 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" 
volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.505624 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.505633 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.505644 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.505653 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.505697 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.505714 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.505725 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.505734 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.505745 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.505758 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.505769 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" 
volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.505779 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.505792 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.505815 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.505831 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.505845 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.510357 4732 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.511341 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.511368 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.511389 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.511403 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.511416 4732 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.511428 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.511446 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.511461 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.511476 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.511490 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.511501 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.511513 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.511690 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.511709 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.512025 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.512049 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.512063 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.512076 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.512089 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.512101 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.512114 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.512127 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.512140 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.512157 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.512171 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.512185 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.512201 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.512216 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.512230 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.512272 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.512290 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.512303 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.512316 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.512329 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.512341 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.512354 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.512365 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.513275 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.513310 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.513326 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.513342 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.513355 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.513370 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.513383 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.513395 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.513407 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.513419 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.513432 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.513447 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" 
volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.513460 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.513474 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.513487 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.513502 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.513515 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.513528 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.513544 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.513557 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.513569 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.513585 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.513598 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.513612 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.513626 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.513641 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.513654 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.513691 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.513705 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.513717 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.513730 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.513743 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.513754 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.513769 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" 
volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.513782 4732 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.513794 4732 reconstruct.go:97] "Volume reconstruction finished" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.513804 4732 reconciler.go:26] "Reconciler: start to sync state" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.514132 4732 manager.go:324] Recovery completed Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.524992 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.526770 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.526958 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.527041 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.527808 4732 cpu_manager.go:225] "Starting CPU manager" policy="none" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.527828 4732 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.527848 4732 state_mem.go:36] "Initialized new in-memory state store" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.539036 4732 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.541110 4732 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.541282 4732 status_manager.go:217] "Starting to sync pod status with apiserver" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.541362 4732 kubelet.go:2335] "Starting kubelet main sync loop" Jan 31 09:01:02 crc kubenswrapper[4732]: E0131 09:01:02.541554 4732 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 31 09:01:02 crc kubenswrapper[4732]: W0131 09:01:02.545922 4732 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.129.56.231:6443: connect: connection refused Jan 31 09:01:02 crc kubenswrapper[4732]: E0131 09:01:02.546009 4732 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.129.56.231:6443: connect: connection refused" logger="UnhandledError" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.572611 4732 policy_none.go:49] "None policy: Start" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.574684 4732 memory_manager.go:170] "Starting memorymanager" policy="None" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.574747 4732 state_mem.go:35] "Initializing new in-memory state store" Jan 31 09:01:02 crc kubenswrapper[4732]: E0131 09:01:02.580817 4732 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.633547 4732 manager.go:334] "Starting Device Plugin manager" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.633740 4732 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.633760 4732 server.go:79] "Starting device plugin registration server" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.634233 4732 eviction_manager.go:189] "Eviction manager: starting control loop" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.634254 4732 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.634476 4732 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.634550 4732 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.634557 4732 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 31 09:01:02 crc kubenswrapper[4732]: E0131 09:01:02.642268 4732 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.643466 4732 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc"] Jan 31 09:01:02 crc kubenswrapper[4732]: 
I0131 09:01:02.643743 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.645891 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.645933 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.645949 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.646174 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.646409 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.646484 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.647454 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.647486 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.647500 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.647588 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.647801 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.647874 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.648118 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.648162 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.648179 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.648586 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.648621 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.648633 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.648816 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.648940 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.648983 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.650093 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.650128 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.650141 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.650217 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.650230 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.650240 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.650280 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.650489 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.650564 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.650941 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.650961 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.650972 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.651165 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.651190 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.651217 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.651392 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.651428 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.652406 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.652428 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.652437 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.652451 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.652504 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.652515 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:02 crc kubenswrapper[4732]: E0131 09:01:02.684354 4732 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.231:6443: connect: connection refused" interval="400ms" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.716639 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.716711 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.716740 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.716785 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.716806 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 
09:01:02.716847 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.716926 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.716969 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.716993 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.717025 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.717053 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.717083 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.717105 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.717124 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.717141 4732 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.734505 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.735834 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.735883 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.735900 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.735931 4732 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 31 09:01:02 crc kubenswrapper[4732]: E0131 09:01:02.736461 4732 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.231:6443: connect: connection refused" node="crc" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.818027 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.818091 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.818108 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.818123 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.818155 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.818171 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " 
pod="openshift-etcd/etcd-crc" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.818187 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.818202 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.818218 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.818231 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.818224 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.818270 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.818294 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.818299 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.818323 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.818246 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.818281 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.818245 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.818337 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.818366 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.818331 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.818347 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.818507 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.818557 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.818582 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.818614 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.818700 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.818709 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.818713 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.818743 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.937597 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.939824 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.939870 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.939882 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.939909 4732 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 31 09:01:02 crc kubenswrapper[4732]: E0131 09:01:02.940389 4732 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.231:6443: connect: connection refused" node="crc" Jan 31 09:01:02 crc kubenswrapper[4732]: I0131 09:01:02.995554 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 31 09:01:03 crc kubenswrapper[4732]: I0131 09:01:03.016635 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 31 09:01:03 crc kubenswrapper[4732]: I0131 09:01:03.037148 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 31 09:01:03 crc kubenswrapper[4732]: I0131 09:01:03.044349 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Jan 31 09:01:03 crc kubenswrapper[4732]: I0131 09:01:03.048033 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:01:03 crc kubenswrapper[4732]: E0131 09:01:03.085747 4732 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.231:6443: connect: connection refused" interval="800ms" Jan 31 09:01:03 crc kubenswrapper[4732]: W0131 09:01:03.135444 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-69f12eef6fb6e235cac8ad04bca66ee300952ad6934c65f7ca40d82cd0de8532 WatchSource:0}: Error finding container 69f12eef6fb6e235cac8ad04bca66ee300952ad6934c65f7ca40d82cd0de8532: Status 404 returned error can't find the container with id 69f12eef6fb6e235cac8ad04bca66ee300952ad6934c65f7ca40d82cd0de8532 Jan 31 09:01:03 crc kubenswrapper[4732]: W0131 09:01:03.137350 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-a5f6ba4146ca4fd45a86b175655c87df100727d86a0bc7fd3f99a3c92a90f415 WatchSource:0}: Error finding container a5f6ba4146ca4fd45a86b175655c87df100727d86a0bc7fd3f99a3c92a90f415: Status 404 returned error can't find the container with id a5f6ba4146ca4fd45a86b175655c87df100727d86a0bc7fd3f99a3c92a90f415 Jan 31 09:01:03 crc kubenswrapper[4732]: W0131 09:01:03.138007 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-fd98298222cb06cca26e7e01f6e6a136f1f474083d126c37e5a1766668a8b049 WatchSource:0}: Error finding container fd98298222cb06cca26e7e01f6e6a136f1f474083d126c37e5a1766668a8b049: Status 404 returned error can't find the container with id fd98298222cb06cca26e7e01f6e6a136f1f474083d126c37e5a1766668a8b049 Jan 31 09:01:03 crc kubenswrapper[4732]: W0131 09:01:03.138701 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-0df22fa47819f792794c8f59b16fedf553f1b0f61ca7d9b7a7e7183322bc1df8 WatchSource:0}: Error finding container 0df22fa47819f792794c8f59b16fedf553f1b0f61ca7d9b7a7e7183322bc1df8: Status 404 returned error can't find the container with id 0df22fa47819f792794c8f59b16fedf553f1b0f61ca7d9b7a7e7183322bc1df8 Jan 31 09:01:03 crc kubenswrapper[4732]: W0131 09:01:03.140193 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-a679f267b1a91b098a7388467a2bed47ca2a5417419f81d68cb7d18ed0f7312a WatchSource:0}: Error finding container a679f267b1a91b098a7388467a2bed47ca2a5417419f81d68cb7d18ed0f7312a: Status 404 returned error can't find the container with id a679f267b1a91b098a7388467a2bed47ca2a5417419f81d68cb7d18ed0f7312a Jan 31 09:01:03 crc kubenswrapper[4732]: I0131 09:01:03.340536 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:01:03 crc kubenswrapper[4732]: I0131 09:01:03.342031 4732 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:03 crc kubenswrapper[4732]: I0131 09:01:03.342073 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:03 crc kubenswrapper[4732]: I0131 09:01:03.342085 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:03 crc kubenswrapper[4732]: I0131 09:01:03.342111 4732 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 31 09:01:03 crc kubenswrapper[4732]: E0131 09:01:03.342543 4732 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.231:6443: connect: connection refused" node="crc" Jan 31 09:01:03 crc kubenswrapper[4732]: W0131 09:01:03.380348 4732 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.129.56.231:6443: connect: connection refused Jan 31 09:01:03 crc kubenswrapper[4732]: E0131 09:01:03.380439 4732 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.129.56.231:6443: connect: connection refused" logger="UnhandledError" Jan 31 09:01:03 crc kubenswrapper[4732]: W0131 09:01:03.430713 4732 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.129.56.231:6443: connect: connection refused Jan 31 09:01:03 crc kubenswrapper[4732]: E0131 09:01:03.430800 4732 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.129.56.231:6443: connect: connection refused" logger="UnhandledError" Jan 31 09:01:03 crc kubenswrapper[4732]: I0131 09:01:03.477977 4732 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.231:6443: connect: connection refused Jan 31 09:01:03 crc kubenswrapper[4732]: I0131 09:01:03.480058 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 15:02:44.562395335 +0000 UTC Jan 31 09:01:03 crc kubenswrapper[4732]: I0131 09:01:03.547273 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"0df22fa47819f792794c8f59b16fedf553f1b0f61ca7d9b7a7e7183322bc1df8"} Jan 31 09:01:03 crc kubenswrapper[4732]: I0131 09:01:03.548328 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a5f6ba4146ca4fd45a86b175655c87df100727d86a0bc7fd3f99a3c92a90f415"} Jan 31 09:01:03 crc kubenswrapper[4732]: I0131 09:01:03.549327 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"a679f267b1a91b098a7388467a2bed47ca2a5417419f81d68cb7d18ed0f7312a"} Jan 31 09:01:03 crc kubenswrapper[4732]: I0131 09:01:03.550326 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"69f12eef6fb6e235cac8ad04bca66ee300952ad6934c65f7ca40d82cd0de8532"} Jan 31 09:01:03 crc kubenswrapper[4732]: I0131 09:01:03.551944 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"fd98298222cb06cca26e7e01f6e6a136f1f474083d126c37e5a1766668a8b049"} Jan 31 09:01:03 crc kubenswrapper[4732]: W0131 09:01:03.694442 4732 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.231:6443: connect: connection refused Jan 31 09:01:03 crc kubenswrapper[4732]: E0131 09:01:03.694536 4732 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.231:6443: connect: connection refused" logger="UnhandledError" Jan 31 09:01:03 crc kubenswrapper[4732]: W0131 09:01:03.791554 4732 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.129.56.231:6443: connect: connection refused Jan 31 09:01:03 crc kubenswrapper[4732]: E0131 09:01:03.791658 4732 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.129.56.231:6443: connect: connection refused" logger="UnhandledError" Jan 31 09:01:03 crc kubenswrapper[4732]: E0131 09:01:03.887099 4732 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.231:6443: connect: connection refused" interval="1.6s" Jan 31 09:01:04 crc kubenswrapper[4732]: I0131 09:01:04.143309 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:01:04 crc kubenswrapper[4732]: I0131 09:01:04.144567 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:04 crc kubenswrapper[4732]: I0131 09:01:04.144607 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:04 crc kubenswrapper[4732]: I0131 09:01:04.144620 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:04 crc kubenswrapper[4732]: I0131 09:01:04.144653 4732 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 31 09:01:04 crc kubenswrapper[4732]: E0131 
09:01:04.145218 4732 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.231:6443: connect: connection refused" node="crc" Jan 31 09:01:04 crc kubenswrapper[4732]: I0131 09:01:04.356723 4732 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Jan 31 09:01:04 crc kubenswrapper[4732]: E0131 09:01:04.357869 4732 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.129.56.231:6443: connect: connection refused" logger="UnhandledError" Jan 31 09:01:04 crc kubenswrapper[4732]: I0131 09:01:04.478264 4732 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.231:6443: connect: connection refused Jan 31 09:01:04 crc kubenswrapper[4732]: I0131 09:01:04.481227 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 17:47:17.581287173 +0000 UTC Jan 31 09:01:04 crc kubenswrapper[4732]: I0131 09:01:04.556632 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"407516efcd2436b964fddea3bdc778826cde289422139bc4577c9ba8c0c43675"} Jan 31 09:01:04 crc kubenswrapper[4732]: I0131 09:01:04.556696 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:01:04 crc kubenswrapper[4732]: I0131 09:01:04.556950 4732 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="407516efcd2436b964fddea3bdc778826cde289422139bc4577c9ba8c0c43675" exitCode=0 Jan 31 09:01:04 crc kubenswrapper[4732]: I0131 09:01:04.557905 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:04 crc kubenswrapper[4732]: I0131 09:01:04.557963 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:04 crc kubenswrapper[4732]: I0131 09:01:04.557984 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:04 crc kubenswrapper[4732]: I0131 09:01:04.558780 4732 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="328c8ed55e178646a8bd3d914985f8171b6413b7b007f3bca609cfb432a227f3" exitCode=0 Jan 31 09:01:04 crc kubenswrapper[4732]: I0131 09:01:04.558885 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"328c8ed55e178646a8bd3d914985f8171b6413b7b007f3bca609cfb432a227f3"} Jan 31 09:01:04 crc kubenswrapper[4732]: I0131 09:01:04.558974 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:01:04 crc kubenswrapper[4732]: I0131 09:01:04.560316 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Jan 31 09:01:04 crc kubenswrapper[4732]: I0131 09:01:04.560340 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:04 crc kubenswrapper[4732]: I0131 09:01:04.560350 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:04 crc kubenswrapper[4732]: I0131 09:01:04.562870 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"bd0e75b86e540472dde92c0b2c29d3af05e052a4e6a01fd03cd16cce663bd5da"} Jan 31 09:01:04 crc kubenswrapper[4732]: I0131 09:01:04.564783 4732 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="382f1fa043705a0098e27ea34beeee0b88276aa355a7de07bc29b070949b81e2" exitCode=0 Jan 31 09:01:04 crc kubenswrapper[4732]: I0131 09:01:04.564832 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:01:04 crc kubenswrapper[4732]: I0131 09:01:04.564840 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"382f1fa043705a0098e27ea34beeee0b88276aa355a7de07bc29b070949b81e2"} Jan 31 09:01:04 crc kubenswrapper[4732]: I0131 09:01:04.565544 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:04 crc kubenswrapper[4732]: I0131 09:01:04.565576 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:04 crc kubenswrapper[4732]: I0131 09:01:04.565588 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:04 crc kubenswrapper[4732]: I0131 09:01:04.567562 4732 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f" exitCode=0 Jan 31 09:01:04 crc kubenswrapper[4732]: I0131 09:01:04.567615 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f"} Jan 31 09:01:04 crc kubenswrapper[4732]: I0131 09:01:04.568401 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:01:04 crc kubenswrapper[4732]: I0131 09:01:04.570791 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:04 crc kubenswrapper[4732]: I0131 09:01:04.570830 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:04 crc kubenswrapper[4732]: I0131 09:01:04.570840 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:04 crc kubenswrapper[4732]: I0131 09:01:04.574062 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:01:04 crc kubenswrapper[4732]: I0131 09:01:04.575078 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:04 crc 
kubenswrapper[4732]: I0131 09:01:04.575129 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:04 crc kubenswrapper[4732]: I0131 09:01:04.575142 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:05 crc kubenswrapper[4732]: W0131 09:01:05.076342 4732 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.129.56.231:6443: connect: connection refused Jan 31 09:01:05 crc kubenswrapper[4732]: E0131 09:01:05.076473 4732 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.129.56.231:6443: connect: connection refused" logger="UnhandledError" Jan 31 09:01:05 crc kubenswrapper[4732]: I0131 09:01:05.478046 4732 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.231:6443: connect: connection refused Jan 31 09:01:05 crc kubenswrapper[4732]: I0131 09:01:05.481512 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 13:05:17.307762429 +0000 UTC Jan 31 09:01:05 crc kubenswrapper[4732]: E0131 09:01:05.488417 4732 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.231:6443: connect: connection refused" interval="3.2s" Jan 31 09:01:05 crc kubenswrapper[4732]: I0131 09:01:05.572599 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"89ce42a35726dd1fc6aea0eb2e5baa490de04566cd6d6298185260e896a8def7"} Jan 31 09:01:05 crc kubenswrapper[4732]: I0131 09:01:05.572686 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"2b5cbd45e357205f8bfc648396c0f684d6788343f687504feb45618d9082e5cd"} Jan 31 09:01:05 crc kubenswrapper[4732]: I0131 09:01:05.573978 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"e897d26ac7b103c21a9cb176f1a90a11098a9a2f6a4cd28f697a90ee7c9f6f9a"} Jan 31 09:01:05 crc kubenswrapper[4732]: I0131 09:01:05.574001 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"7dadc6987ef6ef2593fa8aa0ec3903dc3cfea907f73fb68dca2c141362577bb1"} Jan 31 09:01:05 crc kubenswrapper[4732]: I0131 09:01:05.575127 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"87eb9563b2ca7420cbb70ccd32aa77f2a82c769d57eceacd99a4d94cb1c3a0d7"} Jan 31 
09:01:05 crc kubenswrapper[4732]: I0131 09:01:05.575241 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:01:05 crc kubenswrapper[4732]: I0131 09:01:05.575887 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:05 crc kubenswrapper[4732]: I0131 09:01:05.575910 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:05 crc kubenswrapper[4732]: I0131 09:01:05.575918 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:05 crc kubenswrapper[4732]: I0131 09:01:05.577806 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"e6678cf179a4e13991aa6a082456b3497a4665e0999912b8002fb84a0b6ca2ca"} Jan 31 09:01:05 crc kubenswrapper[4732]: I0131 09:01:05.577858 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"e7bd726ccae341dfff351d4170276b74d4b6b4f34806b7e1fb23ab517201928d"} Jan 31 09:01:05 crc kubenswrapper[4732]: I0131 09:01:05.579718 4732 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="b16d2732d6d6381045b88446a322622a7b1271f08438cb60e57c05c52ef7faf8" exitCode=0 Jan 31 09:01:05 crc kubenswrapper[4732]: I0131 09:01:05.579778 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"b16d2732d6d6381045b88446a322622a7b1271f08438cb60e57c05c52ef7faf8"} Jan 31 09:01:05 crc kubenswrapper[4732]: I0131 09:01:05.579863 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:01:05 crc kubenswrapper[4732]: I0131 09:01:05.580744 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:05 crc kubenswrapper[4732]: I0131 09:01:05.580772 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:05 crc kubenswrapper[4732]: I0131 09:01:05.580783 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:05 crc kubenswrapper[4732]: I0131 09:01:05.745531 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:01:05 crc kubenswrapper[4732]: I0131 09:01:05.746863 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:05 crc kubenswrapper[4732]: I0131 09:01:05.746894 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:05 crc kubenswrapper[4732]: I0131 09:01:05.746904 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:05 crc kubenswrapper[4732]: I0131 09:01:05.746930 4732 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 31 09:01:05 crc kubenswrapper[4732]: E0131 09:01:05.747375 4732 kubelet_node_status.go:99] "Unable to register node with API server" err="Post 
\"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.231:6443: connect: connection refused" node="crc" Jan 31 09:01:05 crc kubenswrapper[4732]: W0131 09:01:05.881701 4732 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.129.56.231:6443: connect: connection refused Jan 31 09:01:05 crc kubenswrapper[4732]: E0131 09:01:05.881798 4732 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.129.56.231:6443: connect: connection refused" logger="UnhandledError" Jan 31 09:01:06 crc kubenswrapper[4732]: W0131 09:01:06.024372 4732 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.129.56.231:6443: connect: connection refused Jan 31 09:01:06 crc kubenswrapper[4732]: E0131 09:01:06.024451 4732 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.129.56.231:6443: connect: connection refused" logger="UnhandledError" Jan 31 09:01:06 crc kubenswrapper[4732]: W0131 09:01:06.470745 4732 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.231:6443: connect: connection refused Jan 31 09:01:06 crc kubenswrapper[4732]: E0131 09:01:06.470852 4732 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.231:6443: connect: connection refused" logger="UnhandledError" Jan 31 09:01:06 crc kubenswrapper[4732]: I0131 09:01:06.481129 4732 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.231:6443: connect: connection refused Jan 31 09:01:06 crc kubenswrapper[4732]: I0131 09:01:06.482033 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 06:52:00.740642609 +0000 UTC Jan 31 09:01:06 crc kubenswrapper[4732]: I0131 09:01:06.588732 4732 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="273f5652d37e2be568bc388a7f9e96c66d0ce9201678b46736986815bd25d6bf" exitCode=0 Jan 31 09:01:06 crc kubenswrapper[4732]: I0131 09:01:06.588874 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:01:06 crc kubenswrapper[4732]: I0131 09:01:06.588873 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"273f5652d37e2be568bc388a7f9e96c66d0ce9201678b46736986815bd25d6bf"} Jan 31 09:01:06 crc kubenswrapper[4732]: I0131 09:01:06.589949 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:06 crc kubenswrapper[4732]: I0131 09:01:06.589992 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:06 crc kubenswrapper[4732]: I0131 09:01:06.590009 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:06 crc kubenswrapper[4732]: I0131 09:01:06.593346 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"849206839f7a8bd7f8a6a6fa78eda460866d8039008158f82dcdc4743bd4eed2"} Jan 31 09:01:06 crc kubenswrapper[4732]: I0131 09:01:06.593378 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"83c2048db4772c1a5f60b21340f773ab5c651045d4dac9edc15e91a19d15dce9"} Jan 31 09:01:06 crc kubenswrapper[4732]: I0131 09:01:06.593396 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"52b8ba07937855cb1fd0e1db567f90e08848fca583b7174153918e2d6356d722"} Jan 31 09:01:06 crc kubenswrapper[4732]: I0131 09:01:06.593457 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:01:06 crc kubenswrapper[4732]: I0131 09:01:06.594444 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:06 crc kubenswrapper[4732]: I0131 09:01:06.594483 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:06 crc kubenswrapper[4732]: I0131 09:01:06.594493 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:06 crc kubenswrapper[4732]: I0131 09:01:06.596380 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:01:06 crc kubenswrapper[4732]: I0131 09:01:06.596371 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"a31769db7f40442ff410b053055e413b11d7dba7d48dfce853ca38e7b9f7595e"} Jan 31 09:01:06 crc kubenswrapper[4732]: I0131 09:01:06.597027 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:06 crc kubenswrapper[4732]: I0131 09:01:06.597062 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:06 crc kubenswrapper[4732]: I0131 09:01:06.597073 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:06 crc kubenswrapper[4732]: I0131 09:01:06.599067 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:01:06 crc kubenswrapper[4732]: I0131 09:01:06.599162 4732 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:01:06 crc kubenswrapper[4732]: I0131 09:01:06.599064 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"55d09d44253b8c77894abc55d80843b79509bf7213c0bf8b4d69059d53e76e5a"} Jan 31 09:01:06 crc kubenswrapper[4732]: I0131 09:01:06.600163 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:06 crc kubenswrapper[4732]: I0131 09:01:06.600212 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:06 crc kubenswrapper[4732]: I0131 09:01:06.600226 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:06 crc kubenswrapper[4732]: I0131 09:01:06.600176 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:06 crc kubenswrapper[4732]: I0131 09:01:06.600274 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:06 crc kubenswrapper[4732]: I0131 09:01:06.600305 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:07 crc kubenswrapper[4732]: I0131 09:01:07.242032 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 31 09:01:07 crc kubenswrapper[4732]: I0131 09:01:07.478327 4732 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.231:6443: connect: connection refused Jan 31 09:01:07 crc kubenswrapper[4732]: I0131 09:01:07.483118 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 05:07:59.577375861 +0000 UTC Jan 31 09:01:07 crc kubenswrapper[4732]: I0131 09:01:07.607064 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"771af18934041722ccdbde47137c83e9d93c5e7a07b1e9b26e18354bcd2fda4c"} Jan 31 09:01:07 crc kubenswrapper[4732]: I0131 09:01:07.607166 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"e2f14a98662959fe8132486d2f648fb443a407e90160ee8c6eac7e2a3d6835b3"} Jan 31 09:01:07 crc kubenswrapper[4732]: I0131 09:01:07.607182 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"194a43d5e1409abb031f1b950dfc753454b45008067d784642945ea0dfb087e2"} Jan 31 09:01:07 crc kubenswrapper[4732]: I0131 09:01:07.609095 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 31 09:01:07 crc kubenswrapper[4732]: I0131 09:01:07.610915 4732 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" 
containerID="849206839f7a8bd7f8a6a6fa78eda460866d8039008158f82dcdc4743bd4eed2" exitCode=255 Jan 31 09:01:07 crc kubenswrapper[4732]: I0131 09:01:07.611052 4732 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 31 09:01:07 crc kubenswrapper[4732]: I0131 09:01:07.611052 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"849206839f7a8bd7f8a6a6fa78eda460866d8039008158f82dcdc4743bd4eed2"} Jan 31 09:01:07 crc kubenswrapper[4732]: I0131 09:01:07.611088 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:01:07 crc kubenswrapper[4732]: I0131 09:01:07.611135 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:01:07 crc kubenswrapper[4732]: I0131 09:01:07.611202 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:01:07 crc kubenswrapper[4732]: I0131 09:01:07.612342 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:07 crc kubenswrapper[4732]: I0131 09:01:07.612370 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:07 crc kubenswrapper[4732]: I0131 09:01:07.612381 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:07 crc kubenswrapper[4732]: I0131 09:01:07.612750 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:07 crc kubenswrapper[4732]: I0131 09:01:07.612781 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:07 crc kubenswrapper[4732]: I0131 09:01:07.612793 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:07 crc kubenswrapper[4732]: I0131 09:01:07.612836 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:07 crc kubenswrapper[4732]: I0131 09:01:07.612866 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:07 crc kubenswrapper[4732]: I0131 09:01:07.612881 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:07 crc kubenswrapper[4732]: I0131 09:01:07.613444 4732 scope.go:117] "RemoveContainer" containerID="849206839f7a8bd7f8a6a6fa78eda460866d8039008158f82dcdc4743bd4eed2" Jan 31 09:01:08 crc kubenswrapper[4732]: I0131 09:01:08.472754 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:01:08 crc kubenswrapper[4732]: I0131 09:01:08.483749 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 13:17:32.324285501 +0000 UTC Jan 31 09:01:08 crc kubenswrapper[4732]: I0131 09:01:08.525917 4732 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Jan 31 09:01:08 crc kubenswrapper[4732]: I0131 09:01:08.616840 4732 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 31 09:01:08 crc kubenswrapper[4732]: I0131 09:01:08.620272 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c05af9bd69451273c25ddddf6b11cdd7b8f9b6e3e16d18f1c4a962321905696e"} Jan 31 09:01:08 crc kubenswrapper[4732]: I0131 09:01:08.620380 4732 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 31 09:01:08 crc kubenswrapper[4732]: I0131 09:01:08.620453 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:01:08 crc kubenswrapper[4732]: I0131 09:01:08.621723 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:08 crc kubenswrapper[4732]: I0131 09:01:08.621770 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:08 crc kubenswrapper[4732]: I0131 09:01:08.621787 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:08 crc kubenswrapper[4732]: I0131 09:01:08.625940 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d9fd5264b5a2dcec6f8b363208d5654f2d7199660f04be408ee7ab3913d542e1"} Jan 31 09:01:08 crc kubenswrapper[4732]: I0131 09:01:08.625994 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d2bea5befd94ee4d3b6501273e8b8365102a35cc07b97740fc893f8d70a9d134"} Jan 31 09:01:08 crc kubenswrapper[4732]: I0131 09:01:08.626041 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:01:08 crc kubenswrapper[4732]: I0131 09:01:08.626061 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:01:08 crc kubenswrapper[4732]: I0131 09:01:08.627217 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:08 crc kubenswrapper[4732]: I0131 09:01:08.627273 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:08 crc kubenswrapper[4732]: I0131 09:01:08.627233 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:08 crc kubenswrapper[4732]: I0131 09:01:08.627313 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:08 crc kubenswrapper[4732]: I0131 09:01:08.627328 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:08 crc kubenswrapper[4732]: I0131 09:01:08.627290 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:08 crc kubenswrapper[4732]: I0131 09:01:08.947849 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:01:08 crc kubenswrapper[4732]: I0131 09:01:08.950059 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Jan 31 09:01:08 crc kubenswrapper[4732]: I0131 09:01:08.950119 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:08 crc kubenswrapper[4732]: I0131 09:01:08.950134 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:08 crc kubenswrapper[4732]: I0131 09:01:08.950166 4732 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 31 09:01:09 crc kubenswrapper[4732]: I0131 09:01:09.015612 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:01:09 crc kubenswrapper[4732]: I0131 09:01:09.386571 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 31 09:01:09 crc kubenswrapper[4732]: I0131 09:01:09.484203 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-16 11:12:59.48013823 +0000 UTC Jan 31 09:01:09 crc kubenswrapper[4732]: I0131 09:01:09.628274 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:01:09 crc kubenswrapper[4732]: I0131 09:01:09.628384 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:01:09 crc kubenswrapper[4732]: I0131 09:01:09.628411 4732 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 31 09:01:09 crc kubenswrapper[4732]: I0131 09:01:09.628493 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:01:09 crc kubenswrapper[4732]: I0131 09:01:09.629783 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:09 crc kubenswrapper[4732]: I0131 09:01:09.629820 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:09 crc kubenswrapper[4732]: I0131 09:01:09.629832 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:09 crc kubenswrapper[4732]: I0131 09:01:09.629839 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:09 crc kubenswrapper[4732]: I0131 09:01:09.629891 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:09 crc kubenswrapper[4732]: I0131 09:01:09.629914 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:09 crc kubenswrapper[4732]: I0131 09:01:09.629982 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:09 crc kubenswrapper[4732]: I0131 09:01:09.630041 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:09 crc kubenswrapper[4732]: I0131 09:01:09.630061 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:10 crc kubenswrapper[4732]: I0131 09:01:10.242493 4732 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe 
status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 31 09:01:10 crc kubenswrapper[4732]: I0131 09:01:10.242583 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 31 09:01:10 crc kubenswrapper[4732]: I0131 09:01:10.484830 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 12:13:20.56940044 +0000 UTC Jan 31 09:01:10 crc kubenswrapper[4732]: I0131 09:01:10.630973 4732 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 31 09:01:10 crc kubenswrapper[4732]: I0131 09:01:10.631053 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:01:10 crc kubenswrapper[4732]: I0131 09:01:10.632090 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:10 crc kubenswrapper[4732]: I0131 09:01:10.632129 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:10 crc kubenswrapper[4732]: I0131 09:01:10.632141 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:11 crc kubenswrapper[4732]: I0131 09:01:11.485599 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 20:30:21.108816923 +0000 UTC Jan 31 09:01:11 crc kubenswrapper[4732]: I0131 09:01:11.713881 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Jan 31 09:01:11 crc kubenswrapper[4732]: I0131 09:01:11.714125 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:01:11 crc kubenswrapper[4732]: I0131 09:01:11.715635 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:11 crc kubenswrapper[4732]: I0131 09:01:11.715697 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:11 crc kubenswrapper[4732]: I0131 09:01:11.715709 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:11 crc kubenswrapper[4732]: I0131 09:01:11.916629 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:01:11 crc kubenswrapper[4732]: I0131 09:01:11.916891 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:01:11 crc kubenswrapper[4732]: I0131 09:01:11.918264 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:11 crc kubenswrapper[4732]: I0131 09:01:11.918354 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:11 crc kubenswrapper[4732]: I0131 09:01:11.918381 4732 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:12 crc kubenswrapper[4732]: I0131 09:01:12.151427 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 31 09:01:12 crc kubenswrapper[4732]: I0131 09:01:12.151703 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:01:12 crc kubenswrapper[4732]: I0131 09:01:12.153206 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:12 crc kubenswrapper[4732]: I0131 09:01:12.153248 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:12 crc kubenswrapper[4732]: I0131 09:01:12.153264 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:12 crc kubenswrapper[4732]: I0131 09:01:12.486521 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 19:18:37.037660578 +0000 UTC Jan 31 09:01:12 crc kubenswrapper[4732]: I0131 09:01:12.631767 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 31 09:01:12 crc kubenswrapper[4732]: I0131 09:01:12.632027 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:01:12 crc kubenswrapper[4732]: I0131 09:01:12.633449 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:12 crc kubenswrapper[4732]: I0131 09:01:12.633525 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:12 crc kubenswrapper[4732]: I0131 09:01:12.633543 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:12 crc kubenswrapper[4732]: E0131 09:01:12.642438 4732 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Jan 31 09:01:13 crc kubenswrapper[4732]: I0131 09:01:13.381195 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 31 09:01:13 crc kubenswrapper[4732]: I0131 09:01:13.381449 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:01:13 crc kubenswrapper[4732]: I0131 09:01:13.383068 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:13 crc kubenswrapper[4732]: I0131 09:01:13.383138 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:13 crc kubenswrapper[4732]: I0131 09:01:13.383152 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:13 crc kubenswrapper[4732]: I0131 09:01:13.386917 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 31 09:01:13 crc kubenswrapper[4732]: I0131 09:01:13.487245 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-18 
05:52:15.00677578 +0000 UTC Jan 31 09:01:13 crc kubenswrapper[4732]: I0131 09:01:13.638980 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:01:13 crc kubenswrapper[4732]: I0131 09:01:13.640214 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:13 crc kubenswrapper[4732]: I0131 09:01:13.640276 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:13 crc kubenswrapper[4732]: I0131 09:01:13.640295 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:13 crc kubenswrapper[4732]: I0131 09:01:13.780032 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Jan 31 09:01:13 crc kubenswrapper[4732]: I0131 09:01:13.780424 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:01:13 crc kubenswrapper[4732]: I0131 09:01:13.781934 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:13 crc kubenswrapper[4732]: I0131 09:01:13.782004 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:13 crc kubenswrapper[4732]: I0131 09:01:13.782027 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:14 crc kubenswrapper[4732]: I0131 09:01:14.488248 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-14 19:30:41.741783079 +0000 UTC Jan 31 09:01:15 crc kubenswrapper[4732]: I0131 09:01:15.489149 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 18:50:43.566853662 +0000 UTC Jan 31 09:01:16 crc kubenswrapper[4732]: I0131 09:01:16.489736 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 18:50:17.969224935 +0000 UTC Jan 31 09:01:17 crc kubenswrapper[4732]: I0131 09:01:17.361560 4732 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Jan 31 09:01:17 crc kubenswrapper[4732]: I0131 09:01:17.361653 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Jan 31 09:01:17 crc kubenswrapper[4732]: I0131 09:01:17.490407 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-08 12:47:42.61310711 +0000 UTC Jan 31 09:01:18 crc kubenswrapper[4732]: I0131 09:01:18.365402 4732 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" 
start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Jan 31 09:01:18 crc kubenswrapper[4732]: I0131 09:01:18.365477 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Jan 31 09:01:18 crc kubenswrapper[4732]: I0131 09:01:18.369186 4732 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\": RBAC: [clusterrole.rbac.authorization.k8s.io \"system:openshift:public-info-viewer\" not found, clusterrole.rbac.authorization.k8s.io \"system:public-info-viewer\" not found]","reason":"Forbidden","details":{},"code":403} Jan 31 09:01:18 crc kubenswrapper[4732]: I0131 09:01:18.369251 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Jan 31 09:01:18 crc kubenswrapper[4732]: I0131 09:01:18.476824 4732 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Jan 31 09:01:18 crc kubenswrapper[4732]: [+]log ok Jan 31 09:01:18 crc kubenswrapper[4732]: [+]etcd ok Jan 31 09:01:18 crc kubenswrapper[4732]: [+]poststarthook/openshift.io-startkubeinformers ok Jan 31 09:01:18 crc kubenswrapper[4732]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Jan 31 09:01:18 crc kubenswrapper[4732]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Jan 31 09:01:18 crc kubenswrapper[4732]: [+]poststarthook/start-apiserver-admission-initializer ok Jan 31 09:01:18 crc kubenswrapper[4732]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Jan 31 09:01:18 crc kubenswrapper[4732]: [+]poststarthook/openshift.io-api-request-count-filter ok Jan 31 09:01:18 crc kubenswrapper[4732]: [+]poststarthook/generic-apiserver-start-informers ok Jan 31 09:01:18 crc kubenswrapper[4732]: [+]poststarthook/priority-and-fairness-config-consumer ok Jan 31 09:01:18 crc kubenswrapper[4732]: [+]poststarthook/priority-and-fairness-filter ok Jan 31 09:01:18 crc kubenswrapper[4732]: [+]poststarthook/storage-object-count-tracker-hook ok Jan 31 09:01:18 crc kubenswrapper[4732]: [+]poststarthook/start-apiextensions-informers ok Jan 31 09:01:18 crc kubenswrapper[4732]: [-]poststarthook/start-apiextensions-controllers failed: reason withheld Jan 31 09:01:18 crc kubenswrapper[4732]: [-]poststarthook/crd-informer-synced failed: reason withheld Jan 31 09:01:18 crc kubenswrapper[4732]: [+]poststarthook/start-system-namespaces-controller ok Jan 31 09:01:18 crc kubenswrapper[4732]: [+]poststarthook/start-cluster-authentication-info-controller ok Jan 31 09:01:18 crc kubenswrapper[4732]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Jan 31 09:01:18 crc kubenswrapper[4732]: 
[+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Jan 31 09:01:18 crc kubenswrapper[4732]: [+]poststarthook/start-legacy-token-tracking-controller ok Jan 31 09:01:18 crc kubenswrapper[4732]: [+]poststarthook/start-service-ip-repair-controllers ok Jan 31 09:01:18 crc kubenswrapper[4732]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Jan 31 09:01:18 crc kubenswrapper[4732]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld Jan 31 09:01:18 crc kubenswrapper[4732]: [+]poststarthook/priority-and-fairness-config-producer ok Jan 31 09:01:18 crc kubenswrapper[4732]: [+]poststarthook/bootstrap-controller ok Jan 31 09:01:18 crc kubenswrapper[4732]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Jan 31 09:01:18 crc kubenswrapper[4732]: [+]poststarthook/start-kube-aggregator-informers ok Jan 31 09:01:18 crc kubenswrapper[4732]: [+]poststarthook/apiservice-status-local-available-controller ok Jan 31 09:01:18 crc kubenswrapper[4732]: [+]poststarthook/apiservice-status-remote-available-controller ok Jan 31 09:01:18 crc kubenswrapper[4732]: [-]poststarthook/apiservice-registration-controller failed: reason withheld Jan 31 09:01:18 crc kubenswrapper[4732]: [+]poststarthook/apiservice-wait-for-first-sync ok Jan 31 09:01:18 crc kubenswrapper[4732]: [-]poststarthook/apiservice-discovery-controller failed: reason withheld Jan 31 09:01:18 crc kubenswrapper[4732]: [+]poststarthook/kube-apiserver-autoregistration ok Jan 31 09:01:18 crc kubenswrapper[4732]: [+]autoregister-completion ok Jan 31 09:01:18 crc kubenswrapper[4732]: [+]poststarthook/apiservice-openapi-controller ok Jan 31 09:01:18 crc kubenswrapper[4732]: [+]poststarthook/apiservice-openapiv3-controller ok Jan 31 09:01:18 crc kubenswrapper[4732]: livez check failed Jan 31 09:01:18 crc kubenswrapper[4732]: I0131 09:01:18.476883 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 31 09:01:18 crc kubenswrapper[4732]: I0131 09:01:18.491174 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 15:39:50.075828495 +0000 UTC Jan 31 09:01:19 crc kubenswrapper[4732]: I0131 09:01:19.491275 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 10:46:27.266453574 +0000 UTC Jan 31 09:01:20 crc kubenswrapper[4732]: I0131 09:01:20.243030 4732 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 31 09:01:20 crc kubenswrapper[4732]: I0131 09:01:20.243139 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 31 09:01:20 crc kubenswrapper[4732]: I0131 09:01:20.492020 4732 
certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 09:24:08.715428717 +0000 UTC Jan 31 09:01:21 crc kubenswrapper[4732]: I0131 09:01:21.492791 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 20:08:04.135043559 +0000 UTC Jan 31 09:01:22 crc kubenswrapper[4732]: I0131 09:01:22.158021 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 31 09:01:22 crc kubenswrapper[4732]: I0131 09:01:22.158174 4732 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 09:01:22 crc kubenswrapper[4732]: I0131 09:01:22.159469 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:22 crc kubenswrapper[4732]: I0131 09:01:22.159524 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:22 crc kubenswrapper[4732]: I0131 09:01:22.159534 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:22 crc kubenswrapper[4732]: I0131 09:01:22.493877 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 01:40:40.655878001 +0000 UTC Jan 31 09:01:22 crc kubenswrapper[4732]: E0131 09:01:22.642702 4732 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Jan 31 09:01:23 crc kubenswrapper[4732]: E0131 09:01:23.362968 4732 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.364864 4732 trace.go:236] Trace[766566254]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (31-Jan-2026 09:01:11.731) (total time: 11632ms): Jan 31 09:01:23 crc kubenswrapper[4732]: Trace[766566254]: ---"Objects listed" error: 11632ms (09:01:23.364) Jan 31 09:01:23 crc kubenswrapper[4732]: Trace[766566254]: [11.632994324s] [11.632994324s] END Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.364892 4732 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.366219 4732 trace.go:236] Trace[445597049]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (31-Jan-2026 09:01:10.233) (total time: 13132ms): Jan 31 09:01:23 crc kubenswrapper[4732]: Trace[445597049]: ---"Objects listed" error: 13132ms (09:01:23.366) Jan 31 09:01:23 crc kubenswrapper[4732]: Trace[445597049]: [13.132602516s] [13.132602516s] END Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.366246 4732 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.367309 4732 trace.go:236] Trace[165649746]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (31-Jan-2026 09:01:08.647) (total time: 14720ms): Jan 31 09:01:23 crc kubenswrapper[4732]: Trace[165649746]: ---"Objects listed" error: 14720ms 
(09:01:23.367) Jan 31 09:01:23 crc kubenswrapper[4732]: Trace[165649746]: [14.720087412s] [14.720087412s] END Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.367374 4732 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Jan 31 09:01:23 crc kubenswrapper[4732]: E0131 09:01:23.368295 4732 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.368477 4732 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.369284 4732 trace.go:236] Trace[551810421]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (31-Jan-2026 09:01:11.298) (total time: 12070ms): Jan 31 09:01:23 crc kubenswrapper[4732]: Trace[551810421]: ---"Objects listed" error: 12070ms (09:01:23.369) Jan 31 09:01:23 crc kubenswrapper[4732]: Trace[551810421]: [12.070646424s] [12.070646424s] END Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.369311 4732 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.373081 4732 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.436248 4732 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:33680->192.168.126.11:17697: read: connection reset by peer" start-of-body= Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.436341 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:33680->192.168.126.11:17697: read: connection reset by peer" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.471741 4732 apiserver.go:52] "Watching apiserver" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.475844 4732 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.476243 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"] Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.476744 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.476750 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 31 09:01:23 crc kubenswrapper[4732]: E0131 09:01:23.476850 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.477115 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:01:23 crc kubenswrapper[4732]: E0131 09:01:23.477170 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.477221 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.477238 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.477294 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:01:23 crc kubenswrapper[4732]: E0131 09:01:23.477534 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.478676 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.478954 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.479067 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.479127 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.479282 4732 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.479317 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.480925 4732 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.481843 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.482124 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.482474 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.482745 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.482975 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.483146 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.483431 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.494028 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 17:56:25.651591047 +0000 UTC Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.504113 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.514095 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.520912 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.526798 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.536618 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.546808 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.558410 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.568711 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.570225 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.570282 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.570308 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.570332 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.570356 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.570388 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.570418 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.570442 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.570467 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 31 09:01:23 crc kubenswrapper[4732]: E0131 09:01:23.570510 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:01:24.07046852 +0000 UTC m=+22.376344734 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.570575 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.570617 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.570644 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.570693 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.570717 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.570739 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.570766 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.570792 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.570818 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.570841 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.570870 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.570896 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.570891 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.570923 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.571000 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.571030 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.571055 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.571078 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.571107 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.571130 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.571153 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.571175 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.571201 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: 
\"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.571225 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.571248 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.571283 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.571308 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.571335 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.571358 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.571380 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.571405 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.571429 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.571453 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod 
\"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.571477 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.571542 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.571566 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.571591 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.571647 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.571686 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.571707 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.571726 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.571746 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.571802 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") 
pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.571825 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.571847 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.571876 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.571901 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.570900 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.571978 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.572004 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.572023 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.570920 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.571159 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.572074 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.571135 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.571199 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.571223 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.571284 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.571470 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.571485 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.571636 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.571743 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.571808 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.571829 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.571839 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.571905 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.571947 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.572087 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.572257 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.572317 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.572042 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.572409 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.572441 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.572468 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.572493 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.572517 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: 
\"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.572540 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.572561 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.572569 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.572586 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.572609 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.572623 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.572634 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.572683 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.572729 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.572757 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.572757 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.572787 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.572823 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.572847 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.572872 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.572893 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.572917 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.572938 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.572959 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.572958 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.572981 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.573004 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.573028 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.573049 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.573071 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.573092 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.573112 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.573136 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.573157 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.573156 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.573141 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.573158 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.573202 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.573185 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.573307 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.573347 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.573352 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.573368 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.573445 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.573462 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.573484 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.573484 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.573506 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.573527 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.573551 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.573572 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.573578 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.573596 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.573744 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.573749 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.573776 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.573802 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.573826 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.573850 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.573873 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.574998 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.575032 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.575062 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.575120 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.575159 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.575184 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.575214 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.575239 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.575259 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.575283 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.575310 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.575359 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod 
"87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.575524 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.575727 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.575743 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.575857 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.576048 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.576098 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.576457 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.578262 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.578341 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.578341 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.578371 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.578933 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.575335 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.578985 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.579067 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.579133 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.579216 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.579262 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.579295 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.579330 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.579358 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.579395 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.579427 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.579461 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: 
\"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.579491 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.579524 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.579556 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.579584 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.579640 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.579703 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.579927 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.579953 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.579982 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.580011 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod 
\"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.580039 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.580070 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.580098 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.580124 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.580156 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.580185 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.580222 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.580256 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.580288 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.580322 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: 
\"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.580356 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.580399 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.580451 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.580507 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.580556 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.580608 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.580654 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.580713 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.580759 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.580803 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod 
\"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.580838 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.580885 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.580930 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.580973 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.581805 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.581878 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.581920 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.579166 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.579340 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.579379 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.579558 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.582295 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.579605 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.579966 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.580228 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.580526 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.580634 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.580717 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.580876 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.578601 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.581581 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.581894 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.582054 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.582273 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.582273 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.582825 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.583209 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.583222 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.583455 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.584335 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.584345 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.584548 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.584611 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.584627 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.584983 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.585516 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.585605 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.585781 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.585947 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.585870 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). 
InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.586026 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.586117 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.586139 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.586333 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.586352 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.586380 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.586429 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.587087 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). 
InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.587715 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.589190 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.589309 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.589474 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.589899 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.589964 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.590007 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.590098 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.590144 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.590184 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.590267 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.590310 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.590351 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.590381 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.590421 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.590352 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.590451 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.590440 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.590496 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.590566 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.590710 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.591222 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.591393 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.591470 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.591558 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.591722 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.592950 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.593004 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.593030 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.593051 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.593083 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.593117 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.593305 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.593340 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.590979 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.591985 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.591996 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.592014 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.592139 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.592168 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.592449 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.594064 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.592459 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.592870 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.593074 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.593099 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.591858 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.594275 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.594516 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.594849 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). 
InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.595032 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.595129 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.595181 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.595436 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.595598 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.595712 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.595765 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.595785 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.595842 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.595907 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.596206 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.596224 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.596233 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.596314 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.596713 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.596750 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.596803 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.596850 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.596882 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.596930 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.597077 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.597218 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.597276 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.597425 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.597413 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.597475 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.597493 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.597595 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.597788 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.597816 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.597903 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.597936 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.597803 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.598157 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.598142 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.598264 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.598535 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.598563 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.598575 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.598715 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.598749 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.598784 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.598811 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.598841 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.598869 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.598906 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.598948 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.598981 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.599016 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.599047 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.599072 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.599096 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.599119 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.599142 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.599166 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.599188 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.599244 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.599260 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.599274 4732 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.599289 4732 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.599303 4732 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.599315 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.599327 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.599340 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.599353 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.599365 4732 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.599377 4732 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node 
\"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.599390 4732 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.599404 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.599418 4732 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.599431 4732 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.599445 4732 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.599457 4732 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.599470 4732 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.599482 4732 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.599494 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: E0131 09:01:23.599016 4732 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 31 09:01:23 crc kubenswrapper[4732]: E0131 09:01:23.599593 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-31 09:01:24.099567208 +0000 UTC m=+22.405443412 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 31 09:01:23 crc kubenswrapper[4732]: E0131 09:01:23.599900 4732 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 31 09:01:23 crc kubenswrapper[4732]: E0131 09:01:23.599970 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-31 09:01:24.09994946 +0000 UTC m=+22.405825754 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.599993 4732 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.600948 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.601459 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.602366 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.602724 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.602814 4732 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.604556 4732 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.604586 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.604603 4732 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.604620 4732 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.604635 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.604649 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.604677 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.604800 4732 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.604814 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.604828 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.604840 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.604854 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.604865 4732 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.604877 4732 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.604889 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.604900 4732 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.604938 4732 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.604951 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.604964 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.604977 4732 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.604987 4732 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.605060 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.605073 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.605087 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.605118 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: 
\"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.605131 4732 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.605144 4732 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.605173 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.605185 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.605334 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.605351 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.605364 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.605376 4732 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.605389 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.605400 4732 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.605410 4732 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.605421 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.605433 4732 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.605445 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.605458 4732 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.605470 4732 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.605482 4732 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.605496 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.605508 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.605520 4732 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.605531 4732 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.605544 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.605557 4732 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.605570 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.605583 4732 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.605594 4732 reconciler_common.go:293] "Volume detached for 
volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.605607 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.605619 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.605630 4732 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.605642 4732 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.605652 4732 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.605693 4732 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.605707 4732 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.605719 4732 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.605729 4732 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.605740 4732 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.602805 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.603061 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.603077 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.603598 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.603743 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.603746 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.603879 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.604117 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.605751 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.605909 4732 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.605941 4732 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.605952 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.605963 4732 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.605973 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.605983 4732 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.606013 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.606024 4732 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.606033 4732 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.606044 4732 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.606054 4732 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.606067 4732 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 
09:01:23.606102 4732 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.606113 4732 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.606123 4732 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.606132 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.606142 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.606169 4732 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.606182 4732 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.606197 4732 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.606208 4732 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.606220 4732 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.606255 4732 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.606268 4732 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.606280 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.606292 4732 
reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.606303 4732 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.606337 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.606354 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.606369 4732 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.606381 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.606414 4732 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.606427 4732 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.606440 4732 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.606452 4732 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.606464 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.606474 4732 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.606512 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.606524 4732 reconciler_common.go:293] "Volume detached for volume 
\"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.606536 4732 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.606548 4732 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.606583 4732 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.606596 4732 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.606609 4732 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.606622 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.606634 4732 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.606680 4732 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.606693 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.606708 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.606718 4732 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.606754 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.606766 4732 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.606777 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.606790 4732 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.606801 4732 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.606836 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.606850 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.606862 4732 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.606873 4732 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.606912 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.606931 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.606942 4732 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.606955 4732 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.606968 4732 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.606979 4732 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.604503 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.611481 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.612593 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: E0131 09:01:23.612616 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 31 09:01:23 crc kubenswrapper[4732]: E0131 09:01:23.612641 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.612642 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: E0131 09:01:23.612674 4732 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 09:01:23 crc kubenswrapper[4732]: E0131 09:01:23.612756 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-31 09:01:24.112728875 +0000 UTC m=+22.418605079 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.612738 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.612824 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.612993 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.616575 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.616750 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.617568 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.617768 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.617787 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.617607 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 31 09:01:23 crc kubenswrapper[4732]: E0131 09:01:23.617967 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 31 09:01:23 crc kubenswrapper[4732]: E0131 09:01:23.617998 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 31 09:01:23 crc kubenswrapper[4732]: E0131 09:01:23.618017 4732 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.618274 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: E0131 09:01:23.618293 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-31 09:01:24.118249808 +0000 UTC m=+22.424126082 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.618280 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.618369 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.618982 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.619263 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.619782 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.620461 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.625600 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.625785 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.625909 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.626031 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.626077 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.626114 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.626418 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.626513 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.627024 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.630192 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.631158 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.634736 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.634778 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.635328 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.636010 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.650625 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.661178 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.662222 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.666822 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.667351 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.668218 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.669970 4732 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c05af9bd69451273c25ddddf6b11cdd7b8f9b6e3e16d18f1c4a962321905696e" exitCode=255 Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.670045 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"c05af9bd69451273c25ddddf6b11cdd7b8f9b6e3e16d18f1c4a962321905696e"} Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.670119 4732 scope.go:117] "RemoveContainer" containerID="849206839f7a8bd7f8a6a6fa78eda460866d8039008158f82dcdc4743bd4eed2" Jan 31 09:01:23 crc kubenswrapper[4732]: E0131 09:01:23.681101 4732 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-apiserver-crc\" already exists" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.681433 4732 scope.go:117] "RemoveContainer" containerID="c05af9bd69451273c25ddddf6b11cdd7b8f9b6e3e16d18f1c4a962321905696e" Jan 31 09:01:23 crc kubenswrapper[4732]: E0131 09:01:23.682558 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.685742 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.697210 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.707128 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.707813 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.707901 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.707960 4732 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.707960 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.707981 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.708000 4732 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.708017 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.708033 4732 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.708073 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.708122 4732 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.708143 4732 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.708152 4732 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.708162 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.708170 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.708180 4732 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.708191 4732 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.708228 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.708250 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.708265 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.708282 4732 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 
09:01:23.708297 4732 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.708312 4732 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.708327 4732 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.708341 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.708355 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.708369 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.708384 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.708439 4732 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.708456 4732 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.708472 4732 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.708518 4732 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.708533 4732 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.708548 4732 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.708563 4732 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.708622 4732 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.708637 4732 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.708653 4732 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.708718 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.708734 4732 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.708749 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.708797 4732 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.708812 4732 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.708824 4732 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.708866 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.708886 4732 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.708897 4732 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc 
kubenswrapper[4732]: I0131 09:01:23.708917 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.708957 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.708971 4732 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.708982 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.716555 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.728703 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1dc686f-fa2d-4cd4-95d4-874df2dc3a1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b5cbd45e357205f8bfc648396c0f684d6788343f687504feb45618d9082e5cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b8ba07937855cb1fd0e1db567f90e08848fca583b7174153918e2d6356d722\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ce42a35726dd1fc6aea0eb2e5baa490de04566cd6d6298185260e896a8def7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c05af9bd69451273c25ddddf6b11cdd7b8f9b6e3e16d18f1c4a962321905696e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://849206839f7a8bd7f8a6a6fa78eda460866d8039008158f82dcdc4743bd4eed2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:01:07Z\\\",\\\"message\\\":\\\"W0131 09:01:06.636539 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0131 
09:01:06.636963 1 crypto.go:601] Generating new CA for check-endpoints-signer@1769850066 cert, and key in /tmp/serving-cert-2193001133/serving-signer.crt, /tmp/serving-cert-2193001133/serving-signer.key\\\\nI0131 09:01:06.851895 1 observer_polling.go:159] Starting file observer\\\\nW0131 09:01:06.856742 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0131 09:01:06.856927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:01:06.857621 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2193001133/tls.crt::/tmp/serving-cert-2193001133/tls.key\\\\\\\"\\\\nF0131 09:01:07.202355 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c05af9bd69451273c25ddddf6b11cdd7b8f9b6e3e16d18f1c4a962321905696e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0131 09:01:18.226954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:01:18.227544 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1858903392/tls.crt::/tmp/serving-cert-1858903392/tls.key\\\\\\\"\\\\nI0131 09:01:23.404020 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:01:23.413560 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:01:23.413612 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:01:23.413655 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:01:23.413760 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:01:23.423398 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 09:01:23.423434 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:01:23.423441 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:01:23.423448 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:01:23.423452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0131 09:01:23.423448 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 09:01:23.423457 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:01:23.423476 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 09:01:23.426339 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83c2048db4772c1a5f60b21340f773ab5c651045d4dac9edc15e91a19d15dce9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.740249 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.750493 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.799745 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.804460 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.815099 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.816770 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.817165 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.822953 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.823722 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.826952 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 09:01:23 crc kubenswrapper[4732]: W0131 09:01:23.831539 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-c78415f5f73bd74f106115622a3e6ff0efd80249c0ec92e76df9492aa9376025 WatchSource:0}: Error finding container c78415f5f73bd74f106115622a3e6ff0efd80249c0ec92e76df9492aa9376025: Status 404 returned error can't find the container with id c78415f5f73bd74f106115622a3e6ff0efd80249c0ec92e76df9492aa9376025 Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.838023 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 09:01:23 crc kubenswrapper[4732]: W0131 09:01:23.844115 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-73dfb0efcbb71b6296024c863031fb7180dfca4437ee985ef84d063cf78fd984 WatchSource:0}: Error finding container 73dfb0efcbb71b6296024c863031fb7180dfca4437ee985ef84d063cf78fd984: Status 404 returned error can't find the container with id 73dfb0efcbb71b6296024c863031fb7180dfca4437ee985ef84d063cf78fd984 Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.851069 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1dc686f-fa2d-4cd4-95d4-874df2dc3a1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b5cbd45e357205f8bfc648396c0f684d6788343f687504feb45618d9082e5cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b8ba07937855cb1fd0e1db567f90e08848fca583b7174153918e2d6356d722\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ce42a35726dd1fc6aea0eb2e5baa490de04566cd6d6298185260e896a8def7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c05af9bd69451273c25ddddf6b11cdd7b8f9b6e3e16d18f1c4a962321905696e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://849206839f7a8bd7f8a6a6fa78eda460866d8039008158f82dcdc4743bd4eed2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:01:07Z\\\",\\\"message\\\":\\\"W0131 09:01:06.636539 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0131 09:01:06.636963 1 crypto.go:601] Generating new CA for check-endpoints-signer@1769850066 cert, and key in /tmp/serving-cert-2193001133/serving-signer.crt, /tmp/serving-cert-2193001133/serving-signer.key\\\\nI0131 09:01:06.851895 1 observer_polling.go:159] Starting file observer\\\\nW0131 09:01:06.856742 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0131 09:01:06.856927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:01:06.857621 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2193001133/tls.crt::/tmp/serving-cert-2193001133/tls.key\\\\\\\"\\\\nF0131 09:01:07.202355 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c05af9bd69451273c25ddddf6b11cdd7b8f9b6e3e16d18f1c4a962321905696e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0131 09:01:18.226954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:01:18.227544 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1858903392/tls.crt::/tmp/serving-cert-1858903392/tls.key\\\\\\\"\\\\nI0131 09:01:23.404020 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:01:23.413560 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:01:23.413612 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:01:23.413655 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:01:23.413760 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:01:23.423398 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 09:01:23.423434 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:01:23.423441 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:01:23.423448 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:01:23.423452 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0131 09:01:23.423448 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 09:01:23.423457 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:01:23.423476 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 09:01:23.426339 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83c2048db4772c1a5f60b21340f773ab5c651045d4dac9edc15e91a19d15dce9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.863020 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.872948 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.881632 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.901647 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58b600ed-9e96-4296-bc41-dda5d2205921\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f14a98662959fe8132486d2f648fb443a407e90160ee8c6eac7e2a3d6835b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://771af18934041722ccdbde47137c
83e9d93c5e7a07b1e9b26e18354bcd2fda4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2bea5befd94ee4d3b6501273e8b8365102a35cc07b97740fc893f8d70a9d134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9fd5264b5a2dcec6f8b363208d5654f2d7199660f04be408ee7ab3913d542e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://194a43d5e1409abb031f1b950dfc753454b45008067d784642945ea0dfb087e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://382f1fa043705a0098e27ea34beeee0b88276aa355a7de07bc29b070949b81e2\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382f1fa043705a0098e27ea34beeee0b88276aa355a7de07bc29b070949b81e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b16d2732d6d6381045b88446a322622a7b1271f08438cb60e57c05c52ef7faf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b16d2732d6d6381045b88446a322622a7b1271f08438cb60e57c05c52ef7faf8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://273f5652d37e2be568bc388a7f9e96c66d0ce9201678b46736986815bd25d6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://273f5652d37e2be568bc388a7f9e96c66d0ce9201678b46736986815bd25d6bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.915751 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1dc686f-fa2d-4cd4-95d4-874df2dc3a1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b5cbd45e357205f8bfc648396c0f684d6788343f687504feb45618d9082e5cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b8ba07937855cb1fd0e1db567f90e08848fca583b7174153918e2d6356d722\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ce42a35726dd1fc6aea0eb2e5baa490de04566cd6d6298185260e896a8def7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c05af9bd69451273c25ddddf6b11cdd7b8f9b6e3e16d18f1c4a962321905696e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://849206839f7a8bd7f8a6a6fa78eda460866d8039008158f82dcdc4743bd4eed2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:01:07Z\\\",\\\"message\\\":\\\"W0131 09:01:06.636539 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0131 09:01:06.636963 1 crypto.go:601] Generating new CA for check-endpoints-signer@1769850066 cert, and key in /tmp/serving-cert-2193001133/serving-signer.crt, /tmp/serving-cert-2193001133/serving-signer.key\\\\nI0131 09:01:06.851895 1 observer_polling.go:159] Starting file observer\\\\nW0131 09:01:06.856742 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0131 09:01:06.856927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:01:06.857621 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2193001133/tls.crt::/tmp/serving-cert-2193001133/tls.key\\\\\\\"\\\\nF0131 09:01:07.202355 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c05af9bd69451273c25ddddf6b11cdd7b8f9b6e3e16d18f1c4a962321905696e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0131 09:01:18.226954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:01:18.227544 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1858903392/tls.crt::/tmp/serving-cert-1858903392/tls.key\\\\\\\"\\\\nI0131 09:01:23.404020 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:01:23.413560 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:01:23.413612 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:01:23.413655 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:01:23.413760 1 maxinflight.go:120] \\\\\\\"Set denominator 
for mutating requests\\\\\\\" limit=200\\\\nI0131 09:01:23.423398 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 09:01:23.423434 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:01:23.423441 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:01:23.423448 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:01:23.423452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0131 09:01:23.423448 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 09:01:23.423457 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:01:23.423476 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 09:01:23.426339 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83c2048db4772c1a5f60b21340f773ab5c651045d4dac9edc15e91a19d15dce9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.927504 4732 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.936452 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.945209 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.953445 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.962203 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 09:01:23 crc kubenswrapper[4732]: I0131 09:01:23.971135 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Jan 31 09:01:24 crc kubenswrapper[4732]: I0131 09:01:24.112168 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 31 09:01:24 crc kubenswrapper[4732]: I0131 09:01:24.112260 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 31 09:01:24 crc kubenswrapper[4732]: I0131 09:01:24.112285 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 31 09:01:24 crc kubenswrapper[4732]: E0131 09:01:24.112362 4732 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Jan 31 09:01:24 crc kubenswrapper[4732]: E0131 09:01:24.112369 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:01:25.112344289 +0000 UTC m=+23.418220513 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 31 09:01:24 crc kubenswrapper[4732]: E0131 09:01:24.112398 4732 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Jan 31 09:01:24 crc kubenswrapper[4732]: E0131 09:01:24.112405 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-31 09:01:25.112396881 +0000 UTC m=+23.418273085 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Jan 31 09:01:24 crc kubenswrapper[4732]: E0131 09:01:24.112462 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-31 09:01:25.112448662 +0000 UTC m=+23.418324876 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Jan 31 09:01:24 crc kubenswrapper[4732]: I0131 09:01:24.212925 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 31 09:01:24 crc kubenswrapper[4732]: I0131 09:01:24.213012 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 31 09:01:24 crc kubenswrapper[4732]: E0131 09:01:24.213167 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Jan 31 09:01:24 crc kubenswrapper[4732]: E0131 09:01:24.213169 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Jan 31 09:01:24 crc kubenswrapper[4732]: E0131 09:01:24.213211 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Jan 31 09:01:24 crc kubenswrapper[4732]: E0131 09:01:24.213237 4732 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Jan 31 09:01:24 crc kubenswrapper[4732]: E0131 09:01:24.213184 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Jan 31 09:01:24 crc kubenswrapper[4732]: E0131 09:01:24.213322 4732 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Jan 31 09:01:24 crc kubenswrapper[4732]: E0131 09:01:24.213328 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-31 09:01:25.213299904 +0000 UTC m=+23.519176138 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Jan 31 09:01:24 crc kubenswrapper[4732]: E0131 09:01:24.213367 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-31 09:01:25.213354296 +0000 UTC m=+23.519230510 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Jan 31 09:01:24 crc kubenswrapper[4732]: I0131 09:01:24.495162 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 01:51:34.399556313 +0000 UTC
Jan 31 09:01:24 crc kubenswrapper[4732]: I0131 09:01:24.549129 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes"
Jan 31 09:01:24 crc kubenswrapper[4732]: I0131 09:01:24.550530 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes"
Jan 31 09:01:24 crc kubenswrapper[4732]: I0131 09:01:24.552924 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes"
Jan 31 09:01:24 crc kubenswrapper[4732]: I0131 09:01:24.554584 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes"
Jan 31 09:01:24 crc kubenswrapper[4732]: I0131 09:01:24.556587 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes"
Jan 31 09:01:24 crc kubenswrapper[4732]: I0131 09:01:24.557362 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes"
Jan 31 09:01:24 crc kubenswrapper[4732]: I0131 09:01:24.558241 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes"
Jan 31 09:01:24 crc kubenswrapper[4732]: I0131 09:01:24.559800 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes"
Jan 31 09:01:24 crc kubenswrapper[4732]: I0131 09:01:24.560994 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes"
Jan 31 09:01:24 crc kubenswrapper[4732]: I0131 09:01:24.562599 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes"
Jan 31 09:01:24 crc kubenswrapper[4732]: I0131 09:01:24.563602 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes"
Jan 31 09:01:24 crc kubenswrapper[4732]: I0131 09:01:24.566062 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes"
Jan 31 09:01:24 crc kubenswrapper[4732]: I0131 09:01:24.567182 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes"
Jan 31 09:01:24 crc kubenswrapper[4732]: I0131 09:01:24.568220 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes"
Jan 31 09:01:24 crc kubenswrapper[4732]: I0131 09:01:24.569805 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes"
Jan 31 09:01:24 crc kubenswrapper[4732]: I0131 09:01:24.570641 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes"
Jan 31 09:01:24 crc kubenswrapper[4732]: I0131 09:01:24.572053 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes"
Jan 31 09:01:24 crc kubenswrapper[4732]: I0131 09:01:24.572753 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes"
Jan 31 09:01:24 crc kubenswrapper[4732]: I0131 09:01:24.573556 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes"
Jan 31 09:01:24 crc kubenswrapper[4732]: I0131 09:01:24.576469 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes"
Jan 31 09:01:24 crc kubenswrapper[4732]: I0131 09:01:24.577225 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes"
Jan 31 09:01:24 crc kubenswrapper[4732]: I0131 09:01:24.578769 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes"
Jan 31 09:01:24 crc kubenswrapper[4732]: I0131 09:01:24.579408 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes"
Jan 31 09:01:24 crc kubenswrapper[4732]: I0131 09:01:24.581197 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes"
Jan 31 09:01:24 crc kubenswrapper[4732]: I0131 09:01:24.581911 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes"
Jan 31 09:01:24 crc kubenswrapper[4732]: I0131 09:01:24.582504 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes"
Jan 31 09:01:24 crc kubenswrapper[4732]: I0131 09:01:24.583684 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes"
Jan 31 09:01:24 crc kubenswrapper[4732]: I0131 09:01:24.584223 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes"
Jan 31 09:01:24 crc kubenswrapper[4732]: I0131 09:01:24.585921 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes"
Jan 31 09:01:24 crc kubenswrapper[4732]: I0131 09:01:24.587047 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes"
Jan 31 09:01:24 crc kubenswrapper[4732]: I0131 09:01:24.588094 4732 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6"
Jan 31 09:01:24 crc kubenswrapper[4732]: I0131 09:01:24.588365 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes"
Jan 31 09:01:24 crc kubenswrapper[4732]: I0131 09:01:24.591073 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes"
Jan 31 09:01:24 crc kubenswrapper[4732]: I0131 09:01:24.592335 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes"
Jan 31 09:01:24 crc kubenswrapper[4732]: I0131 09:01:24.593182 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes"
Jan 31 09:01:24 crc kubenswrapper[4732]: I0131 09:01:24.595900 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes"
Jan 31 09:01:24 crc kubenswrapper[4732]: I0131 09:01:24.599531 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes"
Jan 31 09:01:24 crc kubenswrapper[4732]: I0131 09:01:24.600317 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes"
Jan 31 09:01:24 crc kubenswrapper[4732]: I0131 09:01:24.601959 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes"
Jan 31 09:01:24 crc kubenswrapper[4732]: I0131 09:01:24.602877 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes"
Jan 31 09:01:24 crc kubenswrapper[4732]: I0131 09:01:24.603955 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes"
Jan 31 09:01:24 crc kubenswrapper[4732]: I0131 09:01:24.604912 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes"
Jan 31 09:01:24 crc kubenswrapper[4732]: I0131 09:01:24.606285 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes"
Jan 31 09:01:24 crc kubenswrapper[4732]: I0131 09:01:24.607150 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes"
Jan 31 09:01:24 crc kubenswrapper[4732]: I0131 09:01:24.608213 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes"
Jan 31 09:01:24 crc kubenswrapper[4732]: I0131 09:01:24.608903 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes"
Jan 31 09:01:24 crc kubenswrapper[4732]: I0131 09:01:24.609933 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes"
Jan 31 09:01:24 crc kubenswrapper[4732]: I0131 09:01:24.610649 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes"
Jan 31 09:01:24 crc kubenswrapper[4732]: I0131 09:01:24.611637 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes"
Jan 31 09:01:24 crc kubenswrapper[4732]: I0131 09:01:24.612120 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes"
Jan 31 09:01:24 crc kubenswrapper[4732]: I0131 09:01:24.613049 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes"
Jan 31 09:01:24 crc kubenswrapper[4732]: I0131 09:01:24.613565 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes"
Jan 31 09:01:24 crc kubenswrapper[4732]: I0131 09:01:24.614237 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes"
Jan 31 09:01:24 crc kubenswrapper[4732]: I0131 09:01:24.615140 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes"
Jan 31 09:01:24 crc kubenswrapper[4732]: I0131 09:01:24.676272 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"73dfb0efcbb71b6296024c863031fb7180dfca4437ee985ef84d063cf78fd984"}
Jan 31 09:01:24 crc kubenswrapper[4732]: I0131 09:01:24.680481 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"d7423f7d2a82ae276cff0f5a2730b0de7abbc5b28a8bd8983d23fe44c2e9ca3b"}
Jan 31 09:01:24 crc kubenswrapper[4732]: I0131 09:01:24.680530 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"b075971e7f28f40b90caae8b3681dd283860371a6b3f82c92c421a87a27ccb0d"}
Jan 31 09:01:24 crc kubenswrapper[4732]: I0131 09:01:24.680552 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"c78415f5f73bd74f106115622a3e6ff0efd80249c0ec92e76df9492aa9376025"}
Jan 31 09:01:24 crc kubenswrapper[4732]: I0131 09:01:24.682407 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"62650e77e5971d617a77ae7461d51276fc76e3699dc4f86ebbb9b63ab0d86b60"}
Jan 31 09:01:24 crc kubenswrapper[4732]: I0131 09:01:24.682444 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"efee6db2c1ab7fee455d0c74ac27cef0a722fee52555e5f007cb579a2e59ddbb"}
Jan 31 09:01:24 crc kubenswrapper[4732]: I0131 09:01:24.686341 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Jan 31 09:01:24 crc kubenswrapper[4732]: I0131 09:01:24.695012 4732 scope.go:117] "RemoveContainer" containerID="c05af9bd69451273c25ddddf6b11cdd7b8f9b6e3e16d18f1c4a962321905696e"
Jan 31 09:01:24 crc kubenswrapper[4732]: E0131 09:01:24.695289 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Jan 31 09:01:24 crc kubenswrapper[4732]: E0131 09:01:24.705967 4732 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-crc\" already exists" pod="openshift-etcd/etcd-crc"
Jan 31 09:01:24 crc kubenswrapper[4732]: I0131 09:01:24.723002 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58b600ed-9e96-4296-bc41-dda5d2205921\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f14a98662959fe8132486d2f648fb443a407e90160ee8c6eac7e2a3d6835b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://771af18934041722ccdbde47137c83e9d93c5e7a07b1e9b26e18354bcd2fda4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2bea5befd94ee4d3b6501273e8b8365102a35cc07b97740fc893f8d70a9d134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9fd5264b5a2dcec6f8b363208d5654f2d71996
60f04be408ee7ab3913d542e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://194a43d5e1409abb031f1b950dfc753454b45008067d784642945ea0dfb087e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://382f1fa043705a0098e27ea34beeee0b88276aa355a7de07bc29b070949b81e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382f1fa043705a0098e27ea34beeee0b88276aa355a7de07bc29b070949b81e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b16d2732d6d6381045b88446a322622a7b1271f08438cb60e57c05c52ef7faf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b16d2732d6d6381045b88446a322622a7b1271f08438cb60e57c05c52ef7faf8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://273f5652d37e2be568bc388a7f9e96c66d0ce9201678b46736986815bd25d6bf\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://273f5652d37e2be568bc388a7f9e96c66d0ce9201678b46736986815bd25d6bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:24Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:24 crc kubenswrapper[4732]: I0131 09:01:24.738880 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1dc686f-fa2d-4cd4-95d4-874df2dc3a1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b5cbd45e357205f8bfc648396c0f684d6788343f687504feb45618d9082e5cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b8ba07937855cb1fd0e1db567f90e08848fca583b7174153918e2d6356d722\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ce42a35726dd1fc6aea0eb2e5baa490de04566cd6d6298185260e896a8def7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c05af9bd69451273c25ddddf6b11cdd7b8f9b6e3e16d18f1c4a962321905696e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://849206839f7a8bd7f8a6a6fa78eda460866d8039008158f82dcdc4743bd4eed2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:01:07Z\\\",\\\"message\\\":\\\"W0131 09:01:06.636539 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0131 
09:01:06.636963 1 crypto.go:601] Generating new CA for check-endpoints-signer@1769850066 cert, and key in /tmp/serving-cert-2193001133/serving-signer.crt, /tmp/serving-cert-2193001133/serving-signer.key\\\\nI0131 09:01:06.851895 1 observer_polling.go:159] Starting file observer\\\\nW0131 09:01:06.856742 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0131 09:01:06.856927 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:01:06.857621 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2193001133/tls.crt::/tmp/serving-cert-2193001133/tls.key\\\\\\\"\\\\nF0131 09:01:07.202355 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c05af9bd69451273c25ddddf6b11cdd7b8f9b6e3e16d18f1c4a962321905696e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0131 09:01:18.226954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:01:18.227544 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1858903392/tls.crt::/tmp/serving-cert-1858903392/tls.key\\\\\\\"\\\\nI0131 09:01:23.404020 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:01:23.413560 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:01:23.413612 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:01:23.413655 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:01:23.413760 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:01:23.423398 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 09:01:23.423434 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:01:23.423441 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:01:23.423448 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:01:23.423452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0131 09:01:23.423448 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 09:01:23.423457 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:01:23.423476 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 09:01:23.426339 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83c2048db4772c1a5f60b21340f773ab5c651045d4dac9edc15e91a19d15dce9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:24Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:24 crc kubenswrapper[4732]: I0131 09:01:24.758843 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:24Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:24 crc kubenswrapper[4732]: I0131 09:01:24.779790 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7423f7d2a82ae276cff0f5a2730b0de7abbc5b28a8bd8983d23fe44c2e9ca3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b075971e7f28f40b90caae8b3681dd283860371a6b3f82c92c421a87a27ccb0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:24Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:24 crc kubenswrapper[4732]: I0131 09:01:24.799860 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:24Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:24 crc kubenswrapper[4732]: I0131 09:01:24.817385 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:24Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:24 crc kubenswrapper[4732]: I0131 09:01:24.834520 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:24Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:24 crc kubenswrapper[4732]: I0131 09:01:24.847707 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:24Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:24 crc kubenswrapper[4732]: I0131 09:01:24.871359 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58b600ed-9e96-4296-bc41-dda5d2205921\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f14a98662959fe8132486d2f648fb443a407e90160ee8c6eac7e2a3d6835b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://771af18934041722ccdbde47137c83e9d93c5e7a07b1e9b26e18354bcd2fda4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2bea5befd94ee4d3b6501273e8b8365102a35cc07b97740fc893f8d70a9d134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9fd5264b5a2dcec6f8b363208d5654f2d7199660f04be408ee7ab3913d542e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://194a43d5e1409abb031f1b950dfc753454b45008067d784642945ea0dfb087e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://382f1fa043705a0098e27ea34beeee0b88276aa355a7de07bc29b070949b81e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382f1fa043705a0098e27ea34beeee0b88276aa355a7de07bc29b070949b81e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b16d2732d6d6381045b88446a322622a7b1271f08438cb60e57c05c52ef7faf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b16d2732d6d6381045b88446a322622a7b1271f08438cb60e57c05c52ef7faf8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://273f5652d37e2be568bc388a7f9e96c66d0ce9201678b46736986815bd25d6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://273f5652d37e2be568bc388a7f9e96c66d0ce9201678b46736986815bd25d6bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:24Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:24 crc kubenswrapper[4732]: I0131 09:01:24.888623 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1dc686f-fa2d-4cd4-95d4-874df2dc3a1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b5cbd45e357205f8bfc648396c0f684d6788343f687504feb45618d9082e5cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b8ba07937855cb1fd0e1db567f90e08848fca583b7174153918e2d6356d722\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ce42a35726dd1fc6aea0eb2e5baa490de04566cd6d6298185260e896a8def7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c05af9bd69451273c25ddddf6b11cdd7b8f9b6e3e16d18f1c4a962321905696e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c05af9bd69451273c25ddddf6b11cdd7b8f9b6e3e16d18f1c4a962321905696e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0131 09:01:18.226954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:01:18.227544 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1858903392/tls.crt::/tmp/serving-cert-1858903392/tls.key\\\\\\\"\\\\nI0131 09:01:23.404020 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:01:23.413560 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:01:23.413612 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:01:23.413655 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:01:23.413760 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:01:23.423398 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 09:01:23.423434 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:01:23.423441 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:01:23.423448 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:01:23.423452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0131 09:01:23.423448 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 09:01:23.423457 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:01:23.423476 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 09:01:23.426339 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83c2048db4772c1a5f60b21340f773ab5c651045d4dac9edc15e91a19d15dce9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:24Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:24 crc kubenswrapper[4732]: I0131 09:01:24.909175 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62650e77e5971d617a77ae7461d51276fc76e3699dc4f86ebbb9b63ab0d86b60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:24Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:24 crc kubenswrapper[4732]: I0131 09:01:24.924088 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7423f7d2a82ae276cff0f5a2730b0de7abbc5b28a8bd8983d23fe44c2e9ca3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b075971e7f28f40b90caae8b3681dd283860371a6b3f82c92c421a87a27ccb0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:24Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:24 crc kubenswrapper[4732]: I0131 09:01:24.938761 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:24Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:24 crc kubenswrapper[4732]: I0131 09:01:24.952501 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:24Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:24 crc kubenswrapper[4732]: I0131 09:01:24.965899 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:24Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:24 crc kubenswrapper[4732]: I0131 09:01:24.979294 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:24Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:25 crc kubenswrapper[4732]: I0131 09:01:25.124387 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:01:25 crc kubenswrapper[4732]: I0131 09:01:25.124558 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:01:25 crc kubenswrapper[4732]: E0131 09:01:25.124617 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:01:27.124582608 +0000 UTC m=+25.430458832 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:01:25 crc kubenswrapper[4732]: I0131 09:01:25.124683 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:01:25 crc kubenswrapper[4732]: E0131 09:01:25.124748 4732 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 31 09:01:25 crc kubenswrapper[4732]: E0131 09:01:25.124784 4732 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 31 09:01:25 crc kubenswrapper[4732]: E0131 09:01:25.124816 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-31 09:01:27.124797595 +0000 UTC m=+25.430673839 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 31 09:01:25 crc kubenswrapper[4732]: E0131 09:01:25.124840 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-31 09:01:27.124827416 +0000 UTC m=+25.430703630 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 31 09:01:25 crc kubenswrapper[4732]: I0131 09:01:25.225995 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:01:25 crc kubenswrapper[4732]: I0131 09:01:25.226074 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:01:25 crc kubenswrapper[4732]: E0131 09:01:25.226187 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 31 09:01:25 crc kubenswrapper[4732]: E0131 09:01:25.226232 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 31 09:01:25 crc kubenswrapper[4732]: E0131 09:01:25.226252 4732 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 09:01:25 crc kubenswrapper[4732]: E0131 09:01:25.226193 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 31 09:01:25 crc kubenswrapper[4732]: E0131 09:01:25.226325 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-31 09:01:27.226299669 +0000 UTC m=+25.532175893 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 09:01:25 crc kubenswrapper[4732]: E0131 09:01:25.226343 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 31 09:01:25 crc kubenswrapper[4732]: E0131 09:01:25.226359 4732 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 09:01:25 crc kubenswrapper[4732]: E0131 09:01:25.226399 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-31 09:01:27.226385592 +0000 UTC m=+25.532261816 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 09:01:25 crc kubenswrapper[4732]: I0131 09:01:25.495614 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 19:31:09.62740828 +0000 UTC Jan 31 09:01:25 crc kubenswrapper[4732]: I0131 09:01:25.542499 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:01:25 crc kubenswrapper[4732]: I0131 09:01:25.542616 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:01:25 crc kubenswrapper[4732]: E0131 09:01:25.542690 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:01:25 crc kubenswrapper[4732]: I0131 09:01:25.542618 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:01:25 crc kubenswrapper[4732]: E0131 09:01:25.542811 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:01:25 crc kubenswrapper[4732]: E0131 09:01:25.543010 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:01:26 crc kubenswrapper[4732]: I0131 09:01:26.496611 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 02:45:47.948949585 +0000 UTC Jan 31 09:01:26 crc kubenswrapper[4732]: I0131 09:01:26.702903 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"c063f211d6d06c3f8b94d316d68da346d077c6e85197754ec7c77ad736fd1217"} Jan 31 09:01:26 crc kubenswrapper[4732]: I0131 09:01:26.728125 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58b600ed-9e96-4296-bc41-dda5d2205921\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f14a98662959fe8132486d2f648fb443a407e90160ee8c6eac7e2a3d6835b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://771af18934041722ccdbde47137c83e9d93c5e7a07b1e9b26e18354bcd2fda4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b
4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2bea5befd94ee4d3b6501273e8b8365102a35cc07b97740fc893f8d70a9d134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9fd5264b5a2dcec6f8b363208d5654f2d7199660f04be408ee7ab3913d542e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://194a43d5e1409abb031f1b950dfc753454b45008067d784642945ea0dfb087e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://382f1fa043705a0098e27ea34beeee0b88276aa355a7de07bc29b070949b81e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\
",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382f1fa043705a0098e27ea34beeee0b88276aa355a7de07bc29b070949b81e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b16d2732d6d6381045b88446a322622a7b1271f08438cb60e57c05c52ef7faf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b16d2732d6d6381045b88446a322622a7b1271f08438cb60e57c05c52ef7faf8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://273f5652d37e2be568bc388a7f9e96c66d0ce9201678b46736986815bd25d6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://273f5652d37e2be568bc388a7f9e96c66d0ce9201678b46736986815bd25d6bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:26Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:26 crc kubenswrapper[4732]: I0131 09:01:26.750553 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1dc686f-fa2d-4cd4-95d4-874df2dc3a1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b5cbd45e357205f8bfc648396c0f684d6788343f687504feb45618d9082e5cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b8ba07937855cb1fd0e1db567f90e08848fca583b7174153918e2d6356d722\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ce42a35726dd1fc6aea0eb2e5baa490de04566cd6d6298185260e896a8def7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c05af9bd69451273c25ddddf6b11cdd7b8f9b6e3e16d18f1c4a962321905696e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c05af9bd69451273c25ddddf6b11cdd7b8f9b6e3e16d18f1c4a962321905696e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0131 09:01:18.226954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:01:18.227544 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1858903392/tls.crt::/tmp/serving-cert-1858903392/tls.key\\\\\\\"\\\\nI0131 09:01:23.404020 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:01:23.413560 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:01:23.413612 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:01:23.413655 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:01:23.413760 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:01:23.423398 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 09:01:23.423434 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:01:23.423441 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:01:23.423448 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:01:23.423452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0131 09:01:23.423448 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 09:01:23.423457 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:01:23.423476 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 09:01:23.426339 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83c2048db4772c1a5f60b21340f773ab5c651045d4dac9edc15e91a19d15dce9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:26Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:26 crc kubenswrapper[4732]: I0131 09:01:26.778751 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62650e77e5971d617a77ae7461d51276fc76e3699dc4f86ebbb9b63ab0d86b60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:26Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:26 crc kubenswrapper[4732]: I0131 09:01:26.800387 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7423f7d2a82ae276cff0f5a2730b0de7abbc5b28a8bd8983d23fe44c2e9ca3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b075971e7f28f40b90caae8b3681dd283860371a6b3f82c92c421a87a27ccb0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:26Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:26 crc kubenswrapper[4732]: I0131 09:01:26.817357 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:26Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:26 crc kubenswrapper[4732]: I0131 09:01:26.832563 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:26Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:26 crc kubenswrapper[4732]: I0131 09:01:26.844879 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:26Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:26 crc kubenswrapper[4732]: I0131 09:01:26.857530 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c063f211d6d06c3f8b94d316d68da346d077c6e85197754ec7c77ad736fd1217\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:26Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:27 crc kubenswrapper[4732]: I0131 09:01:27.059224 4732 csr.go:261] certificate signing request csr-xgd9j is approved, waiting to be issued Jan 31 09:01:27 crc kubenswrapper[4732]: I0131 09:01:27.102860 4732 csr.go:257] certificate signing request csr-xgd9j is issued Jan 31 09:01:27 crc 
kubenswrapper[4732]: I0131 09:01:27.132071 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-nsgpk"] Jan 31 09:01:27 crc kubenswrapper[4732]: I0131 09:01:27.132493 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-nsgpk" Jan 31 09:01:27 crc kubenswrapper[4732]: I0131 09:01:27.135274 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Jan 31 09:01:27 crc kubenswrapper[4732]: I0131 09:01:27.135487 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Jan 31 09:01:27 crc kubenswrapper[4732]: I0131 09:01:27.143330 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:01:27 crc kubenswrapper[4732]: E0131 09:01:27.143486 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:01:31.143448511 +0000 UTC m=+29.449324725 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:01:27 crc kubenswrapper[4732]: I0131 09:01:27.143576 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:01:27 crc kubenswrapper[4732]: I0131 09:01:27.143623 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:01:27 crc kubenswrapper[4732]: E0131 09:01:27.143754 4732 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 31 09:01:27 crc kubenswrapper[4732]: E0131 09:01:27.143804 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-31 09:01:31.143795863 +0000 UTC m=+29.449672067 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 31 09:01:27 crc kubenswrapper[4732]: E0131 09:01:27.143818 4732 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 31 09:01:27 crc kubenswrapper[4732]: E0131 09:01:27.143916 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-31 09:01:31.143888246 +0000 UTC m=+29.449764520 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 31 09:01:27 crc kubenswrapper[4732]: I0131 09:01:27.148314 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Jan 31 09:01:27 crc kubenswrapper[4732]: I0131 09:01:27.174494 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:27Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:27 crc kubenswrapper[4732]: I0131 09:01:27.198776 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:27Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:27 crc kubenswrapper[4732]: I0131 09:01:27.225067 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c063f211d6d06c3f8b94d316d68da346d077c6e85197754ec7c77ad736fd1217\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:27Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:27 crc kubenswrapper[4732]: I0131 09:01:27.232595 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-bllbs"] Jan 31 09:01:27 crc kubenswrapper[4732]: I0131 09:01:27.232963 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-bllbs" Jan 31 09:01:27 crc kubenswrapper[4732]: I0131 09:01:27.235107 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Jan 31 09:01:27 crc kubenswrapper[4732]: I0131 09:01:27.235337 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Jan 31 09:01:27 crc kubenswrapper[4732]: I0131 09:01:27.235465 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Jan 31 09:01:27 crc kubenswrapper[4732]: I0131 09:01:27.235587 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Jan 31 09:01:27 crc kubenswrapper[4732]: I0131 09:01:27.244781 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/533741c8-f72a-4834-ad02-d33fc939e529-hosts-file\") pod \"node-resolver-nsgpk\" (UID: \"533741c8-f72a-4834-ad02-d33fc939e529\") " pod="openshift-dns/node-resolver-nsgpk" Jan 31 09:01:27 crc kubenswrapper[4732]: I0131 09:01:27.244843 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:01:27 crc kubenswrapper[4732]: I0131 09:01:27.244873 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjff8\" (UniqueName: \"kubernetes.io/projected/533741c8-f72a-4834-ad02-d33fc939e529-kube-api-access-gjff8\") pod \"node-resolver-nsgpk\" (UID: \"533741c8-f72a-4834-ad02-d33fc939e529\") " pod="openshift-dns/node-resolver-nsgpk" Jan 31 09:01:27 crc kubenswrapper[4732]: I0131 09:01:27.244995 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:01:27 crc kubenswrapper[4732]: E0131 09:01:27.245026 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 31 09:01:27 crc kubenswrapper[4732]: E0131 09:01:27.245049 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 31 09:01:27 crc kubenswrapper[4732]: E0131 09:01:27.245067 4732 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 09:01:27 crc kubenswrapper[4732]: E0131 09:01:27.245140 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered 
Jan 31 09:01:27 crc kubenswrapper[4732]: E0131 09:01:27.245148 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-31 09:01:31.24512547 +0000 UTC m=+29.551001754 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 09:01:27 crc kubenswrapper[4732]: E0131 09:01:27.245161 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 31 09:01:27 crc kubenswrapper[4732]: E0131 09:01:27.245173 4732 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 09:01:27 crc kubenswrapper[4732]: E0131 09:01:27.245221 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-31 09:01:31.245203743 +0000 UTC m=+29.551079947 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 09:01:27 crc kubenswrapper[4732]: I0131 09:01:27.245823 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nsgpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"533741c8-f72a-4834-ad02-d33fc939e529\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjff8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nsgpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:27Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:27 crc kubenswrapper[4732]: I0131 09:01:27.255491 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 31 09:01:27 crc kubenswrapper[4732]: I0131 09:01:27.265072 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 31 09:01:27 crc kubenswrapper[4732]: I0131 09:01:27.265036 4732 status_manager.go:875] "Failed 
to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1dc686f-fa2d-4cd4-95d4-874df2dc3a1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b5cbd45e357205f8bfc648396c0f684d6788343f687504feb45618d9082e5cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b8ba07937855cb1fd0e1db567f90e08848fca583b7174153918e2d6356d722\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ce42a35726dd1fc6aea0eb2e5baa490de04566cd6d6298185260e896a8def7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c05af9bd69451273c25ddddf6b11cdd7b8f9b6e3e16d18f1c4a962321905696e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c05af9bd69451273c25ddddf6b11cdd7b8f9b6e3e16d18f1c4a962321905696e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0131 09:01:18.226954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:01:18.227544 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1858903392/tls.crt::/tmp/serving-cert-1858903392/tls.key\\\\\\\"\\\\nI0131 09:01:23.404020 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:01:23.413560 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:01:23.413612 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:01:23.413655 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:01:23.413760 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:01:23.423398 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 09:01:23.423434 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:01:23.423441 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:01:23.423448 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:01:23.423452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0131 09:01:23.423448 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 09:01:23.423457 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:01:23.423476 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 09:01:23.426339 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83c2048db4772c1a5f60b21340f773ab5c651045d4dac9edc15e91a19d15dce9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:27Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:27 crc kubenswrapper[4732]: I0131 09:01:27.280930 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62650e77e5971d617a77ae7461d51276fc76e3699dc4f86ebbb9b63ab0d86b60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:27Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:27 crc kubenswrapper[4732]: I0131 09:01:27.295008 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7423f7d2a82ae276cff0f5a2730b0de7abbc5b28a8bd8983d23fe44c2e9ca3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b075971e7f28f40b90caae8b3681dd283860371a6b3f82c92c421a87a27ccb0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:27Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:27 crc kubenswrapper[4732]: I0131 09:01:27.307308 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:27Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:27 crc kubenswrapper[4732]: I0131 09:01:27.324930 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58b600ed-9e96-4296-bc41-dda5d2205921\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f14a98662959fe8132486d2f648fb443a407e90160ee8c6eac7e2a3d6835b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://771af18934041722ccdbde47137c83e9d93c5e7a07b1e9b26e18354bcd2fda4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2bea5befd94ee4d3b6501273e8b8365102a35cc07b97740fc893f8d70a9d134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9fd5264b5a2dcec6f8b363208d5654f2d71996
60f04be408ee7ab3913d542e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://194a43d5e1409abb031f1b950dfc753454b45008067d784642945ea0dfb087e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://382f1fa043705a0098e27ea34beeee0b88276aa355a7de07bc29b070949b81e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382f1fa043705a0098e27ea34beeee0b88276aa355a7de07bc29b070949b81e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b16d2732d6d6381045b88446a322622a7b1271f08438cb60e57c05c52ef7faf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b16d2732d6d6381045b88446a322622a7b1271f08438cb60e57c05c52ef7faf8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://273f5652d37e2be568bc388a7f9e96c66d0ce9201678b46736986815bd25d6bf\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://273f5652d37e2be568bc388a7f9e96c66d0ce9201678b46736986815bd25d6bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:27Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:27 crc kubenswrapper[4732]: I0131 09:01:27.327902 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Jan 31 09:01:27 crc kubenswrapper[4732]: I0131 09:01:27.336676 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bllbs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"80d87332-eaea-4007-a03e-a9a0f744563a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcdpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bllbs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:27Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:27 crc kubenswrapper[4732]: I0131 09:01:27.346049 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjff8\" (UniqueName: \"kubernetes.io/projected/533741c8-f72a-4834-ad02-d33fc939e529-kube-api-access-gjff8\") pod \"node-resolver-nsgpk\" (UID: \"533741c8-f72a-4834-ad02-d33fc939e529\") " pod="openshift-dns/node-resolver-nsgpk" Jan 31 09:01:27 crc kubenswrapper[4732]: I0131 09:01:27.346140 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/80d87332-eaea-4007-a03e-a9a0f744563a-host\") pod \"node-ca-bllbs\" (UID: \"80d87332-eaea-4007-a03e-a9a0f744563a\") " pod="openshift-image-registry/node-ca-bllbs" Jan 31 09:01:27 crc kubenswrapper[4732]: I0131 09:01:27.346191 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/533741c8-f72a-4834-ad02-d33fc939e529-hosts-file\") pod \"node-resolver-nsgpk\" (UID: \"533741c8-f72a-4834-ad02-d33fc939e529\") " pod="openshift-dns/node-resolver-nsgpk" Jan 31 09:01:27 crc kubenswrapper[4732]: I0131 09:01:27.346223 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/80d87332-eaea-4007-a03e-a9a0f744563a-serviceca\") pod \"node-ca-bllbs\" (UID: \"80d87332-eaea-4007-a03e-a9a0f744563a\") " pod="openshift-image-registry/node-ca-bllbs" Jan 31 09:01:27 crc kubenswrapper[4732]: I0131 09:01:27.346244 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcdpz\" (UniqueName: \"kubernetes.io/projected/80d87332-eaea-4007-a03e-a9a0f744563a-kube-api-access-lcdpz\") pod \"node-ca-bllbs\" (UID: \"80d87332-eaea-4007-a03e-a9a0f744563a\") " pod="openshift-image-registry/node-ca-bllbs" Jan 31 09:01:27 crc kubenswrapper[4732]: I0131 09:01:27.346369 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: 
\"kubernetes.io/host-path/533741c8-f72a-4834-ad02-d33fc939e529-hosts-file\") pod \"node-resolver-nsgpk\" (UID: \"533741c8-f72a-4834-ad02-d33fc939e529\") " pod="openshift-dns/node-resolver-nsgpk" Jan 31 09:01:27 crc kubenswrapper[4732]: I0131 09:01:27.350151 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:27Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:27 crc kubenswrapper[4732]: I0131 09:01:27.360857 4732 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:01:27 crc kubenswrapper[4732]: I0131 09:01:27.361481 4732 scope.go:117] "RemoveContainer" containerID="c05af9bd69451273c25ddddf6b11cdd7b8f9b6e3e16d18f1c4a962321905696e" Jan 31 09:01:27 crc kubenswrapper[4732]: E0131 09:01:27.361629 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Jan 31 09:01:27 crc kubenswrapper[4732]: I0131 09:01:27.366576 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-gjff8\" (UniqueName: \"kubernetes.io/projected/533741c8-f72a-4834-ad02-d33fc939e529-kube-api-access-gjff8\") pod \"node-resolver-nsgpk\" (UID: \"533741c8-f72a-4834-ad02-d33fc939e529\") " pod="openshift-dns/node-resolver-nsgpk" Jan 31 09:01:27 crc kubenswrapper[4732]: I0131 09:01:27.366651 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:27Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:27 crc kubenswrapper[4732]: I0131 09:01:27.379041 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c063f211d6d06c3f8b94d316d68da346d077c6e85197754ec7c77ad736fd1217\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:27Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:27 crc kubenswrapper[4732]: I0131 09:01:27.390263 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nsgpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"533741c8-f72a-4834-ad02-d33fc939e529\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjff8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nsgpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:27Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:27 crc kubenswrapper[4732]: I0131 09:01:27.404176 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1dc686f-fa2d-4cd4-95d4-874df2dc3a1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b5cbd45e357205f8bfc648396c0f684d6788343f687504feb45618d9082e5cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b8ba07937855cb1fd0e1db567f90e08848fca583b7174153918e2d6356d722\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ce42a35726dd1fc6aea0eb2e5baa490de04566cd6d6298185260e896a8def7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c05af9bd69451273c25ddddf6b11cdd7b8f9b6e3e16d18f1c4a962321905696e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c05af9bd69451273c25ddddf6b11cdd7b8f9b6e3e16d18f1c4a962321905696e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0131 09:01:18.226954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:01:18.227544 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1858903392/tls.crt::/tmp/serving-cert-1858903392/tls.key\\\\\\\"\\\\nI0131 09:01:23.404020 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:01:23.413560 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:01:23.413612 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:01:23.413655 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:01:23.413760 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:01:23.423398 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 09:01:23.423434 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:01:23.423441 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:01:23.423448 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:01:23.423452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0131 09:01:23.423448 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 09:01:23.423457 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:01:23.423476 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 09:01:23.426339 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83c2048db4772c1a5f60b21340f773ab5c651045d4dac9edc15e91a19d15dce9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:27Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:27 crc kubenswrapper[4732]: I0131 09:01:27.420370 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62650e77e5971d617a77ae7461d51276fc76e3699dc4f86ebbb9b63ab0d86b60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:27Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:27 crc kubenswrapper[4732]: I0131 09:01:27.433213 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7423f7d2a82ae276cff0f5a2730b0de7abbc5b28a8bd8983d23fe44c2e9ca3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b075971e7f28f40b90caae8b3681dd283860371a6b3f82c92c421a87a27ccb0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:27Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:27 crc kubenswrapper[4732]: I0131 09:01:27.446217 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-nsgpk" Jan 31 09:01:27 crc kubenswrapper[4732]: I0131 09:01:27.447291 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcdpz\" (UniqueName: \"kubernetes.io/projected/80d87332-eaea-4007-a03e-a9a0f744563a-kube-api-access-lcdpz\") pod \"node-ca-bllbs\" (UID: \"80d87332-eaea-4007-a03e-a9a0f744563a\") " pod="openshift-image-registry/node-ca-bllbs" Jan 31 09:01:27 crc kubenswrapper[4732]: I0131 09:01:27.447338 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/80d87332-eaea-4007-a03e-a9a0f744563a-serviceca\") pod \"node-ca-bllbs\" (UID: \"80d87332-eaea-4007-a03e-a9a0f744563a\") " pod="openshift-image-registry/node-ca-bllbs" Jan 31 09:01:27 crc kubenswrapper[4732]: I0131 09:01:27.447362 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/80d87332-eaea-4007-a03e-a9a0f744563a-host\") pod \"node-ca-bllbs\" (UID: \"80d87332-eaea-4007-a03e-a9a0f744563a\") " pod="openshift-image-registry/node-ca-bllbs" Jan 31 09:01:27 crc kubenswrapper[4732]: I0131 09:01:27.447433 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/80d87332-eaea-4007-a03e-a9a0f744563a-host\") pod \"node-ca-bllbs\" (UID: \"80d87332-eaea-4007-a03e-a9a0f744563a\") " pod="openshift-image-registry/node-ca-bllbs" Jan 31 09:01:27 crc kubenswrapper[4732]: I0131 09:01:27.448512 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/80d87332-eaea-4007-a03e-a9a0f744563a-serviceca\") pod \"node-ca-bllbs\" (UID: \"80d87332-eaea-4007-a03e-a9a0f744563a\") " pod="openshift-image-registry/node-ca-bllbs" Jan 31 09:01:27 crc kubenswrapper[4732]: I0131 09:01:27.452942 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:27Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:27 crc kubenswrapper[4732]: W0131 09:01:27.462560 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod533741c8_f72a_4834_ad02_d33fc939e529.slice/crio-1523f9870b55ac1750601764c91c46fa77b171d3affc29d5f0a1d7cc19c30178 WatchSource:0}: Error finding container 1523f9870b55ac1750601764c91c46fa77b171d3affc29d5f0a1d7cc19c30178: Status 404 returned error can't find the container with id 1523f9870b55ac1750601764c91c46fa77b171d3affc29d5f0a1d7cc19c30178 Jan 31 09:01:27 crc kubenswrapper[4732]: I0131 09:01:27.470221 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcdpz\" (UniqueName: \"kubernetes.io/projected/80d87332-eaea-4007-a03e-a9a0f744563a-kube-api-access-lcdpz\") pod \"node-ca-bllbs\" (UID: \"80d87332-eaea-4007-a03e-a9a0f744563a\") " pod="openshift-image-registry/node-ca-bllbs" Jan 31 09:01:27 crc kubenswrapper[4732]: I0131 09:01:27.485626 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58b600ed-9e96-4296-bc41-dda5d2205921\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f14a98662959fe8132486d2f648fb443a407e90160ee8c6eac7e2a3d6835b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://771af18934041722ccdbde47137c83e9d93c5e7a07b1e9b26e18354bcd2fda4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2bea5befd94ee4d3b6501273e8b8365102a35cc07b97740fc893f8d70a9d134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9fd5264b5a2dcec6f8b363208d5654f2d71996
60f04be408ee7ab3913d542e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://194a43d5e1409abb031f1b950dfc753454b45008067d784642945ea0dfb087e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://382f1fa043705a0098e27ea34beeee0b88276aa355a7de07bc29b070949b81e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382f1fa043705a0098e27ea34beeee0b88276aa355a7de07bc29b070949b81e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b16d2732d6d6381045b88446a322622a7b1271f08438cb60e57c05c52ef7faf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b16d2732d6d6381045b88446a322622a7b1271f08438cb60e57c05c52ef7faf8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://273f5652d37e2be568bc388a7f9e96c66d0ce9201678b46736986815bd25d6bf\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://273f5652d37e2be568bc388a7f9e96c66d0ce9201678b46736986815bd25d6bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:27Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:27 crc kubenswrapper[4732]: I0131 09:01:27.496803 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 09:55:24.356252728 +0000 UTC Jan 31 09:01:27 crc kubenswrapper[4732]: I0131 09:01:27.542664 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:01:27 crc kubenswrapper[4732]: I0131 09:01:27.542719 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:01:27 crc kubenswrapper[4732]: I0131 09:01:27.542790 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:01:27 crc kubenswrapper[4732]: E0131 09:01:27.542887 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:01:27 crc kubenswrapper[4732]: E0131 09:01:27.542962 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:01:27 crc kubenswrapper[4732]: E0131 09:01:27.543005 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:01:27 crc kubenswrapper[4732]: I0131 09:01:27.545859 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-bllbs" Jan 31 09:01:27 crc kubenswrapper[4732]: W0131 09:01:27.565168 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod80d87332_eaea_4007_a03e_a9a0f744563a.slice/crio-2e7f32ec790882c6cb1e4412a70561de6f9e5d6d4d4434f6b497ce5cebd7b75c WatchSource:0}: Error finding container 2e7f32ec790882c6cb1e4412a70561de6f9e5d6d4d4434f6b497ce5cebd7b75c: Status 404 returned error can't find the container with id 2e7f32ec790882c6cb1e4412a70561de6f9e5d6d4d4434f6b497ce5cebd7b75c Jan 31 09:01:27 crc kubenswrapper[4732]: I0131 09:01:27.616914 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-jnbt8"] Jan 31 09:01:27 crc kubenswrapper[4732]: I0131 09:01:27.617276 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" Jan 31 09:01:27 crc kubenswrapper[4732]: I0131 09:01:27.619969 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Jan 31 09:01:27 crc kubenswrapper[4732]: I0131 09:01:27.620194 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Jan 31 09:01:27 crc kubenswrapper[4732]: I0131 09:01:27.620331 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Jan 31 09:01:27 crc kubenswrapper[4732]: I0131 09:01:27.620377 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Jan 31 09:01:27 crc kubenswrapper[4732]: I0131 09:01:27.620409 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Jan 31 09:01:27 crc kubenswrapper[4732]: I0131 09:01:27.634148 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c063f211d6d06c3f8b94d316d68da346d077c6e85197754ec7c77ad736fd1217\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:27Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:27 crc kubenswrapper[4732]: I0131 09:01:27.649004 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2bw2\" (UniqueName: \"kubernetes.io/projected/7d790207-d357-4b47-87bf-5b505e061820-kube-api-access-h2bw2\") pod \"machine-config-daemon-jnbt8\" (UID: \"7d790207-d357-4b47-87bf-5b505e061820\") " pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" Jan 31 09:01:27 crc kubenswrapper[4732]: I0131 09:01:27.649071 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7d790207-d357-4b47-87bf-5b505e061820-mcd-auth-proxy-config\") pod \"machine-config-daemon-jnbt8\" (UID: \"7d790207-d357-4b47-87bf-5b505e061820\") " pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" Jan 31 09:01:27 crc kubenswrapper[4732]: I0131 09:01:27.649152 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7d790207-d357-4b47-87bf-5b505e061820-proxy-tls\") pod \"machine-config-daemon-jnbt8\" (UID: \"7d790207-d357-4b47-87bf-5b505e061820\") " pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" Jan 31 09:01:27 crc kubenswrapper[4732]: I0131 09:01:27.649343 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"rootfs\" (UniqueName: \"kubernetes.io/host-path/7d790207-d357-4b47-87bf-5b505e061820-rootfs\") pod \"machine-config-daemon-jnbt8\" (UID: \"7d790207-d357-4b47-87bf-5b505e061820\") " pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" Jan 31 09:01:27 crc kubenswrapper[4732]: I0131 09:01:27.652647 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d790207-d357-4b47-87bf-5b505e061820\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2bw2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2bw2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jnbt8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:27Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:27 crc kubenswrapper[4732]: I0131 09:01:27.670480 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7423f7d2a82ae276cff0f5a2730b0de7abbc5b28a8bd8983d23fe44c2e9ca3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b075971e7f28f40b90caae8b3681dd283860371a6b3f82c92c421a87a27ccb0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:27Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:27 crc kubenswrapper[4732]: I0131 09:01:27.684725 
4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1add5e75-7e0e-4248-9431-256dc477beeb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7bd726ccae341dfff351d4170276b74d4b6b4f34806b7e1fb23ab517201928d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd0e75b86e540472dde92c0b2c29d3af05e052a4e6a01fd03cd16cce663bd5da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6678cf179a4e13991aa6a082456b3497a4665e0999912b8002fb84a0b6ca2ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":
\\\"cri-o://55d09d44253b8c77894abc55d80843b79509bf7213c0bf8b4d69059d53e76e5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:27Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:27 crc kubenswrapper[4732]: I0131 09:01:27.703266 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58b600ed-9e96-4296-bc41-dda5d2205921\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f14a98662959fe8132486d2f648fb443a407e90160ee8c6eac7e2a3d6835b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://771af18934041722ccdbde47137c83e9d93c5e7a07b1e9b26e18354bcd2fda4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2bea5befd94ee4d3b6501273e8b8365102a35cc07b97740fc893f8d70a9d134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9fd5264b5a2dcec6f8b363208d5654f2d7199660f04be408ee7ab3913d542e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://194a43d5e1409abb031f1b950dfc753454b45008067d784642945ea0dfb087e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://382f1fa043705a0098e27ea34beeee0b88276aa355a7de07
bc29b070949b81e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382f1fa043705a0098e27ea34beeee0b88276aa355a7de07bc29b070949b81e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b16d2732d6d6381045b88446a322622a7b1271f08438cb60e57c05c52ef7faf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b16d2732d6d6381045b88446a322622a7b1271f08438cb60e57c05c52ef7faf8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://273f5652d37e2be568bc388a7f9e96c66d0ce9201678b46736986815bd25d6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://273f5652d37e2be568bc388a7f9e96c66d0ce9201678b46736986815bd25d6bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:27Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:27 crc kubenswrapper[4732]: I0131 09:01:27.706896 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-nsgpk" event={"ID":"533741c8-f72a-4834-ad02-d33fc939e529","Type":"ContainerStarted","Data":"1523f9870b55ac1750601764c91c46fa77b171d3affc29d5f0a1d7cc19c30178"} Jan 31 09:01:27 crc 
kubenswrapper[4732]: I0131 09:01:27.708036 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-bllbs" event={"ID":"80d87332-eaea-4007-a03e-a9a0f744563a","Type":"ContainerStarted","Data":"2e7f32ec790882c6cb1e4412a70561de6f9e5d6d4d4434f6b497ce5cebd7b75c"} Jan 31 09:01:27 crc kubenswrapper[4732]: I0131 09:01:27.718038 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62650e77e5971d617a77ae7461d51276fc76e3699dc4f86ebbb9b63ab0d86b60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:27Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:27 crc kubenswrapper[4732]: I0131 09:01:27.729792 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:27Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:27 crc kubenswrapper[4732]: I0131 09:01:27.747860 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:27Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:27 crc kubenswrapper[4732]: I0131 09:01:27.750481 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/7d790207-d357-4b47-87bf-5b505e061820-rootfs\") pod \"machine-config-daemon-jnbt8\" (UID: \"7d790207-d357-4b47-87bf-5b505e061820\") " pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" Jan 31 09:01:27 crc kubenswrapper[4732]: I0131 09:01:27.750523 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7d790207-d357-4b47-87bf-5b505e061820-mcd-auth-proxy-config\") pod \"machine-config-daemon-jnbt8\" (UID: \"7d790207-d357-4b47-87bf-5b505e061820\") " pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" Jan 31 09:01:27 crc kubenswrapper[4732]: I0131 09:01:27.750538 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2bw2\" (UniqueName: \"kubernetes.io/projected/7d790207-d357-4b47-87bf-5b505e061820-kube-api-access-h2bw2\") pod \"machine-config-daemon-jnbt8\" (UID: \"7d790207-d357-4b47-87bf-5b505e061820\") " pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" Jan 31 09:01:27 crc kubenswrapper[4732]: I0131 09:01:27.750579 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7d790207-d357-4b47-87bf-5b505e061820-proxy-tls\") pod \"machine-config-daemon-jnbt8\" (UID: \"7d790207-d357-4b47-87bf-5b505e061820\") " pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" Jan 31 09:01:27 crc kubenswrapper[4732]: I0131 09:01:27.751177 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/7d790207-d357-4b47-87bf-5b505e061820-rootfs\") pod \"machine-config-daemon-jnbt8\" (UID: \"7d790207-d357-4b47-87bf-5b505e061820\") " pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" Jan 31 09:01:27 crc kubenswrapper[4732]: I0131 09:01:27.751690 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7d790207-d357-4b47-87bf-5b505e061820-mcd-auth-proxy-config\") pod \"machine-config-daemon-jnbt8\" (UID: \"7d790207-d357-4b47-87bf-5b505e061820\") " pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" Jan 31 09:01:27 crc kubenswrapper[4732]: I0131 09:01:27.753420 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/7d790207-d357-4b47-87bf-5b505e061820-proxy-tls\") pod \"machine-config-daemon-jnbt8\" (UID: \"7d790207-d357-4b47-87bf-5b505e061820\") " pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" Jan 31 09:01:27 crc kubenswrapper[4732]: I0131 09:01:27.770789 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nsgpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"533741c8-f72a-4834-ad02-d33fc939e529\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjff8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nsgpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:27Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:27 crc kubenswrapper[4732]: I0131 09:01:27.780322 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bllbs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80d87332-eaea-4007-a03e-a9a0f744563a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcdpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bllbs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:27Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:27 crc kubenswrapper[4732]: I0131 09:01:27.795215 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2bw2\" (UniqueName: \"kubernetes.io/projected/7d790207-d357-4b47-87bf-5b505e061820-kube-api-access-h2bw2\") pod \"machine-config-daemon-jnbt8\" (UID: \"7d790207-d357-4b47-87bf-5b505e061820\") " pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" Jan 31 09:01:27 crc kubenswrapper[4732]: I0131 09:01:27.798634 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1dc686f-fa2d-4cd4-95d4-874df2dc3a1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b5cbd45e357205f8bfc648396c0f684d6788343f687504feb45618d9082e5cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b8ba07937855cb1fd0e1db567f90e08848fca583b7174153918e2d6356d722\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ce42a35726dd1fc6aea0eb2e5baa490de04566cd6d6298185260e896a8def7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c05af9bd69451273c25ddddf6b11cdd7b8f9b6e3e16d18f1c4a962321905696e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c05af9bd69451273c25ddddf6b11cdd7b8f9b6e3e16d18f1c4a962321905696e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0131 09:01:18.226954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:01:18.227544 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1858903392/tls.crt::/tmp/serving-cert-1858903392/tls.key\\\\\\\"\\\\nI0131 09:01:23.404020 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:01:23.413560 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:01:23.413612 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:01:23.413655 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:01:23.413760 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:01:23.423398 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 09:01:23.423434 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:01:23.423441 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:01:23.423448 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:01:23.423452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0131 09:01:23.423448 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 09:01:23.423457 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:01:23.423476 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 09:01:23.426339 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83c2048db4772c1a5f60b21340f773ab5c651045d4dac9edc15e91a19d15dce9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:27Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:27 crc kubenswrapper[4732]: I0131 09:01:27.811895 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:27Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:27 crc kubenswrapper[4732]: I0131 09:01:27.931981 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" Jan 31 09:01:27 crc kubenswrapper[4732]: W0131 09:01:27.941215 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d790207_d357_4b47_87bf_5b505e061820.slice/crio-a0c0153293b92cb9e29eb17f363783d949abef5081ed954f8b22b5031597406c WatchSource:0}: Error finding container a0c0153293b92cb9e29eb17f363783d949abef5081ed954f8b22b5031597406c: Status 404 returned error can't find the container with id a0c0153293b92cb9e29eb17f363783d949abef5081ed954f8b22b5031597406c Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.024334 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-4mxsr"] Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.024674 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-t9kqf"] Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.024955 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-4mxsr" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.025793 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-t9kqf" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.026900 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.027336 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.027363 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.027527 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.030006 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.030336 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.030393 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.047926 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1dc686f-fa2d-4cd4-95d4-874df2dc3a1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b5cbd45e357205f8bfc648396c0f684d6788343f687504feb45618d9082e5cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b8ba07937855cb1fd0e1db567f90e08848fca583b7174153918e2d6356d722\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ce42a35726dd1fc6aea0eb2e5baa490de04566cd6d6298185260e896a8def7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c05af9bd69451273c25ddddf6b11cdd7b8f9b6e3e16d18f1c4a962321905696e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c05af9bd69451273c25ddddf6b11cdd7b8f9b6e3e16d18f1c4a962321905696e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0131 09:01:18.226954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:01:18.227544 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1858903392/tls.crt::/tmp/serving-cert-1858903392/tls.key\\\\\\\"\\\\nI0131 09:01:23.404020 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:01:23.413560 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:01:23.413612 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:01:23.413655 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:01:23.413760 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:01:23.423398 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 09:01:23.423434 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:01:23.423441 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:01:23.423448 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:01:23.423452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0131 09:01:23.423448 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 09:01:23.423457 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:01:23.423476 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 09:01:23.426339 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83c2048db4772c1a5f60b21340f773ab5c651045d4dac9edc15e91a19d15dce9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:28Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.053478 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8e23192f-14db-41ef-af89-4a76e325d9c1-host-var-lib-kubelet\") pod \"multus-4mxsr\" (UID: \"8e23192f-14db-41ef-af89-4a76e325d9c1\") " pod="openshift-multus/multus-4mxsr" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.053523 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8e23192f-14db-41ef-af89-4a76e325d9c1-multus-conf-dir\") pod \"multus-4mxsr\" (UID: \"8e23192f-14db-41ef-af89-4a76e325d9c1\") " pod="openshift-multus/multus-4mxsr" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.053545 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/8e23192f-14db-41ef-af89-4a76e325d9c1-host-run-multus-certs\") pod \"multus-4mxsr\" (UID: \"8e23192f-14db-41ef-af89-4a76e325d9c1\") " pod="openshift-multus/multus-4mxsr" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.053566 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e2e6e0f4-2302-447f-a5e0-7db3d7b73cb6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-t9kqf\" (UID: \"e2e6e0f4-2302-447f-a5e0-7db3d7b73cb6\") " pod="openshift-multus/multus-additional-cni-plugins-t9kqf" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.053596 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/8e23192f-14db-41ef-af89-4a76e325d9c1-multus-socket-dir-parent\") pod \"multus-4mxsr\" (UID: \"8e23192f-14db-41ef-af89-4a76e325d9c1\") " pod="openshift-multus/multus-4mxsr" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.053617 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8e23192f-14db-41ef-af89-4a76e325d9c1-etc-kubernetes\") pod \"multus-4mxsr\" (UID: \"8e23192f-14db-41ef-af89-4a76e325d9c1\") " pod="openshift-multus/multus-4mxsr" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.053637 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e2e6e0f4-2302-447f-a5e0-7db3d7b73cb6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-t9kqf\" (UID: \"e2e6e0f4-2302-447f-a5e0-7db3d7b73cb6\") " pod="openshift-multus/multus-additional-cni-plugins-t9kqf" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.053661 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8e23192f-14db-41ef-af89-4a76e325d9c1-host-run-netns\") pod \"multus-4mxsr\" (UID: \"8e23192f-14db-41ef-af89-4a76e325d9c1\") " pod="openshift-multus/multus-4mxsr" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.053827 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8e23192f-14db-41ef-af89-4a76e325d9c1-host-var-lib-cni-bin\") pod \"multus-4mxsr\" (UID: \"8e23192f-14db-41ef-af89-4a76e325d9c1\") " pod="openshift-multus/multus-4mxsr" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.053877 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e2e6e0f4-2302-447f-a5e0-7db3d7b73cb6-os-release\") pod \"multus-additional-cni-plugins-t9kqf\" (UID: \"e2e6e0f4-2302-447f-a5e0-7db3d7b73cb6\") " pod="openshift-multus/multus-additional-cni-plugins-t9kqf" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.053943 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8e23192f-14db-41ef-af89-4a76e325d9c1-system-cni-dir\") pod \"multus-4mxsr\" (UID: \"8e23192f-14db-41ef-af89-4a76e325d9c1\") " pod="openshift-multus/multus-4mxsr" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.054165 4732 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8e23192f-14db-41ef-af89-4a76e325d9c1-multus-cni-dir\") pod \"multus-4mxsr\" (UID: \"8e23192f-14db-41ef-af89-4a76e325d9c1\") " pod="openshift-multus/multus-4mxsr" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.054187 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8e23192f-14db-41ef-af89-4a76e325d9c1-cnibin\") pod \"multus-4mxsr\" (UID: \"8e23192f-14db-41ef-af89-4a76e325d9c1\") " pod="openshift-multus/multus-4mxsr" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.054220 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8e23192f-14db-41ef-af89-4a76e325d9c1-os-release\") pod \"multus-4mxsr\" (UID: \"8e23192f-14db-41ef-af89-4a76e325d9c1\") " pod="openshift-multus/multus-4mxsr" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.054260 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/8e23192f-14db-41ef-af89-4a76e325d9c1-host-var-lib-cni-multus\") pod \"multus-4mxsr\" (UID: \"8e23192f-14db-41ef-af89-4a76e325d9c1\") " pod="openshift-multus/multus-4mxsr" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.054288 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/8e23192f-14db-41ef-af89-4a76e325d9c1-hostroot\") pod \"multus-4mxsr\" (UID: \"8e23192f-14db-41ef-af89-4a76e325d9c1\") " pod="openshift-multus/multus-4mxsr" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.054317 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhxt6\" (UniqueName: \"kubernetes.io/projected/e2e6e0f4-2302-447f-a5e0-7db3d7b73cb6-kube-api-access-jhxt6\") pod \"multus-additional-cni-plugins-t9kqf\" (UID: \"e2e6e0f4-2302-447f-a5e0-7db3d7b73cb6\") " pod="openshift-multus/multus-additional-cni-plugins-t9kqf" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.054360 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/8e23192f-14db-41ef-af89-4a76e325d9c1-multus-daemon-config\") pod \"multus-4mxsr\" (UID: \"8e23192f-14db-41ef-af89-4a76e325d9c1\") " pod="openshift-multus/multus-4mxsr" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.054390 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwsnx\" (UniqueName: \"kubernetes.io/projected/8e23192f-14db-41ef-af89-4a76e325d9c1-kube-api-access-fwsnx\") pod \"multus-4mxsr\" (UID: \"8e23192f-14db-41ef-af89-4a76e325d9c1\") " pod="openshift-multus/multus-4mxsr" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.054415 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e2e6e0f4-2302-447f-a5e0-7db3d7b73cb6-system-cni-dir\") pod \"multus-additional-cni-plugins-t9kqf\" (UID: \"e2e6e0f4-2302-447f-a5e0-7db3d7b73cb6\") " pod="openshift-multus/multus-additional-cni-plugins-t9kqf" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.054439 4732 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e2e6e0f4-2302-447f-a5e0-7db3d7b73cb6-cnibin\") pod \"multus-additional-cni-plugins-t9kqf\" (UID: \"e2e6e0f4-2302-447f-a5e0-7db3d7b73cb6\") " pod="openshift-multus/multus-additional-cni-plugins-t9kqf" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.054566 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e2e6e0f4-2302-447f-a5e0-7db3d7b73cb6-cni-binary-copy\") pod \"multus-additional-cni-plugins-t9kqf\" (UID: \"e2e6e0f4-2302-447f-a5e0-7db3d7b73cb6\") " pod="openshift-multus/multus-additional-cni-plugins-t9kqf" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.054718 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8e23192f-14db-41ef-af89-4a76e325d9c1-cni-binary-copy\") pod \"multus-4mxsr\" (UID: \"8e23192f-14db-41ef-af89-4a76e325d9c1\") " pod="openshift-multus/multus-4mxsr" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.054765 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/8e23192f-14db-41ef-af89-4a76e325d9c1-host-run-k8s-cni-cncf-io\") pod \"multus-4mxsr\" (UID: \"8e23192f-14db-41ef-af89-4a76e325d9c1\") " pod="openshift-multus/multus-4mxsr" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.063369 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:28Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.076378 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4mxsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e23192f-14db-41ef-af89-4a76e325d9c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsnx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4mxsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:28Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.089961 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c063f211d6d06c3f8b94d316d68da346d077c6e85197754ec7c77ad736fd1217\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:28Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.102693 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d790207-d357-4b47-87bf-5b505e061820\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2bw2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2bw2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jnbt8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:28Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.104846 4732 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-01-31 08:56:27 +0000 UTC, rotation deadline is 2026-11-25 18:04:42.823011329 +0000 UTC Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.104911 4732 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 7161h3m14.718103766s for next certificate rotation Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.120251 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1add5e75-7e0e-4248-9431-256dc477beeb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7bd726ccae341dfff351d4170276b74d4b6b4f34806b7e1fb23ab517201928d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd0e75b86e540472dde92c0b2c29d3af05e052a4e6a01fd03cd16cce663bd5da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6678cf179a4e13991aa6a082456b3497a4665e0999912b8002fb84a0b6ca2ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55d09d44253b8c77894abc55d80843b79509bf7213c0bf8b4d69059d53e76e5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:28Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.145738 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58b600ed-9e96-4296-bc41-dda5d2205921\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f14a98662959fe8132486d2f648fb443a407e90160ee8c6eac7e2a3d6835b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://771af18934041722ccdbde47137c83e9d93c5e7a07b1e9b26e18354bcd2fda4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07
b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2bea5befd94ee4d3b6501273e8b8365102a35cc07b97740fc893f8d70a9d134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9fd5264b5a2dcec6f8b363208d5654f2d7199660f04be408ee7ab3913d542e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://194a43d5e1409abb031f1b950dfc753454b45008067d784642945ea0dfb087e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://382f1fa043705a0098e27ea34beeee0b88276aa355a7de07bc29b070949b81e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\
\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382f1fa043705a0098e27ea34beeee0b88276aa355a7de07bc29b070949b81e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b16d2732d6d6381045b88446a322622a7b1271f08438cb60e57c05c52ef7faf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b16d2732d6d6381045b88446a322622a7b1271f08438cb60e57c05c52ef7faf8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://273f5652d37e2be568bc388a7f9e96c66d0ce9201678b46736986815bd25d6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://273f5652d37e2be568bc388a7f9e96c66d0ce9201678b46736986815bd25d6bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:28Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.156490 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8e23192f-14db-41ef-af89-4a76e325d9c1-multus-cni-dir\") pod \"multus-4mxsr\" (UID: \"8e23192f-14db-41ef-af89-4a76e325d9c1\") " pod="openshift-multus/multus-4mxsr" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.156543 4732 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8e23192f-14db-41ef-af89-4a76e325d9c1-cnibin\") pod \"multus-4mxsr\" (UID: \"8e23192f-14db-41ef-af89-4a76e325d9c1\") " pod="openshift-multus/multus-4mxsr" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.156567 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8e23192f-14db-41ef-af89-4a76e325d9c1-os-release\") pod \"multus-4mxsr\" (UID: \"8e23192f-14db-41ef-af89-4a76e325d9c1\") " pod="openshift-multus/multus-4mxsr" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.156591 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/8e23192f-14db-41ef-af89-4a76e325d9c1-host-var-lib-cni-multus\") pod \"multus-4mxsr\" (UID: \"8e23192f-14db-41ef-af89-4a76e325d9c1\") " pod="openshift-multus/multus-4mxsr" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.156614 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/8e23192f-14db-41ef-af89-4a76e325d9c1-hostroot\") pod \"multus-4mxsr\" (UID: \"8e23192f-14db-41ef-af89-4a76e325d9c1\") " pod="openshift-multus/multus-4mxsr" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.156636 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhxt6\" (UniqueName: \"kubernetes.io/projected/e2e6e0f4-2302-447f-a5e0-7db3d7b73cb6-kube-api-access-jhxt6\") pod \"multus-additional-cni-plugins-t9kqf\" (UID: \"e2e6e0f4-2302-447f-a5e0-7db3d7b73cb6\") " pod="openshift-multus/multus-additional-cni-plugins-t9kqf" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.156677 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/8e23192f-14db-41ef-af89-4a76e325d9c1-multus-daemon-config\") pod \"multus-4mxsr\" (UID: \"8e23192f-14db-41ef-af89-4a76e325d9c1\") " pod="openshift-multus/multus-4mxsr" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.156718 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwsnx\" (UniqueName: \"kubernetes.io/projected/8e23192f-14db-41ef-af89-4a76e325d9c1-kube-api-access-fwsnx\") pod \"multus-4mxsr\" (UID: \"8e23192f-14db-41ef-af89-4a76e325d9c1\") " pod="openshift-multus/multus-4mxsr" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.156745 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e2e6e0f4-2302-447f-a5e0-7db3d7b73cb6-system-cni-dir\") pod \"multus-additional-cni-plugins-t9kqf\" (UID: \"e2e6e0f4-2302-447f-a5e0-7db3d7b73cb6\") " pod="openshift-multus/multus-additional-cni-plugins-t9kqf" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.156779 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e2e6e0f4-2302-447f-a5e0-7db3d7b73cb6-cnibin\") pod \"multus-additional-cni-plugins-t9kqf\" (UID: \"e2e6e0f4-2302-447f-a5e0-7db3d7b73cb6\") " pod="openshift-multus/multus-additional-cni-plugins-t9kqf" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.156802 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e2e6e0f4-2302-447f-a5e0-7db3d7b73cb6-cni-binary-copy\") pod 
\"multus-additional-cni-plugins-t9kqf\" (UID: \"e2e6e0f4-2302-447f-a5e0-7db3d7b73cb6\") " pod="openshift-multus/multus-additional-cni-plugins-t9kqf" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.156824 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8e23192f-14db-41ef-af89-4a76e325d9c1-cni-binary-copy\") pod \"multus-4mxsr\" (UID: \"8e23192f-14db-41ef-af89-4a76e325d9c1\") " pod="openshift-multus/multus-4mxsr" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.157029 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/8e23192f-14db-41ef-af89-4a76e325d9c1-host-run-k8s-cni-cncf-io\") pod \"multus-4mxsr\" (UID: \"8e23192f-14db-41ef-af89-4a76e325d9c1\") " pod="openshift-multus/multus-4mxsr" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.157065 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8e23192f-14db-41ef-af89-4a76e325d9c1-host-var-lib-kubelet\") pod \"multus-4mxsr\" (UID: \"8e23192f-14db-41ef-af89-4a76e325d9c1\") " pod="openshift-multus/multus-4mxsr" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.157087 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8e23192f-14db-41ef-af89-4a76e325d9c1-multus-conf-dir\") pod \"multus-4mxsr\" (UID: \"8e23192f-14db-41ef-af89-4a76e325d9c1\") " pod="openshift-multus/multus-4mxsr" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.157108 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/8e23192f-14db-41ef-af89-4a76e325d9c1-host-run-multus-certs\") pod \"multus-4mxsr\" (UID: \"8e23192f-14db-41ef-af89-4a76e325d9c1\") " pod="openshift-multus/multus-4mxsr" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.157131 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e2e6e0f4-2302-447f-a5e0-7db3d7b73cb6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-t9kqf\" (UID: \"e2e6e0f4-2302-447f-a5e0-7db3d7b73cb6\") " pod="openshift-multus/multus-additional-cni-plugins-t9kqf" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.157160 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/8e23192f-14db-41ef-af89-4a76e325d9c1-multus-socket-dir-parent\") pod \"multus-4mxsr\" (UID: \"8e23192f-14db-41ef-af89-4a76e325d9c1\") " pod="openshift-multus/multus-4mxsr" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.157180 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8e23192f-14db-41ef-af89-4a76e325d9c1-etc-kubernetes\") pod \"multus-4mxsr\" (UID: \"8e23192f-14db-41ef-af89-4a76e325d9c1\") " pod="openshift-multus/multus-4mxsr" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.157202 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8e23192f-14db-41ef-af89-4a76e325d9c1-host-run-netns\") pod \"multus-4mxsr\" (UID: \"8e23192f-14db-41ef-af89-4a76e325d9c1\") " pod="openshift-multus/multus-4mxsr" 
Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.157222 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8e23192f-14db-41ef-af89-4a76e325d9c1-host-var-lib-cni-bin\") pod \"multus-4mxsr\" (UID: \"8e23192f-14db-41ef-af89-4a76e325d9c1\") " pod="openshift-multus/multus-4mxsr" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.157242 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e2e6e0f4-2302-447f-a5e0-7db3d7b73cb6-os-release\") pod \"multus-additional-cni-plugins-t9kqf\" (UID: \"e2e6e0f4-2302-447f-a5e0-7db3d7b73cb6\") " pod="openshift-multus/multus-additional-cni-plugins-t9kqf" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.157264 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e2e6e0f4-2302-447f-a5e0-7db3d7b73cb6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-t9kqf\" (UID: \"e2e6e0f4-2302-447f-a5e0-7db3d7b73cb6\") " pod="openshift-multus/multus-additional-cni-plugins-t9kqf" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.157293 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8e23192f-14db-41ef-af89-4a76e325d9c1-system-cni-dir\") pod \"multus-4mxsr\" (UID: \"8e23192f-14db-41ef-af89-4a76e325d9c1\") " pod="openshift-multus/multus-4mxsr" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.157487 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8e23192f-14db-41ef-af89-4a76e325d9c1-system-cni-dir\") pod \"multus-4mxsr\" (UID: \"8e23192f-14db-41ef-af89-4a76e325d9c1\") " pod="openshift-multus/multus-4mxsr" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.157781 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8e23192f-14db-41ef-af89-4a76e325d9c1-multus-cni-dir\") pod \"multus-4mxsr\" (UID: \"8e23192f-14db-41ef-af89-4a76e325d9c1\") " pod="openshift-multus/multus-4mxsr" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.157834 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/8e23192f-14db-41ef-af89-4a76e325d9c1-host-run-k8s-cni-cncf-io\") pod \"multus-4mxsr\" (UID: \"8e23192f-14db-41ef-af89-4a76e325d9c1\") " pod="openshift-multus/multus-4mxsr" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.157874 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8e23192f-14db-41ef-af89-4a76e325d9c1-cnibin\") pod \"multus-4mxsr\" (UID: \"8e23192f-14db-41ef-af89-4a76e325d9c1\") " pod="openshift-multus/multus-4mxsr" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.157918 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/8e23192f-14db-41ef-af89-4a76e325d9c1-host-var-lib-cni-multus\") pod \"multus-4mxsr\" (UID: \"8e23192f-14db-41ef-af89-4a76e325d9c1\") " pod="openshift-multus/multus-4mxsr" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.157963 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/8e23192f-14db-41ef-af89-4a76e325d9c1-etc-kubernetes\") pod \"multus-4mxsr\" (UID: \"8e23192f-14db-41ef-af89-4a76e325d9c1\") " pod="openshift-multus/multus-4mxsr" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.158001 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8e23192f-14db-41ef-af89-4a76e325d9c1-host-run-netns\") pod \"multus-4mxsr\" (UID: \"8e23192f-14db-41ef-af89-4a76e325d9c1\") " pod="openshift-multus/multus-4mxsr" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.157995 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/8e23192f-14db-41ef-af89-4a76e325d9c1-multus-socket-dir-parent\") pod \"multus-4mxsr\" (UID: \"8e23192f-14db-41ef-af89-4a76e325d9c1\") " pod="openshift-multus/multus-4mxsr" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.158032 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8e23192f-14db-41ef-af89-4a76e325d9c1-host-var-lib-cni-bin\") pod \"multus-4mxsr\" (UID: \"8e23192f-14db-41ef-af89-4a76e325d9c1\") " pod="openshift-multus/multus-4mxsr" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.158105 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8e23192f-14db-41ef-af89-4a76e325d9c1-host-var-lib-kubelet\") pod \"multus-4mxsr\" (UID: \"8e23192f-14db-41ef-af89-4a76e325d9c1\") " pod="openshift-multus/multus-4mxsr" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.158152 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8e23192f-14db-41ef-af89-4a76e325d9c1-multus-conf-dir\") pod \"multus-4mxsr\" (UID: \"8e23192f-14db-41ef-af89-4a76e325d9c1\") " pod="openshift-multus/multus-4mxsr" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.158194 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/8e23192f-14db-41ef-af89-4a76e325d9c1-host-run-multus-certs\") pod \"multus-4mxsr\" (UID: \"8e23192f-14db-41ef-af89-4a76e325d9c1\") " pod="openshift-multus/multus-4mxsr" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.158214 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8e23192f-14db-41ef-af89-4a76e325d9c1-os-release\") pod \"multus-4mxsr\" (UID: \"8e23192f-14db-41ef-af89-4a76e325d9c1\") " pod="openshift-multus/multus-4mxsr" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.158264 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/8e23192f-14db-41ef-af89-4a76e325d9c1-hostroot\") pod \"multus-4mxsr\" (UID: \"8e23192f-14db-41ef-af89-4a76e325d9c1\") " pod="openshift-multus/multus-4mxsr" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.158270 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e2e6e0f4-2302-447f-a5e0-7db3d7b73cb6-os-release\") pod \"multus-additional-cni-plugins-t9kqf\" (UID: \"e2e6e0f4-2302-447f-a5e0-7db3d7b73cb6\") " pod="openshift-multus/multus-additional-cni-plugins-t9kqf" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.158341 4732 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e2e6e0f4-2302-447f-a5e0-7db3d7b73cb6-cnibin\") pod \"multus-additional-cni-plugins-t9kqf\" (UID: \"e2e6e0f4-2302-447f-a5e0-7db3d7b73cb6\") " pod="openshift-multus/multus-additional-cni-plugins-t9kqf" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.158371 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e2e6e0f4-2302-447f-a5e0-7db3d7b73cb6-system-cni-dir\") pod \"multus-additional-cni-plugins-t9kqf\" (UID: \"e2e6e0f4-2302-447f-a5e0-7db3d7b73cb6\") " pod="openshift-multus/multus-additional-cni-plugins-t9kqf" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.158854 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e2e6e0f4-2302-447f-a5e0-7db3d7b73cb6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-t9kqf\" (UID: \"e2e6e0f4-2302-447f-a5e0-7db3d7b73cb6\") " pod="openshift-multus/multus-additional-cni-plugins-t9kqf" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.158854 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/8e23192f-14db-41ef-af89-4a76e325d9c1-multus-daemon-config\") pod \"multus-4mxsr\" (UID: \"8e23192f-14db-41ef-af89-4a76e325d9c1\") " pod="openshift-multus/multus-4mxsr" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.159137 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e2e6e0f4-2302-447f-a5e0-7db3d7b73cb6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-t9kqf\" (UID: \"e2e6e0f4-2302-447f-a5e0-7db3d7b73cb6\") " pod="openshift-multus/multus-additional-cni-plugins-t9kqf" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.159160 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e2e6e0f4-2302-447f-a5e0-7db3d7b73cb6-cni-binary-copy\") pod \"multus-additional-cni-plugins-t9kqf\" (UID: \"e2e6e0f4-2302-447f-a5e0-7db3d7b73cb6\") " pod="openshift-multus/multus-additional-cni-plugins-t9kqf" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.159397 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8e23192f-14db-41ef-af89-4a76e325d9c1-cni-binary-copy\") pod \"multus-4mxsr\" (UID: \"8e23192f-14db-41ef-af89-4a76e325d9c1\") " pod="openshift-multus/multus-4mxsr" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.163413 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62650e77e5971d617a77ae7461d51276fc76e3699dc4f86ebbb9b63ab0d86b60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:28Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.178464 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwsnx\" (UniqueName: \"kubernetes.io/projected/8e23192f-14db-41ef-af89-4a76e325d9c1-kube-api-access-fwsnx\") pod \"multus-4mxsr\" (UID: \"8e23192f-14db-41ef-af89-4a76e325d9c1\") " pod="openshift-multus/multus-4mxsr" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.179729 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhxt6\" (UniqueName: \"kubernetes.io/projected/e2e6e0f4-2302-447f-a5e0-7db3d7b73cb6-kube-api-access-jhxt6\") pod \"multus-additional-cni-plugins-t9kqf\" (UID: \"e2e6e0f4-2302-447f-a5e0-7db3d7b73cb6\") " pod="openshift-multus/multus-additional-cni-plugins-t9kqf" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.180158 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7423f7d2a82ae276cff0f5a2730b0de7abbc5b28a8bd8983d23fe44c2e9ca3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b075971e7f28f40b90caae8b3681dd283860371a6b3f82c92c421a87a27ccb0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:28Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.194034 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:28Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.206568 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:28Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.220620 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nsgpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"533741c8-f72a-4834-ad02-d33fc939e529\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjff8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nsgpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:28Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.233945 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bllbs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"80d87332-eaea-4007-a03e-a9a0f744563a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcdpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bllbs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:28Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.252726 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t9kqf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2e6e0f4-2302-447f-a5e0-7db3d7b73cb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t9kqf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:28Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.269143 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1dc686f-fa2d-4cd4-95d4-874df2dc3a1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b5cbd45e357205f8bfc648396c0f684d6788343f687504feb45618d9082e5cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b8ba07937855cb1fd0e1db567f90e08848fca583b7174153918e2d6356d722\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ce42a35726dd1fc6aea0eb2e5baa490de04566cd6d6298185260e896a8def7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c05af9bd69451273c25ddddf6b11cdd7b8f9b6e3e16d18f1c4a962321905696e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c05af9bd69451273c25ddddf6b11cdd7b8f9b6e3e16d18f1c4a962321905696e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0131 09:01:18.226954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:01:18.227544 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1858903392/tls.crt::/tmp/serving-cert-1858903392/tls.key\\\\\\\"\\\\nI0131 09:01:23.404020 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:01:23.413560 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:01:23.413612 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:01:23.413655 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:01:23.413760 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:01:23.423398 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 09:01:23.423434 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:01:23.423441 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:01:23.423448 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:01:23.423452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0131 09:01:23.423448 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 09:01:23.423457 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:01:23.423476 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 09:01:23.426339 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83c2048db4772c1a5f60b21340f773ab5c651045d4dac9edc15e91a19d15dce9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:28Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.282550 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:28Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.300839 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4mxsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e23192f-14db-41ef-af89-4a76e325d9c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsnx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4mxsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:28Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.315443 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c063f211d6d06c3f8b94d316d68da346d077c6e85197754ec7c77ad736fd1217\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:28Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.331935 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d790207-d357-4b47-87bf-5b505e061820\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2bw2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2bw2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jnbt8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:28Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.338083 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-4mxsr" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.344026 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-t9kqf" Jan 31 09:01:28 crc kubenswrapper[4732]: W0131 09:01:28.355168 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e23192f_14db_41ef_af89_4a76e325d9c1.slice/crio-3bd339dc0405868e024e6906e67c986248a9fad29768487cedd694d3debc9b7b WatchSource:0}: Error finding container 3bd339dc0405868e024e6906e67c986248a9fad29768487cedd694d3debc9b7b: Status 404 returned error can't find the container with id 3bd339dc0405868e024e6906e67c986248a9fad29768487cedd694d3debc9b7b Jan 31 09:01:28 crc kubenswrapper[4732]: W0131 09:01:28.365302 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2e6e0f4_2302_447f_a5e0_7db3d7b73cb6.slice/crio-4fd747eef746ab1ce009e8b557e70bc13c1a62dcdbe707f672f388417fa7df31 WatchSource:0}: Error finding container 4fd747eef746ab1ce009e8b557e70bc13c1a62dcdbe707f672f388417fa7df31: Status 404 returned error can't find the container with id 4fd747eef746ab1ce009e8b557e70bc13c1a62dcdbe707f672f388417fa7df31 Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.385816 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58b600ed-9e96-4296-bc41-dda5d2205921\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f14a98662959fe8132486d2f648fb443a407e90160ee8c6eac7e2a3d6835b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://771af18934041722ccdbde47137c83e9d93c5e7a07b1e9b26e18354bcd2fda4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4
a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2bea5befd94ee4d3b6501273e8b8365102a35cc07b97740fc893f8d70a9d134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9fd5264b5a2dcec6f8b363208d5654f2d7199660f04be408ee7ab3913d542e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://194a43d5e1409abb031f1b950dfc753454b45008067d784642945ea0dfb087e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://382f1fa043705a0098e27ea34beeee0b88276aa355a7de07bc29b070949b81e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382f1fa043705a0098e27ea34beeee0b88276aa355a7de07bc29b070949b81e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b16d2732d6d6381045b88446a322622a7b1271f08438cb60e57c05c52ef7faf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b16d2732d6d6381045b88446a322622a7b1271f08438cb60e57c05c52ef7faf8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://273f5652d37e2be568bc388a7f9e96c66d0ce9201678b46736986815bd25d6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://273f5652d37e2be568bc388a7f9e96c66d0ce9201678b46736986815bd25d6bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:28Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.402329 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-8mtkt"] Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.403270 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" Jan 31 09:01:28 crc kubenswrapper[4732]: W0131 09:01:28.415165 4732 reflector.go:561] object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl": failed to list *v1.Secret: secrets "ovn-kubernetes-node-dockercfg-pwtwl" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-ovn-kubernetes": no relationship found between node 'crc' and this object Jan 31 09:01:28 crc kubenswrapper[4732]: E0131 09:01:28.415219 4732 reflector.go:158] "Unhandled Error" err="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-pwtwl\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"ovn-kubernetes-node-dockercfg-pwtwl\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-ovn-kubernetes\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.415426 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.415579 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.415666 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.415869 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.415911 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.416151 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.417603 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62650e77e5971d617a77ae7461d51276fc76e3699dc4f86ebbb9b63ab0d86b60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:28Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.431459 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7423f7d2a82ae276cff0f5a2730b0de7abbc5b28a8bd8983d23fe44c2e9ca3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b075971e7f28f40b90caae8b3681dd283860371a6b3f82c92c421a87a27ccb0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:28Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.448197 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1add5e75-7e0e-4248-9431-256dc477beeb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7bd726ccae341dfff351d4170276b74d4b6b4f34806b7e1fb23ab517201928d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd0e75b86e540472dde92c0b2c29d3af05e052a4e6a01fd03cd16cce663bd5da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6678cf179a4e13991aa6a082456b3497a4665e0999912b8002fb84a0b6ca2ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55d09d44253b8c77894abc55d80843b79509bf7213c0bf8b4d69059d53e76e5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:28Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.460827 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-var-lib-openvswitch\") pod \"ovnkube-node-8mtkt\" (UID: \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.460895 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-host-run-ovn-kubernetes\") pod \"ovnkube-node-8mtkt\" (UID: \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.460942 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-systemd-units\") pod \"ovnkube-node-8mtkt\" (UID: \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.460963 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-ovnkube-script-lib\") pod \"ovnkube-node-8mtkt\" (UID: \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.460984 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8mtkt\" (UID: \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.461011 4732 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-host-run-netns\") pod \"ovnkube-node-8mtkt\" (UID: \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.461028 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-host-slash\") pod \"ovnkube-node-8mtkt\" (UID: \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.461058 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-run-ovn\") pod \"ovnkube-node-8mtkt\" (UID: \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.461078 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-env-overrides\") pod \"ovnkube-node-8mtkt\" (UID: \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.461098 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-run-systemd\") pod \"ovnkube-node-8mtkt\" (UID: \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.461116 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-node-log\") pod \"ovnkube-node-8mtkt\" (UID: \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.461143 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-log-socket\") pod \"ovnkube-node-8mtkt\" (UID: \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.461165 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-ovnkube-config\") pod \"ovnkube-node-8mtkt\" (UID: \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.461185 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-host-kubelet\") pod \"ovnkube-node-8mtkt\" (UID: \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.461207 4732 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-etc-openvswitch\") pod \"ovnkube-node-8mtkt\" (UID: \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.461229 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-host-cni-bin\") pod \"ovnkube-node-8mtkt\" (UID: \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.461259 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-host-cni-netd\") pod \"ovnkube-node-8mtkt\" (UID: \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.461280 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jktvz\" (UniqueName: \"kubernetes.io/projected/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-kube-api-access-jktvz\") pod \"ovnkube-node-8mtkt\" (UID: \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.461300 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-ovn-node-metrics-cert\") pod \"ovnkube-node-8mtkt\" (UID: \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.461324 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-run-openvswitch\") pod \"ovnkube-node-8mtkt\" (UID: \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.462450 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bllbs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"80d87332-eaea-4007-a03e-a9a0f744563a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcdpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bllbs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:28Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.482789 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:28Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.497066 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 08:13:55.832653555 +0000 UTC Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.499115 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:28Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.511350 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nsgpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"533741c8-f72a-4834-ad02-d33fc939e529\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjff8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nsgpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:28Z is 
after 2025-08-24T17:21:41Z" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.530867 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7423f7d2a82ae276cff0f5a2730b0de7abbc5b28a8bd8983d23fe44c2e9ca3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b075971e7f28f40b90caae8b3681dd283860371a6b3f82c92c421a87a27ccb0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:28Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.547434 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1add5e75-7e0e-4248-9431-256dc477beeb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7bd726ccae341dfff351d4170276b74d4b6b4f34806b7e1fb23ab517201928d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd0e75b86e540472dde92c0b2c29d3af05e052a4e6a01fd03cd16cce663bd5da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6678cf179a4e13991aa6a082456b3497a4665e0999912b8002fb84a0b6ca2ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55d09d44253b8c77894abc55d80843b79509bf7213c0bf8b4d69059d53e76e5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:28Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.562683 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-host-run-netns\") pod \"ovnkube-node-8mtkt\" (UID: \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.562739 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-host-slash\") pod \"ovnkube-node-8mtkt\" (UID: \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.562784 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-run-ovn\") pod \"ovnkube-node-8mtkt\" (UID: \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.562807 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-env-overrides\") pod \"ovnkube-node-8mtkt\" (UID: \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.562825 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-node-log\") pod \"ovnkube-node-8mtkt\" (UID: \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.562844 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-run-systemd\") pod \"ovnkube-node-8mtkt\" (UID: 
\"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.562865 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-log-socket\") pod \"ovnkube-node-8mtkt\" (UID: \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.562886 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-ovnkube-config\") pod \"ovnkube-node-8mtkt\" (UID: \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.562907 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-host-kubelet\") pod \"ovnkube-node-8mtkt\" (UID: \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.562932 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-etc-openvswitch\") pod \"ovnkube-node-8mtkt\" (UID: \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.562949 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-host-cni-bin\") pod \"ovnkube-node-8mtkt\" (UID: \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.562971 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-host-cni-netd\") pod \"ovnkube-node-8mtkt\" (UID: \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.562987 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jktvz\" (UniqueName: \"kubernetes.io/projected/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-kube-api-access-jktvz\") pod \"ovnkube-node-8mtkt\" (UID: \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.563010 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-run-openvswitch\") pod \"ovnkube-node-8mtkt\" (UID: \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.563027 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-ovn-node-metrics-cert\") pod \"ovnkube-node-8mtkt\" (UID: \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.563052 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-var-lib-openvswitch\") pod \"ovnkube-node-8mtkt\" (UID: \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.563069 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-host-run-ovn-kubernetes\") pod \"ovnkube-node-8mtkt\" (UID: \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.563064 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-host-kubelet\") pod \"ovnkube-node-8mtkt\" (UID: \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.563105 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-systemd-units\") pod \"ovnkube-node-8mtkt\" (UID: \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.563132 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-ovnkube-script-lib\") pod \"ovnkube-node-8mtkt\" (UID: \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.563152 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8mtkt\" (UID: \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.563175 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-host-run-netns\") pod \"ovnkube-node-8mtkt\" (UID: \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.563008 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-node-log\") pod \"ovnkube-node-8mtkt\" (UID: \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.563225 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8mtkt\" (UID: \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.563233 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-etc-openvswitch\") pod \"ovnkube-node-8mtkt\" (UID: \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.563267 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-var-lib-openvswitch\") pod \"ovnkube-node-8mtkt\" (UID: \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.563278 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-host-cni-bin\") pod \"ovnkube-node-8mtkt\" (UID: \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.563296 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-host-run-ovn-kubernetes\") pod \"ovnkube-node-8mtkt\" (UID: \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.563321 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-host-cni-netd\") pod \"ovnkube-node-8mtkt\" (UID: \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.563069 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-log-socket\") pod \"ovnkube-node-8mtkt\" (UID: \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.563362 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-systemd-units\") pod \"ovnkube-node-8mtkt\" (UID: \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.563108 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-run-systemd\") pod \"ovnkube-node-8mtkt\" (UID: \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.563692 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-run-openvswitch\") pod \"ovnkube-node-8mtkt\" (UID: \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.563683 4732 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-env-overrides\") pod \"ovnkube-node-8mtkt\" (UID: \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.563802 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-host-slash\") pod \"ovnkube-node-8mtkt\" (UID: \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.564135 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-ovnkube-config\") pod \"ovnkube-node-8mtkt\" (UID: \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.564579 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-run-ovn\") pod \"ovnkube-node-8mtkt\" (UID: \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.564950 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-ovnkube-script-lib\") pod \"ovnkube-node-8mtkt\" (UID: \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.569181 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-ovn-node-metrics-cert\") pod \"ovnkube-node-8mtkt\" (UID: \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.581431 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58b600ed-9e96-4296-bc41-dda5d2205921\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f14a98662959fe8132486d2f648fb443a407e90160ee8c6eac7e2a3d6835b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://771af18934041722ccdbde47137c83e9d93c5e7a07b1e9b26e18354bcd2fda4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2bea5befd94ee4d3b6501273e8b8365102a35cc07b97740fc893f8d70a9d134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9fd5264b5a2dcec6f8b363208d5654f2d71996
60f04be408ee7ab3913d542e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://194a43d5e1409abb031f1b950dfc753454b45008067d784642945ea0dfb087e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://382f1fa043705a0098e27ea34beeee0b88276aa355a7de07bc29b070949b81e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382f1fa043705a0098e27ea34beeee0b88276aa355a7de07bc29b070949b81e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b16d2732d6d6381045b88446a322622a7b1271f08438cb60e57c05c52ef7faf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b16d2732d6d6381045b88446a322622a7b1271f08438cb60e57c05c52ef7faf8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://273f5652d37e2be568bc388a7f9e96c66d0ce9201678b46736986815bd25d6bf\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://273f5652d37e2be568bc388a7f9e96c66d0ce9201678b46736986815bd25d6bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:28Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.584665 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jktvz\" (UniqueName: \"kubernetes.io/projected/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-kube-api-access-jktvz\") pod \"ovnkube-node-8mtkt\" (UID: \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\") " pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.598815 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62650e77e5971d617a77ae7461d51276fc76e3699dc4f86ebbb9b63ab0d86b60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:28Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.613695 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:28Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.627447 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:28Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.638097 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nsgpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"533741c8-f72a-4834-ad02-d33fc939e529\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjff8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nsgpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:28Z is 
after 2025-08-24T17:21:41Z" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.647905 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bllbs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"80d87332-eaea-4007-a03e-a9a0f744563a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcdpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bllbs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:28Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.666368 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with 
incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20
99482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"system
d-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-node-8mtkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:28Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.681424 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1dc686f-fa2d-4cd4-95d4-874df2dc3a1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b5cbd45e357205f8bfc648396c0f684d6788343f687504feb45618d9082e5cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b8ba07937855cb1fd0e1db567f90e08848fca583b7174153918e2d6356d722\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ce42a35726dd1fc6aea0eb2e5baa490de04566cd6d6298185260e896a8def7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@s
ha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c05af9bd69451273c25ddddf6b11cdd7b8f9b6e3e16d18f1c4a962321905696e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c05af9bd69451273c25ddddf6b11cdd7b8f9b6e3e16d18f1c4a962321905696e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0131 09:01:18.226954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:01:18.227544 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1858903392/tls.crt::/tmp/serving-cert-1858903392/tls.key\\\\\\\"\\\\nI0131 09:01:23.404020 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:01:23.413560 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:01:23.413612 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:01:23.413655 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:01:23.413760 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:01:23.423398 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 09:01:23.423434 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:01:23.423441 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:01:23.423448 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:01:23.423452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0131 09:01:23.423448 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 09:01:23.423457 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:01:23.423476 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 09:01:23.426339 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83c2048db4772c1a5f60b21340f773ab5c651045d4dac9edc15e91a19d15dce9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:28Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.693874 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:28Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.708667 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4mxsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e23192f-14db-41ef-af89-4a76e325d9c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsnx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4mxsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:28Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.712720 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-bllbs" event={"ID":"80d87332-eaea-4007-a03e-a9a0f744563a","Type":"ContainerStarted","Data":"a371d8cbedbc18b4953520c3b135e102db92fc57883d31a8f78c510498ca41b0"} Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.714301 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-t9kqf" event={"ID":"e2e6e0f4-2302-447f-a5e0-7db3d7b73cb6","Type":"ContainerStarted","Data":"0c71dc89758bff193a85ceeaa97b21ba6c93389ef9cf380bd0bac89bddddf014"} Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.714334 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-t9kqf" event={"ID":"e2e6e0f4-2302-447f-a5e0-7db3d7b73cb6","Type":"ContainerStarted","Data":"4fd747eef746ab1ce009e8b557e70bc13c1a62dcdbe707f672f388417fa7df31"} Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.716254 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-nsgpk" event={"ID":"533741c8-f72a-4834-ad02-d33fc939e529","Type":"ContainerStarted","Data":"ec4f6f1a4ba8fc3975dc3ef7e86d5b17b0488de7918fe672040f115f640b8569"} Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.717656 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4mxsr" event={"ID":"8e23192f-14db-41ef-af89-4a76e325d9c1","Type":"ContainerStarted","Data":"e6af50b44afec0e5f0757efda1ff5567845c4482609a82bafcbf74947c657c56"} Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.717692 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4mxsr" event={"ID":"8e23192f-14db-41ef-af89-4a76e325d9c1","Type":"ContainerStarted","Data":"3bd339dc0405868e024e6906e67c986248a9fad29768487cedd694d3debc9b7b"} Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.719871 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" event={"ID":"7d790207-d357-4b47-87bf-5b505e061820","Type":"ContainerStarted","Data":"5eaa2b856e1dedd7398d0c513f742fa5925c4748bf44eb15022adbb4b62354c7"} Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.720004 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" event={"ID":"7d790207-d357-4b47-87bf-5b505e061820","Type":"ContainerStarted","Data":"ad1f906b34fddd34efd9d099479cd1fe7404da4ab11f3f571f9e3120167505a5"} Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.720106 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" event={"ID":"7d790207-d357-4b47-87bf-5b505e061820","Type":"ContainerStarted","Data":"a0c0153293b92cb9e29eb17f363783d949abef5081ed954f8b22b5031597406c"} Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.728316 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t9kqf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2e6e0f4-2302-447f-a5e0-7db3d7b73cb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"
/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t9kqf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:28Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.740580 4732 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c063f211d6d06c3f8b94d316d68da346d077c6e85197754ec7c77ad736fd1217\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:28Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.751684 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d790207-d357-4b47-87bf-5b505e061820\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2bw2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2bw2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jnbt8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:28Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.768579 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:28Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.825220 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:28Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.840706 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nsgpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"533741c8-f72a-4834-ad02-d33fc939e529\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec4f6f1a4ba8fc3975dc3ef7e86d5b17b0488de7918fe672040f115f640b8569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjff8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nsgpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-01-31T09:01:28Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.881320 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bllbs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"80d87332-eaea-4007-a03e-a9a0f744563a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a371d8cbedbc18b4953520c3b135e102db92fc57883d31a8f78c510498ca41b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcdpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bllbs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:28Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.927818 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8mtkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:28Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:28 crc kubenswrapper[4732]: I0131 09:01:28.967226 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1dc686f-fa2d-4cd4-95d4-874df2dc3a1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b5cbd45e357205f8bfc648396c0f684d6788343f687504feb45618d9082e5cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b8ba07937855cb1fd0e1db567f90e08848fca583b7174153918e2d6356d722\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ce42a35726dd1fc6aea0eb2e5baa490de04566cd6d6298185260e896a8def7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c05af9bd69451273c25ddddf6b11cdd7b8f9b6e3e16d18f1c4a962321905696e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c05af9bd69451273c25ddddf6b11cdd7b8f9b6e3e16d18f1c4a962321905696e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0131 09:01:18.226954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:01:18.227544 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1858903392/tls.crt::/tmp/serving-cert-1858903392/tls.key\\\\\\\"\\\\nI0131 09:01:23.404020 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:01:23.413560 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:01:23.413612 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:01:23.413655 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:01:23.413760 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:01:23.423398 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 09:01:23.423434 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:01:23.423441 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:01:23.423448 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:01:23.423452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0131 09:01:23.423448 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 09:01:23.423457 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:01:23.423476 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 09:01:23.426339 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83c2048db4772c1a5f60b21340f773ab5c651045d4dac9edc15e91a19d15dce9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:28Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:29 crc kubenswrapper[4732]: I0131 09:01:29.005454 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:29 crc kubenswrapper[4732]: I0131 09:01:29.046526 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4mxsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e23192f-14db-41ef-af89-4a76e325d9c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6af50b44afec0e5f0757efda1ff5567845c4482609a82bafcbf74947c657c56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"syste
m-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsnx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4mxsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:29 crc kubenswrapper[4732]: I0131 09:01:29.087510 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t9kqf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2e6e0f4-2302-447f-a5e0-7db3d7b73cb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c71dc89758bff193a85ceeaa97b21ba6c93389ef9cf380bd0bac89bddddf014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"}
,{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t9kqf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:29 crc kubenswrapper[4732]: I0131 09:01:29.126342 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c063f211d6d06c3f8b94d316d68da346d077c6e85197754ec7c77ad736fd1217\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:29 crc kubenswrapper[4732]: I0131 09:01:29.163732 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d790207-d357-4b47-87bf-5b505e061820\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eaa2b856e1dedd7398d0c513f742fa5925c4748bf44eb15022adbb4b62354c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2bw2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad1f906b34fddd34efd9d099479cd1fe7404da4ab11f3f571f9e3120167505a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2bw2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jnbt8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:29 crc kubenswrapper[4732]: I0131 09:01:29.207137 4732 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1add5e75-7e0e-4248-9431-256dc477beeb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7bd726ccae341dfff351d4170276b74d4b6b4f34806b7e1fb23ab517201928d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd0e75b86e540472dde92c0b2c29d3af05e052a4e6a01fd03cd16cce663bd5da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6678cf179a4e13991aa6a082456b3497a4665e0999912b8002fb84a0b6ca2ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55d09d44253b8c77894abc55d80
843b79509bf7213c0bf8b4d69059d53e76e5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:29 crc kubenswrapper[4732]: I0131 09:01:29.252126 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58b600ed-9e96-4296-bc41-dda5d2205921\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f14a98662959fe8132486d2f648fb443a407e90160ee8c6eac7e2a3d6835b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://771af18934041722ccdbde47
137c83e9d93c5e7a07b1e9b26e18354bcd2fda4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2bea5befd94ee4d3b6501273e8b8365102a35cc07b97740fc893f8d70a9d134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9fd5264b5a2dcec6f8b363208d5654f2d7199660f04be408ee7ab3913d542e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://194a43d5e1409abb031f1b950dfc753454b45008067d784642945ea0dfb087e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://382f1fa043705a0098e27ea34beeee0b88276aa355a7de07bc29b070949b81e2\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382f1fa043705a0098e27ea34beeee0b88276aa355a7de07bc29b070949b81e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b16d2732d6d6381045b88446a322622a7b1271f08438cb60e57c05c52ef7faf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b16d2732d6d6381045b88446a322622a7b1271f08438cb60e57c05c52ef7faf8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://273f5652d37e2be568bc388a7f9e96c66d0ce9201678b46736986815bd25d6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://273f5652d37e2be568bc388a7f9e96c66d0ce9201678b46736986815bd25d6bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:29 crc kubenswrapper[4732]: I0131 09:01:29.288432 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62650e77e5971d617a77ae7461d51276fc76e3699dc4f86ebbb9b63ab0d86b60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:29 crc kubenswrapper[4732]: I0131 09:01:29.324142 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7423f7d2a82ae276cff0f5a2730b0de7abbc5b28a8bd8983d23fe44c2e9ca3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b075971e7f28f40b90caae8b3681dd283860371a6b3f82c92c421a87a27ccb0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:29 crc kubenswrapper[4732]: I0131 09:01:29.381474 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Jan 31 09:01:29 crc kubenswrapper[4732]: I0131 09:01:29.386024 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" Jan 31 09:01:29 crc kubenswrapper[4732]: I0131 09:01:29.498270 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 05:51:44.967293426 +0000 UTC Jan 31 09:01:29 crc kubenswrapper[4732]: I0131 09:01:29.542691 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:01:29 crc kubenswrapper[4732]: E0131 09:01:29.542858 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:01:29 crc kubenswrapper[4732]: I0131 09:01:29.543033 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:01:29 crc kubenswrapper[4732]: E0131 09:01:29.543208 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:01:29 crc kubenswrapper[4732]: I0131 09:01:29.543374 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:01:29 crc kubenswrapper[4732]: E0131 09:01:29.543451 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:01:29 crc kubenswrapper[4732]: I0131 09:01:29.725380 4732 generic.go:334] "Generic (PLEG): container finished" podID="e2e6e0f4-2302-447f-a5e0-7db3d7b73cb6" containerID="0c71dc89758bff193a85ceeaa97b21ba6c93389ef9cf380bd0bac89bddddf014" exitCode=0 Jan 31 09:01:29 crc kubenswrapper[4732]: I0131 09:01:29.725478 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-t9kqf" event={"ID":"e2e6e0f4-2302-447f-a5e0-7db3d7b73cb6","Type":"ContainerDied","Data":"0c71dc89758bff193a85ceeaa97b21ba6c93389ef9cf380bd0bac89bddddf014"} Jan 31 09:01:29 crc kubenswrapper[4732]: I0131 09:01:29.727334 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" event={"ID":"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8","Type":"ContainerStarted","Data":"8ed5be886bc7763adb1d7a0a054a6dd73cde6a707faa32148f1f5ddc889335e4"} Jan 31 09:01:29 crc kubenswrapper[4732]: I0131 09:01:29.746789 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c063f211d6d06c3f8b94d316d68da346d077c6e85197754ec7c77ad736fd1217\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:29 crc kubenswrapper[4732]: I0131 09:01:29.766910 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d790207-d357-4b47-87bf-5b505e061820\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eaa2b856e1dedd7398d0c513f742fa5925c4748bf44eb15022adbb4b62354c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2bw2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad1f906b34fddd34efd9d099479cd1fe7404da4ab11f3f571f9e3120167505a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2bw2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jnbt8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:29 crc kubenswrapper[4732]: I0131 09:01:29.768433 4732 kubelet_node_status.go:401] "Setting 
node annotation to enable volume controller attach/detach" Jan 31 09:01:29 crc kubenswrapper[4732]: I0131 09:01:29.770836 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:29 crc kubenswrapper[4732]: I0131 09:01:29.770879 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:29 crc kubenswrapper[4732]: I0131 09:01:29.770894 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:29 crc kubenswrapper[4732]: I0131 09:01:29.771112 4732 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 31 09:01:29 crc kubenswrapper[4732]: I0131 09:01:29.781673 4732 kubelet_node_status.go:115] "Node was previously registered" node="crc" Jan 31 09:01:29 crc kubenswrapper[4732]: I0131 09:01:29.782057 4732 kubelet_node_status.go:79] "Successfully registered node" node="crc" Jan 31 09:01:29 crc kubenswrapper[4732]: I0131 09:01:29.783271 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:29 crc kubenswrapper[4732]: I0131 09:01:29.783305 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:29 crc kubenswrapper[4732]: I0131 09:01:29.783318 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:29 crc kubenswrapper[4732]: I0131 09:01:29.783341 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:29 crc kubenswrapper[4732]: I0131 09:01:29.783357 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:29Z","lastTransitionTime":"2026-01-31T09:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:29 crc kubenswrapper[4732]: I0131 09:01:29.789980 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1add5e75-7e0e-4248-9431-256dc477beeb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7bd726ccae341dfff351d4170276b74d4b6b4f34806b7e1fb23ab517201928d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd0e75b86e540472dde92c0b2c29d3af05e052a4e6a01fd03cd16cce663bd5da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6678cf179a4e13991aa6a082456b3497a4665e0999912b8002fb84a0b6ca2ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55d09d44253b8c77894abc55d80843b79509bf7213c0bf8b4d69059d53e76e5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:29 crc kubenswrapper[4732]: E0131 09:01:29.805413 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:01:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:01:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:01:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:01:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2761c117-4c0c-4c53-891e-fc7b8fbd4017\\\",\\\"systemUUID\\\":\\\"5f9a5b0b-6336-4588-8df8-98fcbdc2a984\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:29 crc kubenswrapper[4732]: I0131 09:01:29.810280 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:29 crc kubenswrapper[4732]: I0131 09:01:29.810323 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 31 09:01:29 crc kubenswrapper[4732]: I0131 09:01:29.810338 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:29 crc kubenswrapper[4732]: I0131 09:01:29.810358 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:29 crc kubenswrapper[4732]: I0131 09:01:29.810373 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:29Z","lastTransitionTime":"2026-01-31T09:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:29 crc kubenswrapper[4732]: I0131 09:01:29.814040 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58b600ed-9e96-4296-bc41-dda5d2205921\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f14a98662959fe8132486d2f648fb443a407e90160ee8c6eac7e2a3d6835b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://771af18934041722ccdbde47137c83e9d93c5e7a07b1e9b26e18354bcd2fda4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resourc
es\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2bea5befd94ee4d3b6501273e8b8365102a35cc07b97740fc893f8d70a9d134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9fd5264b5a2dcec6f8b363208d5654f2d7199660f04be408ee7ab3913d542e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://194a43d5e1409abb031f1b950dfc753454b45008067d784642945ea0dfb087e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://382f1fa043705a0098e27ea34beeee0b88276aa355a7de07bc29b070949b81e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382f1fa043705a0098e27ea34beeee0b88276aa355a7de07bc29b070949b81e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b16d2732d6d6381045b88446a322622a7b1271f08438cb60e57c05c52ef7faf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b16d2732d6d6381045b88446a322622a7b1271f08438cb60e57c05c52ef7faf8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://273f5652d37e2be568bc388a7f9e96c66d0ce9201678b46736986815bd25d6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://273f5652d37e2be568bc388a7f9e96c66d0ce9201678b46736986815bd25d6bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:29 crc kubenswrapper[4732]: E0131 09:01:29.823781 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:01:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:01:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:01:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:01:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\
"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":45063
7738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2761c117-4c0c-4c53-891e-fc7b8fbd4017\\\",\\\"systemUUID\\\":\\\"5f9a5b0b-6336-4588-8df8-98fcbdc2a984\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:29 crc kubenswrapper[4732]: I0131 09:01:29.828818 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62650e77e5971d617a77ae7461d51276fc76e3699dc4f86ebbb9b63ab0d86b60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:29 crc kubenswrapper[4732]: I0131 09:01:29.830463 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:29 crc kubenswrapper[4732]: I0131 09:01:29.830522 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:29 crc kubenswrapper[4732]: I0131 09:01:29.830532 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:29 crc kubenswrapper[4732]: 
I0131 09:01:29.830554 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:29 crc kubenswrapper[4732]: I0131 09:01:29.830567 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:29Z","lastTransitionTime":"2026-01-31T09:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:29 crc kubenswrapper[4732]: E0131 09:01:29.844715 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:01:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:01:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:01:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:01:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2761c117-4c0c-4c53-891e-fc7b8fbd4017\\\",\\\"systemUUID\\\":\\\"5f9a5b0b-6336-4588-8df8-98fcbdc2a984\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:29 crc kubenswrapper[4732]: I0131 09:01:29.846824 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7423f7d2a82ae276cff0f5a2730b0de7abbc5b28a8bd8983d23fe44c2e9ca3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b075971e7f28f40b90caae8b3681dd283860371a6b3f82c92c421a87a27ccb0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:29 crc kubenswrapper[4732]: I0131 09:01:29.849074 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:29 crc kubenswrapper[4732]: I0131 09:01:29.849122 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:29 crc kubenswrapper[4732]: I0131 09:01:29.849138 4732 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Jan 31 09:01:29 crc kubenswrapper[4732]: I0131 09:01:29.849161 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:29 crc kubenswrapper[4732]: I0131 09:01:29.849172 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:29Z","lastTransitionTime":"2026-01-31T09:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:29 crc kubenswrapper[4732]: I0131 09:01:29.863175 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:29 crc kubenswrapper[4732]: E0131 09:01:29.864072 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:01:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:01:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:01:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:01:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2761c117-4c0c-4c53-891e-fc7b8fbd4017\\\",\\\"systemUUID\\\":\\\"5f9a5b0b-6336-4588-8df8-98fcbdc2a984\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:29 crc kubenswrapper[4732]: I0131 09:01:29.867408 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:29 crc kubenswrapper[4732]: I0131 09:01:29.867447 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 31 09:01:29 crc kubenswrapper[4732]: I0131 09:01:29.867455 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:29 crc kubenswrapper[4732]: I0131 09:01:29.867470 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:29 crc kubenswrapper[4732]: I0131 09:01:29.867480 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:29Z","lastTransitionTime":"2026-01-31T09:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:29 crc kubenswrapper[4732]: I0131 09:01:29.878795 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:29 crc kubenswrapper[4732]: E0131 09:01:29.882312 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:01:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:01:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:01:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:01:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2761c117-4c0c-4c53-891e-fc7b8fbd4017\\\",\\\"systemUUID\\\":\\\"5f9a5b0b-6336-4588-8df8-98fcbdc2a984\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:29 crc kubenswrapper[4732]: E0131 09:01:29.882477 4732 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 31 09:01:29 crc kubenswrapper[4732]: I0131 09:01:29.884433 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Jan 31 09:01:29 crc kubenswrapper[4732]: I0131 09:01:29.884468 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:29 crc kubenswrapper[4732]: I0131 09:01:29.884477 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:29 crc kubenswrapper[4732]: I0131 09:01:29.884493 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:29 crc kubenswrapper[4732]: I0131 09:01:29.884507 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:29Z","lastTransitionTime":"2026-01-31T09:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:29 crc kubenswrapper[4732]: I0131 09:01:29.891856 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nsgpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"533741c8-f72a-4834-ad02-d33fc939e529\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec4f6f1a4ba8fc3975dc3ef7e86d5b17b0488de7918fe672040f115f640b8569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjff8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nsgpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:29 crc kubenswrapper[4732]: I0131 09:01:29.904493 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bllbs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"80d87332-eaea-4007-a03e-a9a0f744563a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a371d8cbedbc18b4953520c3b135e102db92fc57883d31a8f78c510498ca41b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcdpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bllbs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:29 crc kubenswrapper[4732]: I0131 09:01:29.925015 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8mtkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:29 crc kubenswrapper[4732]: I0131 09:01:29.944142 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1dc686f-fa2d-4cd4-95d4-874df2dc3a1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b5cbd45e357205f8bfc648396c0f684d6788343f687504feb45618d9082e5cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b8ba07937855cb1fd0e1db567f90e08848fca583b7174153918e2d6356d722\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ce42a35726dd1fc6aea0eb2e5baa490de04566cd6d6298185260e896a8def7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c05af9bd69451273c25ddddf6b11cdd7b8f9b6e3e16d18f1c4a962321905696e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c05af9bd69451273c25ddddf6b11cdd7b8f9b6e3e16d18f1c4a962321905696e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0131 09:01:18.226954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:01:18.227544 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1858903392/tls.crt::/tmp/serving-cert-1858903392/tls.key\\\\\\\"\\\\nI0131 09:01:23.404020 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:01:23.413560 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:01:23.413612 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:01:23.413655 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:01:23.413760 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:01:23.423398 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 09:01:23.423434 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:01:23.423441 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:01:23.423448 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:01:23.423452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0131 09:01:23.423448 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 09:01:23.423457 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:01:23.423476 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 09:01:23.426339 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83c2048db4772c1a5f60b21340f773ab5c651045d4dac9edc15e91a19d15dce9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:29 crc kubenswrapper[4732]: I0131 09:01:29.957434 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:29 crc kubenswrapper[4732]: I0131 09:01:29.973036 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4mxsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e23192f-14db-41ef-af89-4a76e325d9c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6af50b44afec0e5f0757efda1ff5567845c4482609a82bafcbf74947c657c56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"syste
m-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsnx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4mxsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:29 crc kubenswrapper[4732]: I0131 09:01:29.988762 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:29 crc kubenswrapper[4732]: I0131 09:01:29.988810 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:29 crc kubenswrapper[4732]: I0131 09:01:29.988822 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:29 crc kubenswrapper[4732]: I0131 09:01:29.988842 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:29 crc kubenswrapper[4732]: I0131 09:01:29.988859 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:29Z","lastTransitionTime":"2026-01-31T09:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:29 crc kubenswrapper[4732]: I0131 09:01:29.993878 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t9kqf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2e6e0f4-2302-447f-a5e0-7db3d7b73cb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c71dc89758bff193a85ceeaa97b21ba6c93389ef9cf380bd0bac89bddddf014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c71dc89758bff193a85ceeaa97b21ba6c93389ef9cf380bd0bac89bddddf014\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets
/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t9kqf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:30 crc kubenswrapper[4732]: I0131 09:01:30.091594 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:30 crc kubenswrapper[4732]: I0131 09:01:30.091941 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:30 crc kubenswrapper[4732]: I0131 09:01:30.091955 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:30 crc kubenswrapper[4732]: I0131 09:01:30.091972 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:30 crc kubenswrapper[4732]: I0131 09:01:30.091982 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:30Z","lastTransitionTime":"2026-01-31T09:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:30 crc kubenswrapper[4732]: I0131 09:01:30.194145 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:30 crc kubenswrapper[4732]: I0131 09:01:30.194186 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:30 crc kubenswrapper[4732]: I0131 09:01:30.194197 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:30 crc kubenswrapper[4732]: I0131 09:01:30.194213 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:30 crc kubenswrapper[4732]: I0131 09:01:30.194223 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:30Z","lastTransitionTime":"2026-01-31T09:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:30 crc kubenswrapper[4732]: I0131 09:01:30.297118 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:30 crc kubenswrapper[4732]: I0131 09:01:30.297167 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:30 crc kubenswrapper[4732]: I0131 09:01:30.297177 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:30 crc kubenswrapper[4732]: I0131 09:01:30.297195 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:30 crc kubenswrapper[4732]: I0131 09:01:30.297207 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:30Z","lastTransitionTime":"2026-01-31T09:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:30 crc kubenswrapper[4732]: I0131 09:01:30.399920 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:30 crc kubenswrapper[4732]: I0131 09:01:30.400261 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:30 crc kubenswrapper[4732]: I0131 09:01:30.400271 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:30 crc kubenswrapper[4732]: I0131 09:01:30.400287 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:30 crc kubenswrapper[4732]: I0131 09:01:30.400296 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:30Z","lastTransitionTime":"2026-01-31T09:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:30 crc kubenswrapper[4732]: I0131 09:01:30.499312 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 15:47:34.673103183 +0000 UTC Jan 31 09:01:30 crc kubenswrapper[4732]: I0131 09:01:30.502847 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:30 crc kubenswrapper[4732]: I0131 09:01:30.502895 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:30 crc kubenswrapper[4732]: I0131 09:01:30.502912 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:30 crc kubenswrapper[4732]: I0131 09:01:30.502934 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:30 crc kubenswrapper[4732]: I0131 09:01:30.502948 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:30Z","lastTransitionTime":"2026-01-31T09:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:30 crc kubenswrapper[4732]: I0131 09:01:30.604839 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:30 crc kubenswrapper[4732]: I0131 09:01:30.604889 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:30 crc kubenswrapper[4732]: I0131 09:01:30.604900 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:30 crc kubenswrapper[4732]: I0131 09:01:30.604917 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:30 crc kubenswrapper[4732]: I0131 09:01:30.604931 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:30Z","lastTransitionTime":"2026-01-31T09:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:30 crc kubenswrapper[4732]: I0131 09:01:30.707825 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:30 crc kubenswrapper[4732]: I0131 09:01:30.707870 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:30 crc kubenswrapper[4732]: I0131 09:01:30.707880 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:30 crc kubenswrapper[4732]: I0131 09:01:30.707896 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:30 crc kubenswrapper[4732]: I0131 09:01:30.707907 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:30Z","lastTransitionTime":"2026-01-31T09:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:30 crc kubenswrapper[4732]: I0131 09:01:30.733188 4732 generic.go:334] "Generic (PLEG): container finished" podID="e2e6e0f4-2302-447f-a5e0-7db3d7b73cb6" containerID="c4374e8014d4e67ab8df14785ed6d1bd78b33567550758297fcc04796ea5c688" exitCode=0 Jan 31 09:01:30 crc kubenswrapper[4732]: I0131 09:01:30.733274 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-t9kqf" event={"ID":"e2e6e0f4-2302-447f-a5e0-7db3d7b73cb6","Type":"ContainerDied","Data":"c4374e8014d4e67ab8df14785ed6d1bd78b33567550758297fcc04796ea5c688"} Jan 31 09:01:30 crc kubenswrapper[4732]: I0131 09:01:30.737378 4732 generic.go:334] "Generic (PLEG): container finished" podID="82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8" containerID="cfb08df78497d5ae0eb39af9a06378ea80914d9b4d7014dd85d930f54c733193" exitCode=0 Jan 31 09:01:30 crc kubenswrapper[4732]: I0131 09:01:30.737428 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" event={"ID":"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8","Type":"ContainerDied","Data":"cfb08df78497d5ae0eb39af9a06378ea80914d9b4d7014dd85d930f54c733193"} Jan 31 09:01:30 crc kubenswrapper[4732]: I0131 09:01:30.756538 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\
\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\
",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8mtkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:30Z is after 
2025-08-24T17:21:41Z" Jan 31 09:01:30 crc kubenswrapper[4732]: I0131 09:01:30.773065 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:30Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:30 crc kubenswrapper[4732]: I0131 09:01:30.785940 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:30Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:30 crc kubenswrapper[4732]: I0131 09:01:30.802213 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nsgpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"533741c8-f72a-4834-ad02-d33fc939e529\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec4f6f1a4ba8fc3975dc3ef7e86d5b17b0488de7918fe672040f115f640b8569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjff8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.
11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nsgpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:30Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:30 crc kubenswrapper[4732]: I0131 09:01:30.810799 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:30 crc kubenswrapper[4732]: I0131 09:01:30.811052 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:30 crc kubenswrapper[4732]: I0131 09:01:30.811100 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:30 crc kubenswrapper[4732]: I0131 09:01:30.811125 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:30 crc kubenswrapper[4732]: I0131 09:01:30.811155 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:30Z","lastTransitionTime":"2026-01-31T09:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:30 crc kubenswrapper[4732]: I0131 09:01:30.817136 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bllbs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80d87332-eaea-4007-a03e-a9a0f744563a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a371d8cbedbc18b4953520c3b135e102db92fc57883d31a8f78c510498ca41b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcdpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bllbs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:30Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:30 crc kubenswrapper[4732]: I0131 09:01:30.833787 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1dc686f-fa2d-4cd4-95d4-874df2dc3a1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b5cbd45e357205f8bfc648396c0f684d6788343f687504feb45618d9082e5cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b8ba07937855cb1fd0e1db567f90e08848fca583b7174153918e2d6356d722\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ce42a35726dd1fc6aea0eb2e5baa490de04566cd6d6298185260e896a8def7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c05af9bd69451273c25ddddf6b11cdd7b8f9b6e3e16d18f1c4a962321905696e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c05af9bd69451273c25ddddf6b11cdd7b8f9b6e3e16d18f1c4a962321905696e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0131 09:01:18.226954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:01:18.227544 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1858903392/tls.crt::/tmp/serving-cert-1858903392/tls.key\\\\\\\"\\\\nI0131 09:01:23.404020 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:01:23.413560 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:01:23.413612 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:01:23.413655 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:01:23.413760 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:01:23.423398 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 09:01:23.423434 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:01:23.423441 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:01:23.423448 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:01:23.423452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0131 09:01:23.423448 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 09:01:23.423457 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:01:23.423476 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 09:01:23.426339 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83c2048db4772c1a5f60b21340f773ab5c651045d4dac9edc15e91a19d15dce9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:30Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:30 crc kubenswrapper[4732]: I0131 09:01:30.847603 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:30Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:30 crc kubenswrapper[4732]: I0131 09:01:30.873041 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4mxsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e23192f-14db-41ef-af89-4a76e325d9c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6af50b44afec0e5f0757efda1ff5567845c4482609a82bafcbf74947c657c56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"syste
m-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsnx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4mxsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:30Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:30 crc kubenswrapper[4732]: I0131 09:01:30.897152 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t9kqf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2e6e0f4-2302-447f-a5e0-7db3d7b73cb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c71dc89758bff193a85ceeaa97b21ba6c93389ef9cf380bd0bac89bddddf014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c71dc89758bff193a85ceeaa97b21ba6c93389ef9cf380bd0bac89bddddf014\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4374e8014d4e67ab8df14785ed6d1bd78b33567550758297fcc04796ea5c688\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4374e8014d4e67ab8df14785ed6d1bd78b33567550758297fcc04796ea5c688\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-
31T09:01:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t9kqf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:30Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:30 crc kubenswrapper[4732]: I0131 09:01:30.918087 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:30 crc kubenswrapper[4732]: I0131 09:01:30.918137 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:30 crc kubenswrapper[4732]: I0131 09:01:30.918151 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:30 crc kubenswrapper[4732]: I0131 09:01:30.918171 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:30 crc kubenswrapper[4732]: I0131 09:01:30.918184 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:30Z","lastTransitionTime":"2026-01-31T09:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:30 crc kubenswrapper[4732]: I0131 09:01:30.928623 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c063f211d6d06c3f8b94d316d68da346d077c6e85197754ec7c77ad736fd1217\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:30Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:30 crc kubenswrapper[4732]: I0131 09:01:30.944959 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d790207-d357-4b47-87bf-5b505e061820\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eaa2b856e1dedd7398d0c513f742fa5925c4748bf44eb15022adbb4b62354c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2bw2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad1f906b34fddd34efd9d099479cd1fe7404da4ab11f3f571f9e3120167505a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2bw2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jnbt8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:30Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:30 crc kubenswrapper[4732]: I0131 09:01:30.959974 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62650e77e5971d617a77ae7461d51276fc76e3699dc4f86ebbb9b63ab0d86b60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:30Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:30 crc kubenswrapper[4732]: I0131 09:01:30.981009 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7423f7d2a82ae276cff0f5a2730b0de7abbc5b28a8bd8983d23fe44c2e9ca3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b075971e7f28f40b90caae8b3681dd283860371a6b3f82c92c421a87a27ccb0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:30Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:30 crc kubenswrapper[4732]: I0131 09:01:30.996714 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1add5e75-7e0e-4248-9431-256dc477beeb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7bd726ccae341dfff351d4170276b74d4b6b4f34806b7e1fb23ab517201928d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd0e75b86e540472dde92c0b2c29d3af05e052a4e6a01fd03cd16cce663bd5da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6678cf179a4e13991aa6a082456b3497a4665e0999912b8002fb84a0b6ca2ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55d09d44253b8c77894abc55d80843b79509bf7213c0bf8b4d69059d53e76e5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:30Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.018151 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58b600ed-9e96-4296-bc41-dda5d2205921\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f14a98662959fe8132486d2f648fb443a407e90160ee8c6eac7e2a3d6835b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://771af18934041722ccdbde47137c83e9d93c5e7a07b1e9b26e18354bcd2fda4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07
b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2bea5befd94ee4d3b6501273e8b8365102a35cc07b97740fc893f8d70a9d134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9fd5264b5a2dcec6f8b363208d5654f2d7199660f04be408ee7ab3913d542e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://194a43d5e1409abb031f1b950dfc753454b45008067d784642945ea0dfb087e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://382f1fa043705a0098e27ea34beeee0b88276aa355a7de07bc29b070949b81e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\
\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382f1fa043705a0098e27ea34beeee0b88276aa355a7de07bc29b070949b81e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b16d2732d6d6381045b88446a322622a7b1271f08438cb60e57c05c52ef7faf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b16d2732d6d6381045b88446a322622a7b1271f08438cb60e57c05c52ef7faf8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://273f5652d37e2be568bc388a7f9e96c66d0ce9201678b46736986815bd25d6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://273f5652d37e2be568bc388a7f9e96c66d0ce9201678b46736986815bd25d6bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:31Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.020136 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.020171 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.020183 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.020202 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.020214 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:31Z","lastTransitionTime":"2026-01-31T09:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.038792 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1add5e75-7e0e-4248-9431-256dc477beeb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7bd726ccae341dfff351d4170276b74d4b6b4f34806b7e1fb23ab517201928d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd0e75b86e540472dde92c0b2c29d3af05e052a4e6a01fd03cd16cce663bd5da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6678cf179a4e13991aa6a082456b3497a4665e0999912b8002fb84a0b6ca2ca\\\",\\\"image\\\":\\\"quay.io/cr
cont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55d09d44253b8c77894abc55d80843b79509bf7213c0bf8b4d69059d53e76e5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:31Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.058006 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58b600ed-9e96-4296-bc41-dda5d2205921\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f14a98662959fe8132486d2f648fb443a407e90160ee8c6eac7e2a3d6835b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://771af18934041722ccdbde47137c83e9d93c5e7a07b1e9b26e18354bcd2fda4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2bea5befd94ee4d3b6501273e8b8365102a35cc07b97740fc893f8d70a9d134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9fd5264b5a2dcec6f8b363208d5654f2d71996
60f04be408ee7ab3913d542e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://194a43d5e1409abb031f1b950dfc753454b45008067d784642945ea0dfb087e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://382f1fa043705a0098e27ea34beeee0b88276aa355a7de07bc29b070949b81e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382f1fa043705a0098e27ea34beeee0b88276aa355a7de07bc29b070949b81e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b16d2732d6d6381045b88446a322622a7b1271f08438cb60e57c05c52ef7faf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b16d2732d6d6381045b88446a322622a7b1271f08438cb60e57c05c52ef7faf8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://273f5652d37e2be568bc388a7f9e96c66d0ce9201678b46736986815bd25d6bf\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://273f5652d37e2be568bc388a7f9e96c66d0ce9201678b46736986815bd25d6bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:31Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.070354 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62650e77e5971d617a77ae7461d51276fc76e3699dc4f86ebbb9b63ab0d86b60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:31Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.084269 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7423f7d2a82ae276cff0f5a2730b0de7abbc5b28a8bd8983d23fe44c2e9ca3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b075971e7f28f40b90caae8b3681dd283860371a6b3f82c92c421a87a27ccb0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-01-31T09:01:31Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.099278 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:31Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.110495 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:31Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.120919 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nsgpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"533741c8-f72a-4834-ad02-d33fc939e529\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec4f6f1a4ba8fc3975dc3ef7e86d5b17b0488de7918fe672040f115f640b8569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjff8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.
11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nsgpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:31Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.122925 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.122971 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.122984 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.123002 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.123014 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:31Z","lastTransitionTime":"2026-01-31T09:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.135478 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bllbs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80d87332-eaea-4007-a03e-a9a0f744563a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a371d8cbedbc18b4953520c3b135e102db92fc57883d31a8f78c510498ca41b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcdpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bllbs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:31Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.156810 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging 
kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-li
b\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\
\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb08df78497d5ae0eb39af9a06378ea80914d9b4d7014dd85d930f54c733193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfb08df78497d5ae0eb39af9a06378ea80914d9b4d7014dd85d930f54c733193\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":
\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8mtkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:31Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.170323 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1dc686f-fa2d-4cd4-95d4-874df2dc3a1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b5cbd45e357205f8bfc648396c0f684d6788343f687504feb45618d9082e5cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b8ba07937855cb1fd0e1db567f90e08848fca583b7174153918e2d6356d722\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ce42a35726dd1fc6aea0eb2e5baa490de04566cd6d6298185260e896a8def7\
\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c05af9bd69451273c25ddddf6b11cdd7b8f9b6e3e16d18f1c4a962321905696e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c05af9bd69451273c25ddddf6b11cdd7b8f9b6e3e16d18f1c4a962321905696e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0131 09:01:18.226954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:01:18.227544 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1858903392/tls.crt::/tmp/serving-cert-1858903392/tls.key\\\\\\\"\\\\nI0131 09:01:23.404020 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:01:23.413560 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:01:23.413612 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:01:23.413655 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:01:23.413760 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:01:23.423398 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 09:01:23.423434 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:01:23.423441 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:01:23.423448 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:01:23.423452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0131 09:01:23.423448 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 09:01:23.423457 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:01:23.423476 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 09:01:23.426339 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83c2048db4772c1a5f60b21340f773ab5c651045d4dac9edc15e91a19d15dce9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:31Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.184956 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:31Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.201094 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4mxsr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e23192f-14db-41ef-af89-4a76e325d9c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6af50b44afec0e5f0757efda1ff5567845c4482609a82bafcbf74947c657c56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsnx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4mxsr\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:31Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.201263 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.201372 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.201415 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:01:31 crc kubenswrapper[4732]: E0131 09:01:31.201491 4732 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 31 09:01:31 crc kubenswrapper[4732]: E0131 09:01:31.201493 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:01:39.201460931 +0000 UTC m=+37.507337135 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:01:31 crc kubenswrapper[4732]: E0131 09:01:31.201558 4732 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 31 09:01:31 crc kubenswrapper[4732]: E0131 09:01:31.201594 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-31 09:01:39.201564624 +0000 UTC m=+37.507440828 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 31 09:01:31 crc kubenswrapper[4732]: E0131 09:01:31.201618 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-31 09:01:39.201609766 +0000 UTC m=+37.507486040 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.219251 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t9kqf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2e6e0f4-2302-447f-a5e0-7db3d7b73cb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c71dc89758bff193a85ceeaa97b21ba6c93389ef9cf380bd0bac89bddddf014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c71dc89758bff193a85ceeaa97b21ba6c93389ef9cf380bd0bac89bddddf014\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4374e8014d4e67ab8df14785ed6d1bd78b33567550758297fcc04796ea5c688\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4374e8014d4e67ab8df14785ed6d1bd78b33567550758297fcc04796ea5c688\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-
31T09:01:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t9kqf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:31Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.224859 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.224896 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.224906 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.224922 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.224931 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:31Z","lastTransitionTime":"2026-01-31T09:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.235979 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c063f211d6d06c3f8b94d316d68da346d077c6e85197754ec7c77ad736fd1217\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:31Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.246534 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d790207-d357-4b47-87bf-5b505e061820\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eaa2b856e1dedd7398d0c513f742fa5925c4748bf44eb15022adbb4b62354c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2bw2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad1f906b34fddd34efd9d099479cd1fe7404da4ab11f3f571f9e3120167505a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2bw2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jnbt8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:31Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.302961 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.303017 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:01:31 crc kubenswrapper[4732]: E0131 09:01:31.303179 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 31 09:01:31 crc kubenswrapper[4732]: E0131 09:01:31.303200 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 31 09:01:31 crc kubenswrapper[4732]: E0131 09:01:31.303215 4732 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 09:01:31 crc kubenswrapper[4732]: E0131 09:01:31.303211 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 31 09:01:31 crc kubenswrapper[4732]: E0131 09:01:31.303256 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 31 09:01:31 crc kubenswrapper[4732]: E0131 09:01:31.303268 4732 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 09:01:31 crc kubenswrapper[4732]: E0131 09:01:31.303277 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-31 09:01:39.303254913 +0000 UTC m=+37.609131117 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 09:01:31 crc kubenswrapper[4732]: E0131 09:01:31.303329 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-31 09:01:39.303309545 +0000 UTC m=+37.609185749 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.326928 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.326963 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.326973 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.326988 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.326997 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:31Z","lastTransitionTime":"2026-01-31T09:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.430030 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.430074 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.430086 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.430104 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.430116 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:31Z","lastTransitionTime":"2026-01-31T09:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.500372 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 21:36:11.116790955 +0000 UTC Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.533521 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.533555 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.533563 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.533582 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.533592 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:31Z","lastTransitionTime":"2026-01-31T09:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.541806 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.541857 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.541806 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:01:31 crc kubenswrapper[4732]: E0131 09:01:31.541961 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:01:31 crc kubenswrapper[4732]: E0131 09:01:31.542017 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:01:31 crc kubenswrapper[4732]: E0131 09:01:31.542100 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.637106 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.637136 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.637146 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.637161 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.637171 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:31Z","lastTransitionTime":"2026-01-31T09:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.743682 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.743747 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.743756 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.743773 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.743784 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:31Z","lastTransitionTime":"2026-01-31T09:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.747257 4732 generic.go:334] "Generic (PLEG): container finished" podID="e2e6e0f4-2302-447f-a5e0-7db3d7b73cb6" containerID="85484d53f8180c3bc83391004e8a7c6b432af09f04f597567e8ffe0b9c2e3d58" exitCode=0 Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.747337 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-t9kqf" event={"ID":"e2e6e0f4-2302-447f-a5e0-7db3d7b73cb6","Type":"ContainerDied","Data":"85484d53f8180c3bc83391004e8a7c6b432af09f04f597567e8ffe0b9c2e3d58"} Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.752534 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" event={"ID":"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8","Type":"ContainerStarted","Data":"6b26417bd7817c984a59723e2231e3a43c554022aef67eacf300adee0166c3db"} Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.752596 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" event={"ID":"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8","Type":"ContainerStarted","Data":"c79f404e5bc3d6dc83681d68605b1f26565813ccec34c6d36ee9f21686ca4f20"} Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.752611 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" event={"ID":"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8","Type":"ContainerStarted","Data":"8fde05e46badc11b2174b88e2590402436bb669bcbd017ceb4d11d25066b421b"} Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.752625 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" event={"ID":"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8","Type":"ContainerStarted","Data":"8521ee1e84f540faa7b093ecd0bb7761ae38487a6c7abf52028fe820ea9a4c1a"} Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.752636 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" event={"ID":"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8","Type":"ContainerStarted","Data":"075418297d972e23003b3ce2016f7709d138316c9d08837dc484dadf4d7f4a81"} Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.752648 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" event={"ID":"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8","Type":"ContainerStarted","Data":"792e702a9fa66ea68366e8f88d976f7e280223815c4b70258566c32faaed57a8"} Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.768448 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1add5e75-7e0e-4248-9431-256dc477beeb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7bd726ccae341dfff351d4170276b74d4b6b4f34806b7e1fb23ab517201928d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd0e75b86e540472dde92c0b2c29d3af05e052a4e6a01fd03cd16cce663bd5da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6678cf179a4e13991aa6a082456b3497a4665e0999912b8002fb84a0b6ca2ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55d09d44253b8c77894abc55d80843b79509bf7213c0bf8b4d69059d53e76e5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:31Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.797639 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58b600ed-9e96-4296-bc41-dda5d2205921\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f14a98662959fe8132486d2f648fb443a407e90160ee8c6eac7e2a3d6835b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://771af18934041722ccdbde47137c83e9d93c5e7a07b1e9b26e18354bcd2fda4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07
b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2bea5befd94ee4d3b6501273e8b8365102a35cc07b97740fc893f8d70a9d134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9fd5264b5a2dcec6f8b363208d5654f2d7199660f04be408ee7ab3913d542e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://194a43d5e1409abb031f1b950dfc753454b45008067d784642945ea0dfb087e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://382f1fa043705a0098e27ea34beeee0b88276aa355a7de07bc29b070949b81e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\
\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382f1fa043705a0098e27ea34beeee0b88276aa355a7de07bc29b070949b81e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b16d2732d6d6381045b88446a322622a7b1271f08438cb60e57c05c52ef7faf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b16d2732d6d6381045b88446a322622a7b1271f08438cb60e57c05c52ef7faf8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://273f5652d37e2be568bc388a7f9e96c66d0ce9201678b46736986815bd25d6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://273f5652d37e2be568bc388a7f9e96c66d0ce9201678b46736986815bd25d6bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:31Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.814336 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62650e77e5971d617a77ae7461d51276fc76e3699dc4f86ebbb9b63ab0d86b60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:31Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.826998 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7423f7d2a82ae276cff0f5a2730b0de7abbc5b28a8bd8983d23fe44c2e9ca3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b075971e7f28f40b90caae8b3681dd283860371a6b3f82c92c421a87a27ccb0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:31Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.844512 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:31Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.846350 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.846392 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.846402 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.846417 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.846426 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:31Z","lastTransitionTime":"2026-01-31T09:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.855924 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:31Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.865841 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nsgpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"533741c8-f72a-4834-ad02-d33fc939e529\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec4f6f1a4ba8fc3975dc3ef7e86d5b17b0488de7918fe672040f115f640b8569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjff8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nsgpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:31Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.877282 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bllbs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80d87332-eaea-4007-a03e-a9a0f744563a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a371d8cbedbc18b4953520c3b135e102db92fc57883d31a8f78c510498ca41b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcdpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bllbs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:31Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.902879 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging 
kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-li
b\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\
\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb08df78497d5ae0eb39af9a06378ea80914d9b4d7014dd85d930f54c733193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfb08df78497d5ae0eb39af9a06378ea80914d9b4d7014dd85d930f54c733193\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":
\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8mtkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:31Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.917640 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1dc686f-fa2d-4cd4-95d4-874df2dc3a1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b5cbd45e357205f8bfc648396c0f684d6788343f687504feb45618d9082e5cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b8ba07937855cb1fd0e1db567f90e08848fca583b7174153918e2d6356d722\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ce42a35726dd1fc6aea0eb2e5baa490de04566cd6d6298185260e896a8def7\
\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c05af9bd69451273c25ddddf6b11cdd7b8f9b6e3e16d18f1c4a962321905696e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c05af9bd69451273c25ddddf6b11cdd7b8f9b6e3e16d18f1c4a962321905696e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0131 09:01:18.226954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:01:18.227544 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1858903392/tls.crt::/tmp/serving-cert-1858903392/tls.key\\\\\\\"\\\\nI0131 09:01:23.404020 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:01:23.413560 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:01:23.413612 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:01:23.413655 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:01:23.413760 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:01:23.423398 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 09:01:23.423434 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:01:23.423441 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:01:23.423448 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:01:23.423452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0131 09:01:23.423448 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 09:01:23.423457 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:01:23.423476 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 09:01:23.426339 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83c2048db4772c1a5f60b21340f773ab5c651045d4dac9edc15e91a19d15dce9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:31Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.928620 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:31Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.941910 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4mxsr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e23192f-14db-41ef-af89-4a76e325d9c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6af50b44afec0e5f0757efda1ff5567845c4482609a82bafcbf74947c657c56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsnx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4mxsr\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:31Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.948372 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.948419 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.948429 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.948455 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.948470 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:31Z","lastTransitionTime":"2026-01-31T09:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.957977 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t9kqf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2e6e0f4-2302-447f-a5e0-7db3d7b73cb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c71dc89758bff193a85ceeaa97b21ba6c93389ef9cf380bd0bac89bddddf014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c71dc89758bff193a85ceeaa97b21ba6c93389ef9cf380bd0bac89bddddf014\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4374e8014d4e67ab8df14785ed6d1bd78b33567550758297fcc04796ea5c688\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4374e8014d4e67ab8df14785ed6d1bd78b33567550758297fcc04796ea5c688\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85484d53f8180c3bc83391004e8a7c6b432af09f04f597567e8ffe0b9c2e3d58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85484d53f8180c3bc83391004e8a7c6b432af09f04f597567e8ffe0b9c2e3d58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/
cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t9kqf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:31Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.969671 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c063f211d6d06c3f8b94d316d68da346d077c6e85197754ec7c77ad736fd1217\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:31Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:31 crc kubenswrapper[4732]: I0131 09:01:31.979792 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d790207-d357-4b47-87bf-5b505e061820\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eaa2b856e1dedd7398d0c513f742fa5925c4748bf44eb15022adbb4b62354c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2bw2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad1f906b34fddd34efd9d099479cd1fe7404da4ab11f3f571f9e3120167505a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2bw2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jnbt8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:31Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:32 crc kubenswrapper[4732]: I0131 09:01:32.051855 4732 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:32 crc kubenswrapper[4732]: I0131 09:01:32.051892 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:32 crc kubenswrapper[4732]: I0131 09:01:32.051902 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:32 crc kubenswrapper[4732]: I0131 09:01:32.051921 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:32 crc kubenswrapper[4732]: I0131 09:01:32.051933 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:32Z","lastTransitionTime":"2026-01-31T09:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:32 crc kubenswrapper[4732]: I0131 09:01:32.153779 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:32 crc kubenswrapper[4732]: I0131 09:01:32.153820 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:32 crc kubenswrapper[4732]: I0131 09:01:32.153832 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:32 crc kubenswrapper[4732]: I0131 09:01:32.153851 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:32 crc kubenswrapper[4732]: I0131 09:01:32.153863 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:32Z","lastTransitionTime":"2026-01-31T09:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:32 crc kubenswrapper[4732]: I0131 09:01:32.255888 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:32 crc kubenswrapper[4732]: I0131 09:01:32.255960 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:32 crc kubenswrapper[4732]: I0131 09:01:32.255972 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:32 crc kubenswrapper[4732]: I0131 09:01:32.255992 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:32 crc kubenswrapper[4732]: I0131 09:01:32.256001 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:32Z","lastTransitionTime":"2026-01-31T09:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:32 crc kubenswrapper[4732]: I0131 09:01:32.268376 4732 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Jan 31 09:01:32 crc kubenswrapper[4732]: I0131 09:01:32.358925 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:32 crc kubenswrapper[4732]: I0131 09:01:32.358973 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:32 crc kubenswrapper[4732]: I0131 09:01:32.358990 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:32 crc kubenswrapper[4732]: I0131 09:01:32.359013 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:32 crc kubenswrapper[4732]: I0131 09:01:32.359029 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:32Z","lastTransitionTime":"2026-01-31T09:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:32 crc kubenswrapper[4732]: I0131 09:01:32.462330 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:32 crc kubenswrapper[4732]: I0131 09:01:32.462386 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:32 crc kubenswrapper[4732]: I0131 09:01:32.462399 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:32 crc kubenswrapper[4732]: I0131 09:01:32.462420 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:32 crc kubenswrapper[4732]: I0131 09:01:32.462435 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:32Z","lastTransitionTime":"2026-01-31T09:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:32 crc kubenswrapper[4732]: I0131 09:01:32.501004 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 17:46:24.768149534 +0000 UTC Jan 31 09:01:32 crc kubenswrapper[4732]: I0131 09:01:32.557632 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1add5e75-7e0e-4248-9431-256dc477beeb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7bd726ccae341dfff351d4170276b74d4b6b4f34806b7e1fb23ab517201928d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd0e75b86e540472dde92c0b2c29d3af05e052a4e6a01fd03cd16cce663bd5da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6678cf179a4e13991aa6a082456b3497a4665e0999912b8002fb84a0b6ca2ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCoun
t\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55d09d44253b8c77894abc55d80843b79509bf7213c0bf8b4d69059d53e76e5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:32Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:32 crc kubenswrapper[4732]: I0131 09:01:32.570185 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:32 crc kubenswrapper[4732]: I0131 09:01:32.570226 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:32 crc kubenswrapper[4732]: I0131 09:01:32.570238 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:32 crc kubenswrapper[4732]: I0131 09:01:32.570257 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:32 crc kubenswrapper[4732]: I0131 09:01:32.570269 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:32Z","lastTransitionTime":"2026-01-31T09:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:32 crc kubenswrapper[4732]: I0131 09:01:32.583776 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58b600ed-9e96-4296-bc41-dda5d2205921\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f14a98662959fe8132486d2f648fb443a407e90160ee8c6eac7e2a3d6835b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://771af18934041722ccdbde47137c83e9d93c5e7a07b1e9b26e18354bcd2fda4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2bea5befd94ee4d3b6501273e8b8365102a35cc07b97740fc893f8d70a9d134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9fd5264b5a2dcec6f8b363208d5654f2d7199660f04be408ee7ab3913d542e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://194a43d5e1409abb031f1b950dfc753454b45008067d784642945ea0dfb087e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://382f1fa043705a0098e27ea34beeee0b88276aa355a7de07bc29b070949b81e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382f1fa043705a0098e27ea34beeee0b88276aa355a7de07bc29b070949b81e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b16d2732d6d6381045b88446a322622a7b1271f08438cb60e57c05c52ef7faf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b16d2732d6d6381045b88446a322622a7b1271f08438cb60e57c05c52ef7faf8\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://273f5652d37e2be568bc388a7f9e96c66d0ce9201678b46736986815bd25d6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://273f5652d37e2be568bc388a7f9e96c66d0ce9201678b46736986815bd25d6bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:32Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:32 crc kubenswrapper[4732]: I0131 09:01:32.602631 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62650e77e5971d617a77ae7461d51276fc76e3699dc4f86ebbb9b63ab0d86b60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:32Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:32 crc kubenswrapper[4732]: I0131 09:01:32.617674 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7423f7d2a82ae276cff0f5a2730b0de7abbc5b28a8bd8983d23fe44c2e9ca3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b075971e7f28f40b90caae8b3681dd283860371a6b3f82c92c421a87a27ccb0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:32Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:32 crc kubenswrapper[4732]: I0131 09:01:32.630839 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:32Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:32 crc kubenswrapper[4732]: I0131 09:01:32.651250 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:32Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:32 crc kubenswrapper[4732]: I0131 09:01:32.664865 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nsgpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"533741c8-f72a-4834-ad02-d33fc939e529\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec4f6f1a4ba8fc3975dc3ef7e86d5b17b0488de7918fe672040f115f640b8569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjff8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.
11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nsgpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:32Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:32 crc kubenswrapper[4732]: I0131 09:01:32.672219 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:32 crc kubenswrapper[4732]: I0131 09:01:32.672244 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:32 crc kubenswrapper[4732]: I0131 09:01:32.672252 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:32 crc kubenswrapper[4732]: I0131 09:01:32.672267 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:32 crc kubenswrapper[4732]: I0131 09:01:32.672276 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:32Z","lastTransitionTime":"2026-01-31T09:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:32 crc kubenswrapper[4732]: I0131 09:01:32.677163 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bllbs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80d87332-eaea-4007-a03e-a9a0f744563a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a371d8cbedbc18b4953520c3b135e102db92fc57883d31a8f78c510498ca41b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcdpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bllbs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:32Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:32 crc kubenswrapper[4732]: I0131 09:01:32.694452 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging 
kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-li
b\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\
\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb08df78497d5ae0eb39af9a06378ea80914d9b4d7014dd85d930f54c733193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfb08df78497d5ae0eb39af9a06378ea80914d9b4d7014dd85d930f54c733193\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":
\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8mtkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:32Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:32 crc kubenswrapper[4732]: I0131 09:01:32.713408 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1dc686f-fa2d-4cd4-95d4-874df2dc3a1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b5cbd45e357205f8bfc648396c0f684d6788343f687504feb45618d9082e5cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b8ba07937855cb1fd0e1db567f90e08848fca583b7174153918e2d6356d722\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ce42a35726dd1fc6aea0eb2e5baa490de04566cd6d6298185260e896a8def7\
\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c05af9bd69451273c25ddddf6b11cdd7b8f9b6e3e16d18f1c4a962321905696e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c05af9bd69451273c25ddddf6b11cdd7b8f9b6e3e16d18f1c4a962321905696e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0131 09:01:18.226954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:01:18.227544 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1858903392/tls.crt::/tmp/serving-cert-1858903392/tls.key\\\\\\\"\\\\nI0131 09:01:23.404020 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:01:23.413560 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:01:23.413612 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:01:23.413655 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:01:23.413760 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:01:23.423398 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 09:01:23.423434 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:01:23.423441 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:01:23.423448 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:01:23.423452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0131 09:01:23.423448 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 09:01:23.423457 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:01:23.423476 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 09:01:23.426339 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83c2048db4772c1a5f60b21340f773ab5c651045d4dac9edc15e91a19d15dce9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:32Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:32 crc kubenswrapper[4732]: I0131 09:01:32.730429 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:32Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:32 crc kubenswrapper[4732]: I0131 09:01:32.748772 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4mxsr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e23192f-14db-41ef-af89-4a76e325d9c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6af50b44afec0e5f0757efda1ff5567845c4482609a82bafcbf74947c657c56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsnx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4mxsr\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:32Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:32 crc kubenswrapper[4732]: I0131 09:01:32.758933 4732 generic.go:334] "Generic (PLEG): container finished" podID="e2e6e0f4-2302-447f-a5e0-7db3d7b73cb6" containerID="111b50c40cf72f105ae5c74d668673853a783050a62e89127fa666a6f8d431d0" exitCode=0 Jan 31 09:01:32 crc kubenswrapper[4732]: I0131 09:01:32.758986 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-t9kqf" event={"ID":"e2e6e0f4-2302-447f-a5e0-7db3d7b73cb6","Type":"ContainerDied","Data":"111b50c40cf72f105ae5c74d668673853a783050a62e89127fa666a6f8d431d0"} Jan 31 09:01:32 crc kubenswrapper[4732]: I0131 09:01:32.769149 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t9kqf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2e6e0f4-2302-447f-a5e0-7db3d7b73cb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c71dc89758bff193a85ceeaa97b21ba6c93389ef9cf380bd0bac89bddddf014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c71dc89758bff193a85ceeaa97b21ba6c93389ef9cf380bd0bac89bddddf014\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4374e8014d4e67ab8df14785ed6d1bd78b33567550758297fcc04796ea5c688\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4374e8014d4e67ab8df14785ed6d1bd78b33567550758297fcc04796ea5c688\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85484d53f8180c3bc83391004e8a7c6b432af09f04f597567e8ffe0b9c2e3d58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85484d53f8180c3bc83391004e8a7c6b432af09f04f597567e8ffe0b9c2e3d58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/
cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t9kqf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:32Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:32 crc kubenswrapper[4732]: I0131 09:01:32.774442 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:32 crc kubenswrapper[4732]: I0131 09:01:32.774488 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:32 crc kubenswrapper[4732]: I0131 09:01:32.774499 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:32 crc kubenswrapper[4732]: I0131 09:01:32.774514 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:32 crc kubenswrapper[4732]: I0131 09:01:32.774524 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:32Z","lastTransitionTime":"2026-01-31T09:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:32 crc kubenswrapper[4732]: I0131 09:01:32.783968 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c063f211d6d06c3f8b94d316d68da346d077c6e85197754ec7c77ad736fd1217\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:32Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:32 crc kubenswrapper[4732]: I0131 09:01:32.799257 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d790207-d357-4b47-87bf-5b505e061820\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eaa2b856e1dedd7398d0c513f742fa5925c4748bf44eb15022adbb4b62354c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2bw2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad1f906b34fddd34efd9d099479cd1fe7404da4ab11f3f571f9e3120167505a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2bw2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jnbt8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:32Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:32 crc kubenswrapper[4732]: I0131 09:01:32.813687 4732 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c063f211d6d06c3f8b94d316d68da346d077c6e85197754ec7c77ad736fd1217\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:32Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:32 crc kubenswrapper[4732]: I0131 09:01:32.826504 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d790207-d357-4b47-87bf-5b505e061820\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eaa2b856e1dedd7398d0c513f742fa5925c4748bf44eb15022adbb4b62354c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2bw2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad1f906b34fddd34efd9d099479cd1fe7404da4ab11f3f571f9e3120167505a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2bw2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jnbt8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:32Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:32 crc kubenswrapper[4732]: I0131 09:01:32.841339 4732 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62650e77e5971d617a77ae7461d51276fc76e3699dc4f86ebbb9b63ab0d86b60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:32Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:32 crc kubenswrapper[4732]: I0131 09:01:32.854272 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7423f7d2a82ae276cff0f5a2730b0de7abbc5b28a8bd8983d23fe44c2e9ca3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b075971e7f28f40b90caae8b3681dd283860371a6b3f82c92c421a87a27ccb0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:32Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:32 crc kubenswrapper[4732]: I0131 09:01:32.865500 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1add5e75-7e0e-4248-9431-256dc477beeb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7bd726ccae341dfff351d4170276b74d4b6b4f34806b7e1fb23ab517201928d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd0e75b86e540472dde92c0b2c29d3af05e052a4e6a01fd03cd16cce663bd5da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6678cf179a4e13991aa6a082456b3497a4665e0999912b8002fb84a0b6ca2ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55d09d44253b8c77894abc55d80843b79509bf7213c0bf8b4d69059d53e76e5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:32Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:32 crc kubenswrapper[4732]: I0131 09:01:32.876467 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:32 crc kubenswrapper[4732]: I0131 09:01:32.876492 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:32 crc kubenswrapper[4732]: I0131 09:01:32.876500 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:32 crc kubenswrapper[4732]: I0131 09:01:32.876514 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:32 crc kubenswrapper[4732]: I0131 09:01:32.876523 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:32Z","lastTransitionTime":"2026-01-31T09:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:32 crc kubenswrapper[4732]: I0131 09:01:32.889206 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58b600ed-9e96-4296-bc41-dda5d2205921\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f14a98662959fe8132486d2f648fb443a407e90160ee8c6eac7e2a3d6835b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://771af18934041722ccdbde47137c83e9d93c5e7a07b1e9b26e18354bcd2fda4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2bea5befd94ee4d3b6501273e8b8365102a35cc07b97740fc893f8d70a9d134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9fd5264b5a2dcec6f8b363208d5654f2d7199660f04be408ee7ab3913d542e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://194a43d5e1409abb031f1b950dfc753454b45008067d784642945ea0dfb087e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://382f1fa043705a0098e27ea34beeee0b88276aa355a7de07bc29b070949b81e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382f1fa043705a0098e27ea34beeee0b88276aa355a7de07bc29b070949b81e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b16d2732d6d6381045b88446a322622a7b1271f08438cb60e57c05c52ef7faf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b16d2732d6d6381045b88446a322622a7b1271f08438cb60e57c05c52ef7faf8\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://273f5652d37e2be568bc388a7f9e96c66d0ce9201678b46736986815bd25d6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://273f5652d37e2be568bc388a7f9e96c66d0ce9201678b46736986815bd25d6bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:32Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:32 crc kubenswrapper[4732]: I0131 09:01:32.907433 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb08df78497d5ae0eb39af9a06378ea80914d9b4d7014dd85d930f54c733193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfb08df78497d5ae0eb39af9a06378ea80914d9b4d7014dd85d930f54c733193\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8mtkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:32Z 
is after 2025-08-24T17:21:41Z" Jan 31 09:01:32 crc kubenswrapper[4732]: I0131 09:01:32.919310 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:32Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:32 crc kubenswrapper[4732]: I0131 09:01:32.932495 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:32Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:32 crc kubenswrapper[4732]: I0131 09:01:32.944461 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nsgpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"533741c8-f72a-4834-ad02-d33fc939e529\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec4f6f1a4ba8fc3975dc3ef7e86d5b17b0488de7918fe672040f115f640b8569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjff8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.
11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nsgpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:32Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:32 crc kubenswrapper[4732]: I0131 09:01:32.955818 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bllbs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"80d87332-eaea-4007-a03e-a9a0f744563a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a371d8cbedbc18b4953520c3b135e102db92fc57883d31a8f78c510498ca41b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcdpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bllbs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:32Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:32 crc kubenswrapper[4732]: I0131 09:01:32.975912 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1dc686f-fa2d-4cd4-95d4-874df2dc3a1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b5cbd45e357205f8bfc648396c0f684d6788343f687504feb45618d9082e5cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b8ba07937855cb1fd0e1db567f90e08848fca583b7174153918e2d6356d722\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ce42a35726dd1fc6aea0eb2e5baa490de04566cd6d6298185260e896a8def7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c05af9bd69451273c25ddddf6b11cdd7b8f9b6e3e16d18f1c4a962321905696e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c05af9bd69451273c25ddddf6b11cdd7b8f9b6e3e16d18f1c4a962321905696e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0131 09:01:18.226954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:01:18.227544 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1858903392/tls.crt::/tmp/serving-cert-1858903392/tls.key\\\\\\\"\\\\nI0131 09:01:23.404020 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:01:23.413560 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:01:23.413612 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:01:23.413655 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:01:23.413760 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:01:23.423398 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 09:01:23.423434 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:01:23.423441 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:01:23.423448 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:01:23.423452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0131 09:01:23.423448 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 09:01:23.423457 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:01:23.423476 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 09:01:23.426339 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83c2048db4772c1a5f60b21340f773ab5c651045d4dac9edc15e91a19d15dce9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:32Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:32 crc kubenswrapper[4732]: I0131 09:01:32.981673 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:32 crc kubenswrapper[4732]: I0131 09:01:32.981724 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:32 crc kubenswrapper[4732]: I0131 09:01:32.981734 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:32 crc kubenswrapper[4732]: I0131 09:01:32.981753 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:32 crc kubenswrapper[4732]: I0131 09:01:32.981765 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:32Z","lastTransitionTime":"2026-01-31T09:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:32 crc kubenswrapper[4732]: I0131 09:01:32.997762 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:32Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:33 crc kubenswrapper[4732]: I0131 09:01:33.019598 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4mxsr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e23192f-14db-41ef-af89-4a76e325d9c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6af50b44afec0e5f0757efda1ff5567845c4482609a82bafcbf74947c657c56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsnx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4mxsr\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:33Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:33 crc kubenswrapper[4732]: I0131 09:01:33.056124 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t9kqf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2e6e0f4-2302-447f-a5e0-7db3d7b73cb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c71dc89758bff193a85ceeaa97b21ba6c93389ef9cf380bd0bac89bddddf014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c71dc89758bff193a85ceeaa97b21ba6c93389ef9cf380bd0bac89bddddf014\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},
{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4374e8014d4e67ab8df14785ed6d1bd78b33567550758297fcc04796ea5c688\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4374e8014d4e67ab8df14785ed6d1bd78b33567550758297fcc04796ea5c688\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85484d53f8180c3bc83391004e8a7c6b432af09f04f597567e8ffe0b9c2e3d58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85484d53f8180c3bc83391004e8a7c6b432af09f04f597567e8ffe0b9c2e3d58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://111b50c40cf72f105ae5c74d668673853a783050a62e89127fa666a6f8d431d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"re
startCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://111b50c40cf72f105ae5c74d668673853a783050a62e89127fa666a6f8d431d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t9kqf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:33Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:33 crc kubenswrapper[4732]: I0131 09:01:33.085057 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:33 crc kubenswrapper[4732]: I0131 09:01:33.085114 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:33 crc kubenswrapper[4732]: I0131 09:01:33.085127 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:33 crc kubenswrapper[4732]: I0131 09:01:33.085152 4732 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:33 crc kubenswrapper[4732]: I0131 09:01:33.085169 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:33Z","lastTransitionTime":"2026-01-31T09:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:33 crc kubenswrapper[4732]: I0131 09:01:33.188062 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:33 crc kubenswrapper[4732]: I0131 09:01:33.188122 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:33 crc kubenswrapper[4732]: I0131 09:01:33.188135 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:33 crc kubenswrapper[4732]: I0131 09:01:33.188155 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:33 crc kubenswrapper[4732]: I0131 09:01:33.188168 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:33Z","lastTransitionTime":"2026-01-31T09:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:33 crc kubenswrapper[4732]: I0131 09:01:33.290990 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:33 crc kubenswrapper[4732]: I0131 09:01:33.291040 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:33 crc kubenswrapper[4732]: I0131 09:01:33.291052 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:33 crc kubenswrapper[4732]: I0131 09:01:33.291073 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:33 crc kubenswrapper[4732]: I0131 09:01:33.291087 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:33Z","lastTransitionTime":"2026-01-31T09:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:33 crc kubenswrapper[4732]: I0131 09:01:33.393582 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:33 crc kubenswrapper[4732]: I0131 09:01:33.393636 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:33 crc kubenswrapper[4732]: I0131 09:01:33.393647 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:33 crc kubenswrapper[4732]: I0131 09:01:33.393714 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:33 crc kubenswrapper[4732]: I0131 09:01:33.393727 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:33Z","lastTransitionTime":"2026-01-31T09:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:33 crc kubenswrapper[4732]: I0131 09:01:33.496051 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:33 crc kubenswrapper[4732]: I0131 09:01:33.496089 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:33 crc kubenswrapper[4732]: I0131 09:01:33.496098 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:33 crc kubenswrapper[4732]: I0131 09:01:33.496113 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:33 crc kubenswrapper[4732]: I0131 09:01:33.496123 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:33Z","lastTransitionTime":"2026-01-31T09:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:33 crc kubenswrapper[4732]: I0131 09:01:33.501473 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 13:26:06.647242429 +0000 UTC Jan 31 09:01:33 crc kubenswrapper[4732]: I0131 09:01:33.542833 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:01:33 crc kubenswrapper[4732]: I0131 09:01:33.542866 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:01:33 crc kubenswrapper[4732]: I0131 09:01:33.542930 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:01:33 crc kubenswrapper[4732]: E0131 09:01:33.543504 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:01:33 crc kubenswrapper[4732]: E0131 09:01:33.543655 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:01:33 crc kubenswrapper[4732]: E0131 09:01:33.543902 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:01:33 crc kubenswrapper[4732]: I0131 09:01:33.599701 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:33 crc kubenswrapper[4732]: I0131 09:01:33.599757 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:33 crc kubenswrapper[4732]: I0131 09:01:33.599769 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:33 crc kubenswrapper[4732]: I0131 09:01:33.599790 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:33 crc kubenswrapper[4732]: I0131 09:01:33.599804 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:33Z","lastTransitionTime":"2026-01-31T09:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:33 crc kubenswrapper[4732]: I0131 09:01:33.703108 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:33 crc kubenswrapper[4732]: I0131 09:01:33.703201 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:33 crc kubenswrapper[4732]: I0131 09:01:33.703225 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:33 crc kubenswrapper[4732]: I0131 09:01:33.703254 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:33 crc kubenswrapper[4732]: I0131 09:01:33.703276 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:33Z","lastTransitionTime":"2026-01-31T09:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:33 crc kubenswrapper[4732]: I0131 09:01:33.767085 4732 generic.go:334] "Generic (PLEG): container finished" podID="e2e6e0f4-2302-447f-a5e0-7db3d7b73cb6" containerID="2458247a5e615b929a15e280bb9d5a65c46a991ea2cd83502a8c42b1ca0e299b" exitCode=0 Jan 31 09:01:33 crc kubenswrapper[4732]: I0131 09:01:33.767167 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-t9kqf" event={"ID":"e2e6e0f4-2302-447f-a5e0-7db3d7b73cb6","Type":"ContainerDied","Data":"2458247a5e615b929a15e280bb9d5a65c46a991ea2cd83502a8c42b1ca0e299b"} Jan 31 09:01:33 crc kubenswrapper[4732]: I0131 09:01:33.772791 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" event={"ID":"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8","Type":"ContainerStarted","Data":"4cd519944f0ae0ee9920a6a54de3dd40c73a65d0c907f625c57d7648a6d0a715"} Jan 31 09:01:33 crc kubenswrapper[4732]: I0131 09:01:33.784028 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not 
be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:33Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:33 crc kubenswrapper[4732]: I0131 09:01:33.802438 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4mxsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e23192f-14db-41ef-af89-4a76e325d9c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6af50b44afec0e5f0757efda1ff5567845c4482609a82bafcbf74947c657c56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"
/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsnx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4mxsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:33Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:33 crc kubenswrapper[4732]: I0131 09:01:33.806719 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:33 crc kubenswrapper[4732]: I0131 09:01:33.806767 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:33 crc kubenswrapper[4732]: I0131 09:01:33.806777 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:33 crc kubenswrapper[4732]: I0131 09:01:33.806798 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:33 crc kubenswrapper[4732]: I0131 09:01:33.806809 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:33Z","lastTransitionTime":"2026-01-31T09:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:33 crc kubenswrapper[4732]: I0131 09:01:33.819266 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t9kqf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2e6e0f4-2302-447f-a5e0-7db3d7b73cb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c71dc89758bff193a85ceeaa97b21ba6c93389ef9cf380bd0bac89bddddf014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c71dc89758bff193a85ceeaa97b21ba6c93389ef9cf380bd0bac89bddddf014\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhx
t6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4374e8014d4e67ab8df14785ed6d1bd78b33567550758297fcc04796ea5c688\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4374e8014d4e67ab8df14785ed6d1bd78b33567550758297fcc04796ea5c688\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85484d53f8180c3bc83391004e8a7c6b432af09f04f597567e8ffe0b9c2e3d58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85484d53f8180c3bc83391004e8a7c6b432af09f04f597567e8ffe0b9c2e3d58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://111b50c40cf72f105ae5c74d668673853a783050a62e89127fa666a6f8d431d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://111b50c40cf72f105ae5c74d668673853a783050a62e89127fa666a6f8d431d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:31Z\\\",\\\"reason\\\
":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2458247a5e615b929a15e280bb9d5a65c46a991ea2cd83502a8c42b1ca0e299b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2458247a5e615b929a15e280bb9d5a65c46a991ea2cd83502a8c42b1ca0e299b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t9kqf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:33Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:33 crc kubenswrapper[4732]: I0131 09:01:33.834558 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1dc686f-fa2d-4cd4-95d4-874df2dc3a1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b5cbd45e357205f8bfc648396c0f684d6788343f687504feb45618d9082e5cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b8ba07937855cb1fd0e1db567f90e08848fca583b7174153918e2d6356d722\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ce42a35726dd1fc6aea0eb2e5baa490de04566cd6d6298185260e896a8def7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c05af9bd69451273c25ddddf6b11cdd7b8f9b6e3e16d18f1c4a962321905696e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c05af9bd69451273c25ddddf6b11cdd7b8f9b6e3e16d18f1c4a962321905696e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0131 09:01:18.226954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:01:18.227544 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1858903392/tls.crt::/tmp/serving-cert-1858903392/tls.key\\\\\\\"\\\\nI0131 09:01:23.404020 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:01:23.413560 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:01:23.413612 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:01:23.413655 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:01:23.413760 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:01:23.423398 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 09:01:23.423434 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:01:23.423441 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:01:23.423448 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:01:23.423452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0131 09:01:23.423448 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 09:01:23.423457 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:01:23.423476 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 09:01:23.426339 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83c2048db4772c1a5f60b21340f773ab5c651045d4dac9edc15e91a19d15dce9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:33Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:33 crc kubenswrapper[4732]: I0131 09:01:33.845812 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d790207-d357-4b47-87bf-5b505e061820\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eaa2b856e1dedd7398d0c513f742fa5925c4748bf44eb15022adbb4b62354c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2bw2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad1f906b34fddd34efd9d099479cd1fe7404da4ab11f3f571f9e3120167505a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2bw2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jnbt8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:33Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:33 crc kubenswrapper[4732]: I0131 09:01:33.858897 4732 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c063f211d6d06c3f8b94d316d68da346d077c6e85197754ec7c77ad736fd1217\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:33Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:33 crc kubenswrapper[4732]: I0131 09:01:33.887667 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58b600ed-9e96-4296-bc41-dda5d2205921\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f14a98662959fe8132486d2f648fb443a407e90160ee8c6eac7e2a3d6835b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://771af18934041722ccdbde47137c83e9d93c5e7a07b1e9b26e18354bcd2fda4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2bea5befd94ee4d3b6501273e8b8365102a35cc07b97740fc893f8d70a9d134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9fd5264b5a2dcec6f8b363208d5654f2d71996
60f04be408ee7ab3913d542e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://194a43d5e1409abb031f1b950dfc753454b45008067d784642945ea0dfb087e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://382f1fa043705a0098e27ea34beeee0b88276aa355a7de07bc29b070949b81e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382f1fa043705a0098e27ea34beeee0b88276aa355a7de07bc29b070949b81e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b16d2732d6d6381045b88446a322622a7b1271f08438cb60e57c05c52ef7faf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b16d2732d6d6381045b88446a322622a7b1271f08438cb60e57c05c52ef7faf8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://273f5652d37e2be568bc388a7f9e96c66d0ce9201678b46736986815bd25d6bf\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://273f5652d37e2be568bc388a7f9e96c66d0ce9201678b46736986815bd25d6bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:33Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:33 crc kubenswrapper[4732]: I0131 09:01:33.907667 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62650e77e5971d617a77ae7461d51276fc76e3699dc4f86ebbb9b63ab0d86b60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:33Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:33 crc kubenswrapper[4732]: I0131 09:01:33.914161 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:33 crc kubenswrapper[4732]: I0131 09:01:33.914205 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:33 crc kubenswrapper[4732]: I0131 09:01:33.914215 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:33 crc kubenswrapper[4732]: I0131 09:01:33.914235 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:33 crc kubenswrapper[4732]: I0131 09:01:33.914253 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:33Z","lastTransitionTime":"2026-01-31T09:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:33 crc kubenswrapper[4732]: I0131 09:01:33.922888 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7423f7d2a82ae276cff0f5a2730b0de7abbc5b28a8bd8983d23fe44c2e9ca3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b075971e7f28f40b90caae8b3681dd283860371a6b3f82c92c421a87a27ccb0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d7732574532
65a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:33Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:33 crc kubenswrapper[4732]: I0131 09:01:33.938444 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1add5e75-7e0e-4248-9431-256dc477beeb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7bd726ccae341dfff351d4170276b74d4b6b4f34806b7e1fb23ab517201928d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd0e75b86e540472dde92c0b2c29d3af05e052a4e6a01fd03cd16cce663bd5da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"na
me\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6678cf179a4e13991aa6a082456b3497a4665e0999912b8002fb84a0b6ca2ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55d09d44253b8c77894abc55d80843b79509bf7213c0bf8b4d69059d53e76e5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:33Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:33 crc kubenswrapper[4732]: I0131 09:01:33.955975 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:33Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:33 crc kubenswrapper[4732]: I0131 09:01:33.972248 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nsgpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"533741c8-f72a-4834-ad02-d33fc939e529\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec4f6f1a4ba8fc3975dc3ef7e86d5b17b0488de7918fe672040f115f640b8569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjff8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nsgpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:33Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:33 crc kubenswrapper[4732]: I0131 09:01:33.984442 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bllbs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80d87332-eaea-4007-a03e-a9a0f744563a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a371d8cbedbc18b4953520c3b135e102db92fc57883d31a8f78c510498ca41b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcdpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bllbs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:33Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:34 crc kubenswrapper[4732]: I0131 09:01:34.004841 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging 
kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-li
b\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\
\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb08df78497d5ae0eb39af9a06378ea80914d9b4d7014dd85d930f54c733193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfb08df78497d5ae0eb39af9a06378ea80914d9b4d7014dd85d930f54c733193\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":
\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8mtkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:34Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:34 crc kubenswrapper[4732]: I0131 09:01:34.017209 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:34 crc kubenswrapper[4732]: I0131 09:01:34.017249 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:34 crc kubenswrapper[4732]: I0131 09:01:34.017257 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:34 crc kubenswrapper[4732]: I0131 09:01:34.017273 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:34 crc kubenswrapper[4732]: I0131 09:01:34.017282 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:34Z","lastTransitionTime":"2026-01-31T09:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:34 crc kubenswrapper[4732]: I0131 09:01:34.020012 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:34Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:34 crc kubenswrapper[4732]: I0131 09:01:34.119494 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:34 crc kubenswrapper[4732]: I0131 09:01:34.119532 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:34 crc kubenswrapper[4732]: I0131 09:01:34.119543 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:34 crc kubenswrapper[4732]: I0131 09:01:34.119562 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:34 crc kubenswrapper[4732]: I0131 09:01:34.119573 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:34Z","lastTransitionTime":"2026-01-31T09:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:34 crc kubenswrapper[4732]: I0131 09:01:34.222634 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:34 crc kubenswrapper[4732]: I0131 09:01:34.222690 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:34 crc kubenswrapper[4732]: I0131 09:01:34.222701 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:34 crc kubenswrapper[4732]: I0131 09:01:34.222719 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:34 crc kubenswrapper[4732]: I0131 09:01:34.222732 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:34Z","lastTransitionTime":"2026-01-31T09:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:34 crc kubenswrapper[4732]: I0131 09:01:34.325710 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:34 crc kubenswrapper[4732]: I0131 09:01:34.325733 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:34 crc kubenswrapper[4732]: I0131 09:01:34.325742 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:34 crc kubenswrapper[4732]: I0131 09:01:34.325755 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:34 crc kubenswrapper[4732]: I0131 09:01:34.325764 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:34Z","lastTransitionTime":"2026-01-31T09:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:34 crc kubenswrapper[4732]: I0131 09:01:34.428638 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:34 crc kubenswrapper[4732]: I0131 09:01:34.428697 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:34 crc kubenswrapper[4732]: I0131 09:01:34.428709 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:34 crc kubenswrapper[4732]: I0131 09:01:34.428732 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:34 crc kubenswrapper[4732]: I0131 09:01:34.428745 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:34Z","lastTransitionTime":"2026-01-31T09:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:34 crc kubenswrapper[4732]: I0131 09:01:34.501877 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 14:23:24.901151788 +0000 UTC Jan 31 09:01:34 crc kubenswrapper[4732]: I0131 09:01:34.530944 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:34 crc kubenswrapper[4732]: I0131 09:01:34.531023 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:34 crc kubenswrapper[4732]: I0131 09:01:34.531061 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:34 crc kubenswrapper[4732]: I0131 09:01:34.531085 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:34 crc kubenswrapper[4732]: I0131 09:01:34.531107 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:34Z","lastTransitionTime":"2026-01-31T09:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:34 crc kubenswrapper[4732]: I0131 09:01:34.634300 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:34 crc kubenswrapper[4732]: I0131 09:01:34.634340 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:34 crc kubenswrapper[4732]: I0131 09:01:34.634349 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:34 crc kubenswrapper[4732]: I0131 09:01:34.634365 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:34 crc kubenswrapper[4732]: I0131 09:01:34.634376 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:34Z","lastTransitionTime":"2026-01-31T09:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:34 crc kubenswrapper[4732]: I0131 09:01:34.737569 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:34 crc kubenswrapper[4732]: I0131 09:01:34.737614 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:34 crc kubenswrapper[4732]: I0131 09:01:34.737625 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:34 crc kubenswrapper[4732]: I0131 09:01:34.737645 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:34 crc kubenswrapper[4732]: I0131 09:01:34.737657 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:34Z","lastTransitionTime":"2026-01-31T09:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:34 crc kubenswrapper[4732]: I0131 09:01:34.780865 4732 generic.go:334] "Generic (PLEG): container finished" podID="e2e6e0f4-2302-447f-a5e0-7db3d7b73cb6" containerID="4a6ff3cd5564994b28608dad1cd0848c771db950208888ac8a2213fa5dc3a168" exitCode=0 Jan 31 09:01:34 crc kubenswrapper[4732]: I0131 09:01:34.780922 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-t9kqf" event={"ID":"e2e6e0f4-2302-447f-a5e0-7db3d7b73cb6","Type":"ContainerDied","Data":"4a6ff3cd5564994b28608dad1cd0848c771db950208888ac8a2213fa5dc3a168"} Jan 31 09:01:34 crc kubenswrapper[4732]: I0131 09:01:34.797175 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:34Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:34 crc kubenswrapper[4732]: I0131 09:01:34.847582 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:34Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:34 crc kubenswrapper[4732]: I0131 09:01:34.847924 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:34 crc kubenswrapper[4732]: I0131 09:01:34.847979 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:34 crc kubenswrapper[4732]: I0131 09:01:34.848000 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:34 crc kubenswrapper[4732]: I0131 09:01:34.848019 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:34 crc kubenswrapper[4732]: I0131 09:01:34.848038 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:34Z","lastTransitionTime":"2026-01-31T09:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:34 crc kubenswrapper[4732]: I0131 09:01:34.857344 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nsgpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"533741c8-f72a-4834-ad02-d33fc939e529\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec4f6f1a4ba8fc3975dc3ef7e86d5b17b0488de7918fe672040f115f640b8569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjff8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nsgpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:34Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:34 crc kubenswrapper[4732]: I0131 09:01:34.867372 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bllbs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80d87332-eaea-4007-a03e-a9a0f744563a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a371d8cbedbc18b4953520c3b135e102db92fc57883d31a8f78c510498ca41b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcdpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bllbs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:34Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:34 crc kubenswrapper[4732]: I0131 09:01:34.886199 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging 
kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-li
b\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\
\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb08df78497d5ae0eb39af9a06378ea80914d9b4d7014dd85d930f54c733193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfb08df78497d5ae0eb39af9a06378ea80914d9b4d7014dd85d930f54c733193\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":
\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8mtkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:34Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:34 crc kubenswrapper[4732]: I0131 09:01:34.901032 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1dc686f-fa2d-4cd4-95d4-874df2dc3a1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b5cbd45e357205f8bfc648396c0f684d6788343f687504feb45618d9082e5cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b8ba07937855cb1fd0e1db567f90e08848fca583b7174153918e2d6356d722\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ce42a35726dd1fc6aea0eb2e5baa490de04566cd6d6298185260e896a8def7\
\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c05af9bd69451273c25ddddf6b11cdd7b8f9b6e3e16d18f1c4a962321905696e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c05af9bd69451273c25ddddf6b11cdd7b8f9b6e3e16d18f1c4a962321905696e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0131 09:01:18.226954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:01:18.227544 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1858903392/tls.crt::/tmp/serving-cert-1858903392/tls.key\\\\\\\"\\\\nI0131 09:01:23.404020 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:01:23.413560 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:01:23.413612 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:01:23.413655 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:01:23.413760 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:01:23.423398 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 09:01:23.423434 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:01:23.423441 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:01:23.423448 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:01:23.423452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0131 09:01:23.423448 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 09:01:23.423457 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:01:23.423476 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 09:01:23.426339 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83c2048db4772c1a5f60b21340f773ab5c651045d4dac9edc15e91a19d15dce9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:34Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:34 crc kubenswrapper[4732]: I0131 09:01:34.915342 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:34Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:34 crc kubenswrapper[4732]: I0131 09:01:34.927534 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4mxsr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e23192f-14db-41ef-af89-4a76e325d9c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6af50b44afec0e5f0757efda1ff5567845c4482609a82bafcbf74947c657c56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsnx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4mxsr\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:34Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:34 crc kubenswrapper[4732]: I0131 09:01:34.940532 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t9kqf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2e6e0f4-2302-447f-a5e0-7db3d7b73cb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c71dc89758bff193a85ceeaa97b21ba6c93389ef9cf380bd0bac89bddddf014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c71dc89758bff193a85ceeaa97b21ba6c93389ef9cf380bd0bac89bddddf014\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\
"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4374e8014d4e67ab8df14785ed6d1bd78b33567550758297fcc04796ea5c688\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4374e8014d4e67ab8df14785ed6d1bd78b33567550758297fcc04796ea5c688\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85484d53f8180c3bc83391004e8a7c6b432af09f04f597567e8ffe0b9c2e3d58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85484d53f8180c3bc83391004e8a7c6b432af09f04f597567e8ffe0b9c2e3d58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://111b50c40cf72f105ae5c74d668673853a783050a62e89127fa666a6f8d431d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://111b50c40cf72f105ae5c74d668673853a783050a62e
89127fa666a6f8d431d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2458247a5e615b929a15e280bb9d5a65c46a991ea2cd83502a8c42b1ca0e299b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2458247a5e615b929a15e280bb9d5a65c46a991ea2cd83502a8c42b1ca0e299b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a6ff3cd5564994b28608dad1cd0848c771db950208888ac8a2213fa5dc3a168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a6ff3cd5564994b28608dad1cd0848c771db950208888ac8a2213fa5dc3a168\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t9kqf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:34Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:34 crc kubenswrapper[4732]: I0131 09:01:34.950899 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:34 crc kubenswrapper[4732]: I0131 09:01:34.950933 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:34 crc kubenswrapper[4732]: I0131 09:01:34.951220 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:34 crc kubenswrapper[4732]: I0131 09:01:34.951246 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:34 crc kubenswrapper[4732]: I0131 09:01:34.951259 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:34Z","lastTransitionTime":"2026-01-31T09:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:34 crc kubenswrapper[4732]: I0131 09:01:34.952022 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c063f211d6d06c3f8b94d316d68da346d077c6e85197754ec7c77ad736fd1217\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-01-31T09:01:34Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:34 crc kubenswrapper[4732]: I0131 09:01:34.964973 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d790207-d357-4b47-87bf-5b505e061820\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eaa2b856e1dedd7398d0c513f742fa5925c4748bf44eb15022adbb4b62354c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2bw2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad1f906b34fddd34efd9d099479cd1fe7404da4ab11f3f571f9e3120167505a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2bw2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jnbt8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:34Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:34 crc kubenswrapper[4732]: I0131 09:01:34.979270 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1add5e75-7e0e-4248-9431-256dc477beeb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7bd726ccae341dfff351d4170276b74d4b6b4f34806b7e1fb23ab517201928d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd0e75b86e540472dde92c0b2c29d3af05e052a4e6a01fd03cd16cce663bd5da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6678cf179a4e13991aa6a082456b3497a4665e0999912b8002fb84a0b6ca2ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026
-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55d09d44253b8c77894abc55d80843b79509bf7213c0bf8b4d69059d53e76e5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:34Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:34 crc kubenswrapper[4732]: I0131 09:01:34.996269 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58b600ed-9e96-4296-bc41-dda5d2205921\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f14a98662959fe8132486d2f648fb443a407e90160ee8c6eac7e2a3d6835b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://771af18934041722ccdbde47137c83e9d93c5e7a07b1e9b26e18354bcd2fda4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2bea5befd94ee4d3b6501273e8b8365102a35cc07b97740fc893f8d70a9d134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9fd5264b5a2dcec6f8b363208d5654f2d71996
60f04be408ee7ab3913d542e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://194a43d5e1409abb031f1b950dfc753454b45008067d784642945ea0dfb087e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://382f1fa043705a0098e27ea34beeee0b88276aa355a7de07bc29b070949b81e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382f1fa043705a0098e27ea34beeee0b88276aa355a7de07bc29b070949b81e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b16d2732d6d6381045b88446a322622a7b1271f08438cb60e57c05c52ef7faf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b16d2732d6d6381045b88446a322622a7b1271f08438cb60e57c05c52ef7faf8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://273f5652d37e2be568bc388a7f9e96c66d0ce9201678b46736986815bd25d6bf\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://273f5652d37e2be568bc388a7f9e96c66d0ce9201678b46736986815bd25d6bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:34Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:35 crc kubenswrapper[4732]: I0131 09:01:35.010831 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62650e77e5971d617a77ae7461d51276fc76e3699dc4f86ebbb9b63ab0d86b60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:35Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:35 crc kubenswrapper[4732]: I0131 09:01:35.024031 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7423f7d2a82ae276cff0f5a2730b0de7abbc5b28a8bd8983d23fe44c2e9ca3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b075971e7f28f40b90caae8b3681dd283860371a6b3f82c92c421a87a27ccb0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-01-31T09:01:35Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:35 crc kubenswrapper[4732]: I0131 09:01:35.053792 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:35 crc kubenswrapper[4732]: I0131 09:01:35.053837 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:35 crc kubenswrapper[4732]: I0131 09:01:35.053850 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:35 crc kubenswrapper[4732]: I0131 09:01:35.053877 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:35 crc kubenswrapper[4732]: I0131 09:01:35.053905 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:35Z","lastTransitionTime":"2026-01-31T09:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:35 crc kubenswrapper[4732]: I0131 09:01:35.157166 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:35 crc kubenswrapper[4732]: I0131 09:01:35.157228 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:35 crc kubenswrapper[4732]: I0131 09:01:35.157246 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:35 crc kubenswrapper[4732]: I0131 09:01:35.157275 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:35 crc kubenswrapper[4732]: I0131 09:01:35.157295 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:35Z","lastTransitionTime":"2026-01-31T09:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:35 crc kubenswrapper[4732]: I0131 09:01:35.260847 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:35 crc kubenswrapper[4732]: I0131 09:01:35.260908 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:35 crc kubenswrapper[4732]: I0131 09:01:35.260926 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:35 crc kubenswrapper[4732]: I0131 09:01:35.260960 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:35 crc kubenswrapper[4732]: I0131 09:01:35.260977 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:35Z","lastTransitionTime":"2026-01-31T09:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:35 crc kubenswrapper[4732]: I0131 09:01:35.363294 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:35 crc kubenswrapper[4732]: I0131 09:01:35.363355 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:35 crc kubenswrapper[4732]: I0131 09:01:35.363380 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:35 crc kubenswrapper[4732]: I0131 09:01:35.363404 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:35 crc kubenswrapper[4732]: I0131 09:01:35.363420 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:35Z","lastTransitionTime":"2026-01-31T09:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:35 crc kubenswrapper[4732]: I0131 09:01:35.466247 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:35 crc kubenswrapper[4732]: I0131 09:01:35.466293 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:35 crc kubenswrapper[4732]: I0131 09:01:35.466302 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:35 crc kubenswrapper[4732]: I0131 09:01:35.466318 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:35 crc kubenswrapper[4732]: I0131 09:01:35.466327 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:35Z","lastTransitionTime":"2026-01-31T09:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:35 crc kubenswrapper[4732]: I0131 09:01:35.502536 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 09:48:32.899578693 +0000 UTC Jan 31 09:01:35 crc kubenswrapper[4732]: I0131 09:01:35.542091 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:01:35 crc kubenswrapper[4732]: I0131 09:01:35.542172 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:01:35 crc kubenswrapper[4732]: I0131 09:01:35.542274 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:01:35 crc kubenswrapper[4732]: E0131 09:01:35.542304 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:01:35 crc kubenswrapper[4732]: E0131 09:01:35.542471 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:01:35 crc kubenswrapper[4732]: E0131 09:01:35.542737 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:01:35 crc kubenswrapper[4732]: I0131 09:01:35.572128 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:35 crc kubenswrapper[4732]: I0131 09:01:35.572190 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:35 crc kubenswrapper[4732]: I0131 09:01:35.572208 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:35 crc kubenswrapper[4732]: I0131 09:01:35.572236 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:35 crc kubenswrapper[4732]: I0131 09:01:35.572254 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:35Z","lastTransitionTime":"2026-01-31T09:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:35 crc kubenswrapper[4732]: I0131 09:01:35.674794 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:35 crc kubenswrapper[4732]: I0131 09:01:35.674858 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:35 crc kubenswrapper[4732]: I0131 09:01:35.674876 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:35 crc kubenswrapper[4732]: I0131 09:01:35.674902 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:35 crc kubenswrapper[4732]: I0131 09:01:35.674920 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:35Z","lastTransitionTime":"2026-01-31T09:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:35 crc kubenswrapper[4732]: I0131 09:01:35.778166 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:35 crc kubenswrapper[4732]: I0131 09:01:35.778222 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:35 crc kubenswrapper[4732]: I0131 09:01:35.778233 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:35 crc kubenswrapper[4732]: I0131 09:01:35.778248 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:35 crc kubenswrapper[4732]: I0131 09:01:35.778257 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:35Z","lastTransitionTime":"2026-01-31T09:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:35 crc kubenswrapper[4732]: I0131 09:01:35.794346 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-t9kqf" event={"ID":"e2e6e0f4-2302-447f-a5e0-7db3d7b73cb6","Type":"ContainerStarted","Data":"038ff30d8a53b94c204f5dc0c72824fcb7f08a423eec2ce05554f23607d4dc7f"} Jan 31 09:01:35 crc kubenswrapper[4732]: I0131 09:01:35.813226 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c063f211d6d06c3f8b94d316d68da346d077c6e85197754ec7c77ad736fd1217\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:35Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:35 crc kubenswrapper[4732]: I0131 09:01:35.826179 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d790207-d357-4b47-87bf-5b505e061820\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eaa2b856e1dedd7398d0c513f742fa5925c4748bf44eb15022adbb4b62354c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2bw2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad1f906b34fddd34efd9d099479cd1fe7404da4ab11f3f571f9e3120167505a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2bw2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jnbt8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:35Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:35 crc kubenswrapper[4732]: I0131 09:01:35.843465 4732 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1add5e75-7e0e-4248-9431-256dc477beeb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7bd726ccae341dfff351d4170276b74d4b6b4f34806b7e1fb23ab517201928d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd0e75b86e540472dde92c0b2c29d3af05e052a4e6a01fd03cd16cce663bd5da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6678cf179a4e13991aa6a082456b3497a4665e0999912b8002fb84a0b6ca2ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55d09d44253b8c77894abc55d80
843b79509bf7213c0bf8b4d69059d53e76e5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:35Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:35 crc kubenswrapper[4732]: I0131 09:01:35.866822 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58b600ed-9e96-4296-bc41-dda5d2205921\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f14a98662959fe8132486d2f648fb443a407e90160ee8c6eac7e2a3d6835b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://771af18934041722ccdbde47
137c83e9d93c5e7a07b1e9b26e18354bcd2fda4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2bea5befd94ee4d3b6501273e8b8365102a35cc07b97740fc893f8d70a9d134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9fd5264b5a2dcec6f8b363208d5654f2d7199660f04be408ee7ab3913d542e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://194a43d5e1409abb031f1b950dfc753454b45008067d784642945ea0dfb087e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://382f1fa043705a0098e27ea34beeee0b88276aa355a7de07bc29b070949b81e2\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382f1fa043705a0098e27ea34beeee0b88276aa355a7de07bc29b070949b81e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b16d2732d6d6381045b88446a322622a7b1271f08438cb60e57c05c52ef7faf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b16d2732d6d6381045b88446a322622a7b1271f08438cb60e57c05c52ef7faf8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://273f5652d37e2be568bc388a7f9e96c66d0ce9201678b46736986815bd25d6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://273f5652d37e2be568bc388a7f9e96c66d0ce9201678b46736986815bd25d6bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:35Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:35 crc kubenswrapper[4732]: I0131 09:01:35.880761 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:35 crc kubenswrapper[4732]: I0131 09:01:35.880802 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:35 crc 
kubenswrapper[4732]: I0131 09:01:35.880819 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:35 crc kubenswrapper[4732]: I0131 09:01:35.880838 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:35 crc kubenswrapper[4732]: I0131 09:01:35.880850 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:35Z","lastTransitionTime":"2026-01-31T09:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:35 crc kubenswrapper[4732]: I0131 09:01:35.882560 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62650e77e5971d617a77ae7461d51276fc76e3699dc4f86ebbb9b63ab0d86b60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:35Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:35 crc kubenswrapper[4732]: I0131 09:01:35.897085 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7423f7d2a82ae276cff0f5a2730b0de7abbc5b28a8bd8983d23fe44c2e9ca3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b075971e7f28f40b90caae8b3681dd283860371a6b3f82c92c421a87a27ccb0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:35Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:35 crc kubenswrapper[4732]: I0131 09:01:35.912975 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:35Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:35 crc kubenswrapper[4732]: I0131 09:01:35.929535 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:35Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:35 crc kubenswrapper[4732]: I0131 09:01:35.942939 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nsgpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"533741c8-f72a-4834-ad02-d33fc939e529\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec4f6f1a4ba8fc3975dc3ef7e86d5b17b0488de7918fe672040f115f640b8569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjff8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.
11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nsgpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:35Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:35 crc kubenswrapper[4732]: I0131 09:01:35.954212 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bllbs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"80d87332-eaea-4007-a03e-a9a0f744563a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a371d8cbedbc18b4953520c3b135e102db92fc57883d31a8f78c510498ca41b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcdpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bllbs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:35Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:35 crc kubenswrapper[4732]: I0131 09:01:35.973267 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb08df78497d5ae0eb39af9a06378ea80914d9b4d7014dd85d930f54c733193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfb08df78497d5ae0eb39af9a06378ea80914d9b4d7014dd85d930f54c733193\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8mtkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:35Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:35 crc kubenswrapper[4732]: I0131 09:01:35.984078 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:35 crc kubenswrapper[4732]: I0131 09:01:35.984119 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:35 crc kubenswrapper[4732]: I0131 09:01:35.984127 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:35 crc kubenswrapper[4732]: I0131 09:01:35.984143 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:35 crc kubenswrapper[4732]: I0131 09:01:35.984153 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:35Z","lastTransitionTime":"2026-01-31T09:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:35 crc kubenswrapper[4732]: I0131 09:01:35.989344 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1dc686f-fa2d-4cd4-95d4-874df2dc3a1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b5cbd45e357205f8bfc648396c0f684d6788343f687504feb45618d9082e5cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b8ba07937855cb1fd0e1db567f90e08848fca583b7174153918e2d6356d722\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ce42a35726dd1fc6aea0eb2e5baa490de04566cd6d6298185260e896a8def7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c05af9bd69451273c25ddddf6b11cdd7b8f9b6e3e16d18f1c4a962321905696e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c05af9bd69451273c25ddddf6b11cdd7b8f9b6e3e16d18f1c4a962321905696e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0131 09:01:18.226954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:01:18.227544 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1858903392/tls.crt::/tmp/serving-cert-1858903392/tls.key\\\\\\\"\\\\nI0131 09:01:23.404020 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:01:23.413560 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:01:23.413612 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:01:23.413655 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:01:23.413760 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:01:23.423398 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 09:01:23.423434 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:01:23.423441 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:01:23.423448 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:01:23.423452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0131 09:01:23.423448 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 09:01:23.423457 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:01:23.423476 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 09:01:23.426339 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83c2048db4772c1a5f60b21340f773ab5c651045d4dac9edc15e91a19d15dce9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:35Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:36 crc kubenswrapper[4732]: I0131 09:01:36.001387 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:35Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:36 crc kubenswrapper[4732]: I0131 09:01:36.011989 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4mxsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e23192f-14db-41ef-af89-4a76e325d9c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6af50b44afec0e5f0757efda1ff5567845c4482609a82bafcbf74947c657c56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"syste
m-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsnx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4mxsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:36Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:36 crc kubenswrapper[4732]: I0131 09:01:36.026963 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t9kqf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2e6e0f4-2302-447f-a5e0-7db3d7b73cb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://038ff30d8a53b94c204f5dc0c72824fcb7f08a423eec2ce05554f23607d4dc7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c71dc89758bff193a85ceeaa97b21ba6c93389ef9cf380bd0bac89bddddf014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c71dc89758bff193a85ceeaa97b21ba6c93389ef9cf380bd0bac89bddddf014\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4374e8014d4e67ab8df14785ed6d1bd78b33567550758297fcc04796ea5c688\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4374e8014d4e67ab8df14785ed6d1bd78b33567550758297fcc04796ea5c688\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85484d53f8180c3bc83391004e8a7c6b432af09f04f597567e8ffe0b9c2e3d58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85484d53f8180c3bc83391004e8a7c6b432af09f04f597567e8ffe0b9c2e3d58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://111b50c40cf72f105ae5c74d668673853a783050a62e89127fa666a6f8d431d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://111b50c40cf72f105ae5c74d668673853a783050a62e89127fa666a6f8d431d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2458247a5e615b929a15e280bb9d5a65c46a991ea2cd83502a8c42b1ca0e299b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2458247a5e615b929a15e280bb9d5a65c46a991ea2cd83502a8c42b1ca0e299b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a6ff3cd5564994b28608dad1cd0848c771db950208888ac8a2213fa5dc3a168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a6ff3cd5564994b28608dad1cd0848c771db950208888ac8a2213fa5dc3a168\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t9kqf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:36Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:36 crc kubenswrapper[4732]: I0131 09:01:36.087582 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:36 crc kubenswrapper[4732]: I0131 09:01:36.087632 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:36 crc 
kubenswrapper[4732]: I0131 09:01:36.087642 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 09:01:36 crc kubenswrapper[4732]: I0131 09:01:36.087663 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 09:01:36 crc kubenswrapper[4732]: I0131 09:01:36.087697 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:36Z","lastTransitionTime":"2026-01-31T09:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 31 09:01:36 crc kubenswrapper[4732]: I0131 09:01:36.190532 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 09:01:36 crc kubenswrapper[4732]: I0131 09:01:36.190589 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 09:01:36 crc kubenswrapper[4732]: I0131 09:01:36.190609 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 09:01:36 crc kubenswrapper[4732]: I0131 09:01:36.190634 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 09:01:36 crc kubenswrapper[4732]: I0131 09:01:36.190650 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:36Z","lastTransitionTime":"2026-01-31T09:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 31 09:01:36 crc kubenswrapper[4732]: I0131 09:01:36.293573 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 09:01:36 crc kubenswrapper[4732]: I0131 09:01:36.293620 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 09:01:36 crc kubenswrapper[4732]: I0131 09:01:36.293631 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 09:01:36 crc kubenswrapper[4732]: I0131 09:01:36.293652 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 09:01:36 crc kubenswrapper[4732]: I0131 09:01:36.293683 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:36Z","lastTransitionTime":"2026-01-31T09:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 31 09:01:36 crc kubenswrapper[4732]: I0131 09:01:36.396659 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 09:01:36 crc kubenswrapper[4732]: I0131 09:01:36.396774 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 09:01:36 crc kubenswrapper[4732]: I0131 09:01:36.396793 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 09:01:36 crc kubenswrapper[4732]: I0131 09:01:36.396819 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 09:01:36 crc kubenswrapper[4732]: I0131 09:01:36.396839 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:36Z","lastTransitionTime":"2026-01-31T09:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 31 09:01:36 crc kubenswrapper[4732]: I0131 09:01:36.500598 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 09:01:36 crc kubenswrapper[4732]: I0131 09:01:36.500633 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 09:01:36 crc kubenswrapper[4732]: I0131 09:01:36.500641 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 09:01:36 crc kubenswrapper[4732]: I0131 09:01:36.500657 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 09:01:36 crc kubenswrapper[4732]: I0131 09:01:36.500669 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:36Z","lastTransitionTime":"2026-01-31T09:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 31 09:01:36 crc kubenswrapper[4732]: I0131 09:01:36.503007 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 11:58:03.978940705 +0000 UTC
Jan 31 09:01:36 crc kubenswrapper[4732]: I0131 09:01:36.603117 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 09:01:36 crc kubenswrapper[4732]: I0131 09:01:36.603197 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 09:01:36 crc kubenswrapper[4732]: I0131 09:01:36.603211 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 09:01:36 crc kubenswrapper[4732]: I0131 09:01:36.603227 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 09:01:36 crc kubenswrapper[4732]: I0131 09:01:36.603237 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:36Z","lastTransitionTime":"2026-01-31T09:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 31 09:01:36 crc kubenswrapper[4732]: I0131 09:01:36.706639 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 09:01:36 crc kubenswrapper[4732]: I0131 09:01:36.706701 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 09:01:36 crc kubenswrapper[4732]: I0131 09:01:36.706715 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 09:01:36 crc kubenswrapper[4732]: I0131 09:01:36.706736 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 09:01:36 crc kubenswrapper[4732]: I0131 09:01:36.706747 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:36Z","lastTransitionTime":"2026-01-31T09:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:36 crc kubenswrapper[4732]: I0131 09:01:36.805324 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" event={"ID":"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8","Type":"ContainerStarted","Data":"4b67c56c55214b7fc95fc05a876435bf7a9d6e5de9dfc0a3acc0110e6779b818"} Jan 31 09:01:36 crc kubenswrapper[4732]: I0131 09:01:36.805907 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" Jan 31 09:01:36 crc kubenswrapper[4732]: I0131 09:01:36.806133 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" Jan 31 09:01:36 crc kubenswrapper[4732]: I0131 09:01:36.809798 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:36 crc kubenswrapper[4732]: I0131 09:01:36.809849 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:36 crc kubenswrapper[4732]: I0131 09:01:36.809865 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:36 crc kubenswrapper[4732]: I0131 09:01:36.809889 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:36 crc kubenswrapper[4732]: I0131 09:01:36.809908 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:36Z","lastTransitionTime":"2026-01-31T09:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:36 crc kubenswrapper[4732]: I0131 09:01:36.818827 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:36Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:36 crc kubenswrapper[4732]: I0131 09:01:36.829742 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:36Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:36 crc kubenswrapper[4732]: I0131 09:01:36.833689 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" Jan 31 09:01:36 crc kubenswrapper[4732]: I0131 09:01:36.838234 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" Jan 31 09:01:36 crc kubenswrapper[4732]: I0131 09:01:36.839525 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nsgpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"533741c8-f72a-4834-ad02-d33fc939e529\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec4f6f1a4ba8fc3975dc3ef7e86d5b17b0488de7918fe672040f115f640b8569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjff8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\
"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nsgpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:36Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:36 crc kubenswrapper[4732]: I0131 09:01:36.849038 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bllbs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"80d87332-eaea-4007-a03e-a9a0f744563a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a371d8cbedbc18b4953520c3b135e102db92fc57883d31a8f78c510498ca41b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcdpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bllbs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:36Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:36 crc kubenswrapper[4732]: I0131 09:01:36.865493 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8521ee1e84f540faa7b093ecd0bb7761ae38487a6c7abf52028fe820ea9a4c1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fde05e46badc11b2174b88e2590402436bb669bcbd017ceb4d11d25066b421b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b26417bd7817c984a59723e2231e3a43c554022aef67eacf300adee0166c3db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\
\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c79f404e5bc3d6dc83681d68605b1f26565813ccec34c6d36ee9f21686ca4f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://075418297d972e23003b3ce2016f7709d138316c9d08837dc484dadf4d7f4a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://792e702a9fa66ea68366e8f88d976f7e280223815c4b70258566c32faaed57a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b67c56c55214b7fc95fc05a876435bf7a9d6e5de9dfc0a3acc0110e6779b818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recurs
iveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cd519944f0ae0ee9920a6a54de3dd40c73a65d0c907f625c57d7648a6d0a715\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb08df78497d5ae0eb39af9a06378ea80914d9b4d7014dd85d930f54c733193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfb08df78497d5ae0eb39af9a06378ea80914d9b4d7014dd85d930f54c733193\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8mtkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:36Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:36 crc kubenswrapper[4732]: I0131 09:01:36.880780 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1dc686f-fa2d-4cd4-95d4-874df2dc3a1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b5cbd45e357205f8bfc648396c0f684d6788343f687504feb45618d9082e5cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b8ba07937855cb1fd0e1db567f90e08848fca583b7174153918e2d6356d722\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ce42a35726dd1fc6aea0eb2e5baa490de04566cd6d6298185260e896a8def7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c05af9bd69451273c25ddddf6b11cdd7b8f9b6e3e16d18f1c4a962321905696e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c05af9bd69451273c25ddddf6b11cdd7b8f9b6e3e16d18f1c4a962321905696e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0131 09:01:18.226954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:01:18.227544 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1858903392/tls.crt::/tmp/serving-cert-1858903392/tls.key\\\\\\\"\\\\nI0131 09:01:23.404020 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:01:23.413560 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:01:23.413612 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:01:23.413655 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:01:23.413760 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:01:23.423398 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 09:01:23.423434 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:01:23.423441 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:01:23.423448 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:01:23.423452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0131 09:01:23.423448 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 09:01:23.423457 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:01:23.423476 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 09:01:23.426339 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83c2048db4772c1a5f60b21340f773ab5c651045d4dac9edc15e91a19d15dce9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:36Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:36 crc kubenswrapper[4732]: I0131 09:01:36.894119 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:36Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:36 crc kubenswrapper[4732]: I0131 09:01:36.906646 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4mxsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e23192f-14db-41ef-af89-4a76e325d9c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6af50b44afec0e5f0757efda1ff5567845c4482609a82bafcbf74947c657c56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"syste
m-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsnx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4mxsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:36Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:36 crc kubenswrapper[4732]: I0131 09:01:36.912387 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:36 crc kubenswrapper[4732]: I0131 09:01:36.912451 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:36 crc kubenswrapper[4732]: I0131 09:01:36.912463 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:36 crc kubenswrapper[4732]: I0131 09:01:36.912484 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:36 crc kubenswrapper[4732]: I0131 09:01:36.912501 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:36Z","lastTransitionTime":"2026-01-31T09:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:36 crc kubenswrapper[4732]: I0131 09:01:36.924205 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t9kqf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2e6e0f4-2302-447f-a5e0-7db3d7b73cb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://038ff30d8a53b94c204f5dc0c72824fcb7f08a423eec2ce05554f23607d4dc7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c71dc89758bff193a85ceeaa97b21ba6c93389ef9cf380bd0bac89bddddf014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c71dc89758bff193a85ceeaa97b21ba6c93389ef9cf380bd0bac89bddddf014\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4374e8014d4e67ab8df14785ed6d1bd78b33567550758297fcc04796ea5c688\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4374e8014d4e67ab8df14785ed6d1bd78b33567550758297fcc04796ea5c688\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85484d53f8180c3bc83391004e8a7c6b432af09f04f597567e8ffe0b9c2e3d58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85484d53f8180c3bc83391004e8a7c6b432af09f04f597567e8ffe0b9c2e3d58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://111b50c40cf72f105ae5c74d668673853a783050a62e89127fa666a6f8d431d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://111b50c40cf72f105ae5c74d668673853a783050a62e89127fa666a6f8d431d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2458247a5e615b929a15e280bb9d5a65c46a991ea2cd83502a8c42b1ca0e299b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2458247a5e615b929a15e280bb9d5a65c46a991ea2cd83502a8c42b1ca0e299b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a6ff3cd5564994b28608dad1cd0848c771db950208888ac8a2213fa5dc3a168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a6ff3cd5564994b28608dad1cd0848c771db950208888ac8a2213fa5dc3a168\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t9kqf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:36Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:36 crc kubenswrapper[4732]: I0131 09:01:36.935528 4732 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c063f211d6d06c3f8b94d316d68da346d077c6e85197754ec7c77ad736fd1217\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:36Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:36 crc kubenswrapper[4732]: I0131 09:01:36.948124 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d790207-d357-4b47-87bf-5b505e061820\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eaa2b856e1dedd7398d0c513f742fa5925c4748bf44eb15022adbb4b62354c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2bw2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad1f906b34fddd34efd9d099479cd1fe7404da4ab11f3f571f9e3120167505a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2bw2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jnbt8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:36Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:36 crc kubenswrapper[4732]: I0131 09:01:36.960665 4732 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7423f7d2a82ae276cff0f5a2730b0de7abbc5b28a8bd8983d23fe44c2e9ca3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b075971e7f28f40b90caae8b3681dd283860371a6b3f82c92c421a87a27ccb0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:36Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:36 crc kubenswrapper[4732]: I0131 09:01:36.976574 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1add5e75-7e0e-4248-9431-256dc477beeb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7bd726ccae341dfff351d4170276b74d4b6b4f34806b7e1fb23ab517201928d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd0e75b86e540472dde92c0b2c29d3af05e052a4e6a01fd03cd16cce663bd5da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6678cf179a4e13991aa6a082456b3497a4665e0999912b8002fb84a0b6ca2ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55d09d44253b8c77894abc55d80843b79509bf7213c0bf8b4d69059d53e76e5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:36Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:36 crc kubenswrapper[4732]: I0131 09:01:36.995075 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58b600ed-9e96-4296-bc41-dda5d2205921\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f14a98662959fe8132486d2f648fb443a407e90160ee8c6eac7e2a3d6835b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://771af18934041722ccdbde47137c83e9d93c5e7a07b1e9b26e18354bcd2fda4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07
b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2bea5befd94ee4d3b6501273e8b8365102a35cc07b97740fc893f8d70a9d134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9fd5264b5a2dcec6f8b363208d5654f2d7199660f04be408ee7ab3913d542e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://194a43d5e1409abb031f1b950dfc753454b45008067d784642945ea0dfb087e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://382f1fa043705a0098e27ea34beeee0b88276aa355a7de07bc29b070949b81e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\
\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382f1fa043705a0098e27ea34beeee0b88276aa355a7de07bc29b070949b81e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b16d2732d6d6381045b88446a322622a7b1271f08438cb60e57c05c52ef7faf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b16d2732d6d6381045b88446a322622a7b1271f08438cb60e57c05c52ef7faf8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://273f5652d37e2be568bc388a7f9e96c66d0ce9201678b46736986815bd25d6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://273f5652d37e2be568bc388a7f9e96c66d0ce9201678b46736986815bd25d6bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:36Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:37 crc kubenswrapper[4732]: I0131 09:01:37.006053 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62650e77e5971d617a77ae7461d51276fc76e3699dc4f86ebbb9b63ab0d86b60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:37Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:37 crc kubenswrapper[4732]: I0131 09:01:37.015081 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:37 crc kubenswrapper[4732]: I0131 09:01:37.015114 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:37 crc kubenswrapper[4732]: I0131 09:01:37.015125 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:37 crc kubenswrapper[4732]: I0131 09:01:37.015142 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:37 crc kubenswrapper[4732]: I0131 09:01:37.015155 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:37Z","lastTransitionTime":"2026-01-31T09:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:37 crc kubenswrapper[4732]: I0131 09:01:37.018901 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t9kqf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2e6e0f4-2302-447f-a5e0-7db3d7b73cb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://038ff30d8a53b94c204f5dc0c72824fcb7f08a423eec2ce05554f23607d4dc7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c71dc89758bff193a85ceeaa97b21ba6c93389ef9cf380bd0bac89bddddf014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c71dc89758bff193a85ceeaa97b21ba6c93389ef9cf380bd0bac89bddddf014\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4374e8014d4e67ab8df14785ed6d1bd78b33567550758297fcc04796ea5c688\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4374e8014d4e67ab8df14785ed6d1bd78b33567550758297fcc04796ea5c688\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85484d53f8180c3bc83391004e8a7c6b432af09f04f597567e8ffe0b9c2e3d58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85484d53f8180c3bc83391004e8a7c6b432af09f04f597567e8ffe0b9c2e3d58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://111b50c40cf72f105ae5c74d668673853a783050a62e89127fa666a6f8d431d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://111b50c40cf72f105ae5c74d668673853a783050a62e89127fa666a6f8d431d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2458247a5e615b929a15e280bb9d5a65c46a991ea2cd83502a8c42b1ca0e299b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2458247a5e615b929a15e280bb9d5a65c46a991ea2cd83502a8c42b1ca0e299b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a6ff3cd5564994b28608dad1cd0848c771db950208888ac8a2213fa5dc3a168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a6ff3cd5564994b28608dad1cd0848c771db950208888ac8a2213fa5dc3a168\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t9kqf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:37Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:37 crc kubenswrapper[4732]: I0131 09:01:37.030272 4732 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1dc686f-fa2d-4cd4-95d4-874df2dc3a1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b5cbd45e357205f8bfc648396c0f684d6788343f687504feb45618d9082e5cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b8ba07937855cb1fd0e1db567f90e08848fca583b7174153918e2d6356d722\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ce42a35726dd1fc6aea0eb2e5baa490de04566cd6d6298185260e896a8def7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc
/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c05af9bd69451273c25ddddf6b11cdd7b8f9b6e3e16d18f1c4a962321905696e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c05af9bd69451273c25ddddf6b11cdd7b8f9b6e3e16d18f1c4a962321905696e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0131 09:01:18.226954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:01:18.227544 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1858903392/tls.crt::/tmp/serving-cert-1858903392/tls.key\\\\\\\"\\\\nI0131 09:01:23.404020 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:01:23.413560 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:01:23.413612 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:01:23.413655 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:01:23.413760 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:01:23.423398 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 09:01:23.423434 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:01:23.423441 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:01:23.423448 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:01:23.423452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0131 09:01:23.423448 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 09:01:23.423457 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:01:23.423476 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 09:01:23.426339 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83c2048db4772c1a5f60b21340f773ab5c651045d4dac9edc15e91a19d15dce9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:37Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:37 crc kubenswrapper[4732]: I0131 09:01:37.041391 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:37Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:37 crc kubenswrapper[4732]: I0131 09:01:37.052060 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4mxsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e23192f-14db-41ef-af89-4a76e325d9c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6af50b44afec0e5f0757efda1ff5567845c4482609a82bafcbf74947c657c56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"syste
m-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsnx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4mxsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:37Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:37 crc kubenswrapper[4732]: I0131 09:01:37.061117 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c063f211d6d06c3f8b94d316d68da346d077c6e85197754ec7c77ad736fd1217\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:37Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:37 crc kubenswrapper[4732]: I0131 09:01:37.072144 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d790207-d357-4b47-87bf-5b505e061820\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eaa2b856e1dedd7398d0c513f742fa5925c4748bf44eb15022adbb4b62354c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2bw2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad1f906b34fddd34efd9d099479cd1fe7404da4ab11f3f571f9e3120167505a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2bw2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jnbt8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:37Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:37 crc kubenswrapper[4732]: I0131 09:01:37.090701 4732 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58b600ed-9e96-4296-bc41-dda5d2205921\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f14a98662959fe8132486d2f648fb443a407e90160ee8c6eac7e2a3d6835b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://771af18934041722ccdbde47137c83e9d93c5e7a07b1e9b26e18354bcd2fda4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2bea5befd94ee4d3b6501273e8b8365102a35cc07b97740fc893f8d70a9d134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-di
r\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9fd5264b5a2dcec6f8b363208d5654f2d7199660f04be408ee7ab3913d542e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://194a43d5e1409abb031f1b950dfc753454b45008067d784642945ea0dfb087e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://382f1fa043705a0098e27ea34beeee0b88276aa355a7de07bc29b070949b81e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382f1fa043705a0098e27ea34beeee0b88276aa355a7de07bc29b070949b81e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b16d2732d6d6381045b88446a322622a7b1271f08438cb60e57c05c52ef7faf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b16d2732d6d6381045b88446a322622a7b1271f08438cb60e57c05c52ef7faf8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}}},{\\\"containerID\\\"
:\\\"cri-o://273f5652d37e2be568bc388a7f9e96c66d0ce9201678b46736986815bd25d6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://273f5652d37e2be568bc388a7f9e96c66d0ce9201678b46736986815bd25d6bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:37Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:37 crc kubenswrapper[4732]: I0131 09:01:37.104965 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62650e77e5971d617a77ae7461d51276fc76e3699dc4f86ebbb9b63ab0d86b60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:37Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:37 crc kubenswrapper[4732]: I0131 09:01:37.116101 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7423f7d2a82ae276cff0f5a2730b0de7abbc5b28a8bd8983d23fe44c2e9ca3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b075971e7f28f40b90caae8b3681dd283860371a6b3f82c92c421a87a27ccb0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:37Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:37 crc kubenswrapper[4732]: I0131 09:01:37.117990 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:37 crc kubenswrapper[4732]: I0131 09:01:37.118027 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:37 crc kubenswrapper[4732]: I0131 09:01:37.118037 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:37 crc kubenswrapper[4732]: I0131 09:01:37.118056 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:37 crc kubenswrapper[4732]: I0131 09:01:37.118068 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:37Z","lastTransitionTime":"2026-01-31T09:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:37 crc kubenswrapper[4732]: I0131 09:01:37.126885 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1add5e75-7e0e-4248-9431-256dc477beeb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7bd726ccae341dfff351d4170276b74d4b6b4f34806b7e1fb23ab517201928d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd0e75b86e540472dde92c0b2c29d3af05e052a4e6a01fd03cd16cce663bd5da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35
825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6678cf179a4e13991aa6a082456b3497a4665e0999912b8002fb84a0b6ca2ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55d09d44253b8c77894abc55d80843b79509bf7213c0bf8b4d69059d53e76e5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:37Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:37 crc kubenswrapper[4732]: I0131 09:01:37.135289 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bllbs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80d87332-eaea-4007-a03e-a9a0f744563a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a371d8cbedbc18b4953520c3b135e102db92fc57883d31a8f78c510498ca41b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcdpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bllbs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:37Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:37 crc kubenswrapper[4732]: I0131 09:01:37.151466 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8521ee1e84f540faa7b093ecd0bb7761ae38487a6c7abf52028fe820ea9a4c1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fde05e46badc11b2174b88e2590402436bb669bcbd017ceb4d11d25066b421b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b26417bd7817c984a59723e2231e3a43c554022aef67eacf300adee0166c3db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c79f404e5bc3d6dc83681d68605b1f26565813ccec34c6d36ee9f21686ca4f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://075418297d972e23003b3ce2016f7709d138316c9d08837dc484dadf4d7f4a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://792e702a9fa66ea68366e8f88d976f7e280223815c4b70258566c32faaed57a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b67c56c55214b7fc95fc05a876435bf7a9d6e5de9dfc0a3acc0110e6779b818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cd519944f0ae0ee9920a6a54de3dd40c73a65d0c907f625c57d7648a6d0a715\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPat
h\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb08df78497d5ae0eb39af9a06378ea80914d9b4d7014dd85d930f54c733193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfb08df78497d5ae0eb39af9a06378ea80914d9b4d7014dd85d930f54c733193\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8mtkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:37Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:37 crc kubenswrapper[4732]: I0131 09:01:37.162512 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the 
pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:37Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:37 crc kubenswrapper[4732]: I0131 09:01:37.173039 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:37Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:37 crc kubenswrapper[4732]: I0131 09:01:37.182357 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nsgpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"533741c8-f72a-4834-ad02-d33fc939e529\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec4f6f1a4ba8fc3975dc3ef7e86d5b17b0488de7918fe672040f115f640b8569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjff8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nsgpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-01-31T09:01:37Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:37 crc kubenswrapper[4732]: I0131 09:01:37.220499 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:37 crc kubenswrapper[4732]: I0131 09:01:37.220534 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:37 crc kubenswrapper[4732]: I0131 09:01:37.220546 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:37 crc kubenswrapper[4732]: I0131 09:01:37.220566 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:37 crc kubenswrapper[4732]: I0131 09:01:37.220577 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:37Z","lastTransitionTime":"2026-01-31T09:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:37 crc kubenswrapper[4732]: I0131 09:01:37.323231 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:37 crc kubenswrapper[4732]: I0131 09:01:37.323270 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:37 crc kubenswrapper[4732]: I0131 09:01:37.323282 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:37 crc kubenswrapper[4732]: I0131 09:01:37.323298 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:37 crc kubenswrapper[4732]: I0131 09:01:37.323309 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:37Z","lastTransitionTime":"2026-01-31T09:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:37 crc kubenswrapper[4732]: I0131 09:01:37.427022 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:37 crc kubenswrapper[4732]: I0131 09:01:37.427070 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:37 crc kubenswrapper[4732]: I0131 09:01:37.427082 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:37 crc kubenswrapper[4732]: I0131 09:01:37.427101 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:37 crc kubenswrapper[4732]: I0131 09:01:37.427113 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:37Z","lastTransitionTime":"2026-01-31T09:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:37 crc kubenswrapper[4732]: I0131 09:01:37.503475 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-08 02:45:28.453289623 +0000 UTC Jan 31 09:01:37 crc kubenswrapper[4732]: I0131 09:01:37.530764 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:37 crc kubenswrapper[4732]: I0131 09:01:37.530827 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:37 crc kubenswrapper[4732]: I0131 09:01:37.530854 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:37 crc kubenswrapper[4732]: I0131 09:01:37.530885 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:37 crc kubenswrapper[4732]: I0131 09:01:37.530908 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:37Z","lastTransitionTime":"2026-01-31T09:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:37 crc kubenswrapper[4732]: I0131 09:01:37.542281 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:01:37 crc kubenswrapper[4732]: I0131 09:01:37.542297 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:01:37 crc kubenswrapper[4732]: I0131 09:01:37.542298 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:01:37 crc kubenswrapper[4732]: E0131 09:01:37.542413 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:01:37 crc kubenswrapper[4732]: E0131 09:01:37.542620 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:01:37 crc kubenswrapper[4732]: E0131 09:01:37.542794 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:01:37 crc kubenswrapper[4732]: I0131 09:01:37.634107 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:37 crc kubenswrapper[4732]: I0131 09:01:37.634193 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:37 crc kubenswrapper[4732]: I0131 09:01:37.634208 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:37 crc kubenswrapper[4732]: I0131 09:01:37.634233 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:37 crc kubenswrapper[4732]: I0131 09:01:37.634250 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:37Z","lastTransitionTime":"2026-01-31T09:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:37 crc kubenswrapper[4732]: I0131 09:01:37.737508 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:37 crc kubenswrapper[4732]: I0131 09:01:37.737557 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:37 crc kubenswrapper[4732]: I0131 09:01:37.737568 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:37 crc kubenswrapper[4732]: I0131 09:01:37.737587 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:37 crc kubenswrapper[4732]: I0131 09:01:37.737597 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:37Z","lastTransitionTime":"2026-01-31T09:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:37 crc kubenswrapper[4732]: I0131 09:01:37.809941 4732 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 31 09:01:37 crc kubenswrapper[4732]: I0131 09:01:37.840587 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:37 crc kubenswrapper[4732]: I0131 09:01:37.840761 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:37 crc kubenswrapper[4732]: I0131 09:01:37.840780 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:37 crc kubenswrapper[4732]: I0131 09:01:37.840809 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:37 crc kubenswrapper[4732]: I0131 09:01:37.840830 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:37Z","lastTransitionTime":"2026-01-31T09:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:37 crc kubenswrapper[4732]: I0131 09:01:37.943368 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:37 crc kubenswrapper[4732]: I0131 09:01:37.943443 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:37 crc kubenswrapper[4732]: I0131 09:01:37.943468 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:37 crc kubenswrapper[4732]: I0131 09:01:37.943500 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:37 crc kubenswrapper[4732]: I0131 09:01:37.943525 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:37Z","lastTransitionTime":"2026-01-31T09:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:38 crc kubenswrapper[4732]: I0131 09:01:38.046226 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:38 crc kubenswrapper[4732]: I0131 09:01:38.046272 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:38 crc kubenswrapper[4732]: I0131 09:01:38.046293 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:38 crc kubenswrapper[4732]: I0131 09:01:38.046312 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:38 crc kubenswrapper[4732]: I0131 09:01:38.046324 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:38Z","lastTransitionTime":"2026-01-31T09:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:38 crc kubenswrapper[4732]: I0131 09:01:38.157602 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:38 crc kubenswrapper[4732]: I0131 09:01:38.157734 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:38 crc kubenswrapper[4732]: I0131 09:01:38.157766 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:38 crc kubenswrapper[4732]: I0131 09:01:38.157801 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:38 crc kubenswrapper[4732]: I0131 09:01:38.157838 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:38Z","lastTransitionTime":"2026-01-31T09:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:38 crc kubenswrapper[4732]: I0131 09:01:38.261174 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:38 crc kubenswrapper[4732]: I0131 09:01:38.261222 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:38 crc kubenswrapper[4732]: I0131 09:01:38.261233 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:38 crc kubenswrapper[4732]: I0131 09:01:38.261253 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:38 crc kubenswrapper[4732]: I0131 09:01:38.261264 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:38Z","lastTransitionTime":"2026-01-31T09:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:38 crc kubenswrapper[4732]: I0131 09:01:38.363350 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:38 crc kubenswrapper[4732]: I0131 09:01:38.363385 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:38 crc kubenswrapper[4732]: I0131 09:01:38.363393 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:38 crc kubenswrapper[4732]: I0131 09:01:38.363408 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:38 crc kubenswrapper[4732]: I0131 09:01:38.363417 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:38Z","lastTransitionTime":"2026-01-31T09:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:38 crc kubenswrapper[4732]: I0131 09:01:38.465511 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:38 crc kubenswrapper[4732]: I0131 09:01:38.465548 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:38 crc kubenswrapper[4732]: I0131 09:01:38.465558 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:38 crc kubenswrapper[4732]: I0131 09:01:38.465575 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:38 crc kubenswrapper[4732]: I0131 09:01:38.465588 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:38Z","lastTransitionTime":"2026-01-31T09:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:38 crc kubenswrapper[4732]: I0131 09:01:38.504223 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 17:12:05.250027298 +0000 UTC Jan 31 09:01:38 crc kubenswrapper[4732]: I0131 09:01:38.570766 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:38 crc kubenswrapper[4732]: I0131 09:01:38.570811 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:38 crc kubenswrapper[4732]: I0131 09:01:38.570823 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:38 crc kubenswrapper[4732]: I0131 09:01:38.570843 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:38 crc kubenswrapper[4732]: I0131 09:01:38.570856 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:38Z","lastTransitionTime":"2026-01-31T09:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:38 crc kubenswrapper[4732]: I0131 09:01:38.674183 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:38 crc kubenswrapper[4732]: I0131 09:01:38.674222 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:38 crc kubenswrapper[4732]: I0131 09:01:38.674234 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:38 crc kubenswrapper[4732]: I0131 09:01:38.674250 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:38 crc kubenswrapper[4732]: I0131 09:01:38.674260 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:38Z","lastTransitionTime":"2026-01-31T09:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:38 crc kubenswrapper[4732]: I0131 09:01:38.777159 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:38 crc kubenswrapper[4732]: I0131 09:01:38.777208 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:38 crc kubenswrapper[4732]: I0131 09:01:38.777221 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:38 crc kubenswrapper[4732]: I0131 09:01:38.777239 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:38 crc kubenswrapper[4732]: I0131 09:01:38.777252 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:38Z","lastTransitionTime":"2026-01-31T09:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:38 crc kubenswrapper[4732]: I0131 09:01:38.812381 4732 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 31 09:01:38 crc kubenswrapper[4732]: I0131 09:01:38.880179 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:38 crc kubenswrapper[4732]: I0131 09:01:38.880226 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:38 crc kubenswrapper[4732]: I0131 09:01:38.880238 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:38 crc kubenswrapper[4732]: I0131 09:01:38.880257 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:38 crc kubenswrapper[4732]: I0131 09:01:38.880268 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:38Z","lastTransitionTime":"2026-01-31T09:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:38 crc kubenswrapper[4732]: I0131 09:01:38.983289 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:38 crc kubenswrapper[4732]: I0131 09:01:38.983332 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:38 crc kubenswrapper[4732]: I0131 09:01:38.983341 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:38 crc kubenswrapper[4732]: I0131 09:01:38.983356 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:38 crc kubenswrapper[4732]: I0131 09:01:38.983365 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:38Z","lastTransitionTime":"2026-01-31T09:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:39 crc kubenswrapper[4732]: I0131 09:01:39.085725 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:39 crc kubenswrapper[4732]: I0131 09:01:39.085769 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:39 crc kubenswrapper[4732]: I0131 09:01:39.085782 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:39 crc kubenswrapper[4732]: I0131 09:01:39.085800 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:39 crc kubenswrapper[4732]: I0131 09:01:39.085812 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:39Z","lastTransitionTime":"2026-01-31T09:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:39 crc kubenswrapper[4732]: I0131 09:01:39.189054 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:39 crc kubenswrapper[4732]: I0131 09:01:39.189103 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:39 crc kubenswrapper[4732]: I0131 09:01:39.189113 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:39 crc kubenswrapper[4732]: I0131 09:01:39.189131 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:39 crc kubenswrapper[4732]: I0131 09:01:39.189151 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:39Z","lastTransitionTime":"2026-01-31T09:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:39 crc kubenswrapper[4732]: I0131 09:01:39.286211 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:01:39 crc kubenswrapper[4732]: E0131 09:01:39.286356 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:01:55.286327094 +0000 UTC m=+53.592203298 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:01:39 crc kubenswrapper[4732]: I0131 09:01:39.286387 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:01:39 crc kubenswrapper[4732]: I0131 09:01:39.286451 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:01:39 crc kubenswrapper[4732]: E0131 09:01:39.286567 4732 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 31 09:01:39 crc kubenswrapper[4732]: E0131 09:01:39.286622 4732 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 31 09:01:39 crc kubenswrapper[4732]: E0131 09:01:39.286631 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-31 09:01:55.286619834 +0000 UTC m=+53.592496038 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 31 09:01:39 crc kubenswrapper[4732]: E0131 09:01:39.286841 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-31 09:01:55.286745588 +0000 UTC m=+53.592621812 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 31 09:01:39 crc kubenswrapper[4732]: I0131 09:01:39.292104 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:39 crc kubenswrapper[4732]: I0131 09:01:39.292188 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:39 crc kubenswrapper[4732]: I0131 09:01:39.292204 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:39 crc kubenswrapper[4732]: I0131 09:01:39.292232 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:39 crc kubenswrapper[4732]: I0131 09:01:39.292249 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:39Z","lastTransitionTime":"2026-01-31T09:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:39 crc kubenswrapper[4732]: I0131 09:01:39.387532 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:01:39 crc kubenswrapper[4732]: I0131 09:01:39.387619 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:01:39 crc kubenswrapper[4732]: E0131 09:01:39.387853 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 31 09:01:39 crc kubenswrapper[4732]: E0131 09:01:39.387890 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 31 09:01:39 crc kubenswrapper[4732]: E0131 09:01:39.387905 4732 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 09:01:39 crc kubenswrapper[4732]: E0131 09:01:39.387991 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-31 09:01:55.387966051 +0000 UTC m=+53.693842265 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 09:01:39 crc kubenswrapper[4732]: E0131 09:01:39.388043 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 31 09:01:39 crc kubenswrapper[4732]: E0131 09:01:39.388174 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 31 09:01:39 crc kubenswrapper[4732]: E0131 09:01:39.388211 4732 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 09:01:39 crc kubenswrapper[4732]: E0131 09:01:39.388464 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-31 09:01:55.388377705 +0000 UTC m=+53.694254009 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 09:01:39 crc kubenswrapper[4732]: I0131 09:01:39.395231 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:39 crc kubenswrapper[4732]: I0131 09:01:39.395314 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:39 crc kubenswrapper[4732]: I0131 09:01:39.395327 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:39 crc kubenswrapper[4732]: I0131 09:01:39.395351 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:39 crc kubenswrapper[4732]: I0131 09:01:39.395364 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:39Z","lastTransitionTime":"2026-01-31T09:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:39 crc kubenswrapper[4732]: I0131 09:01:39.498619 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:39 crc kubenswrapper[4732]: I0131 09:01:39.498692 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:39 crc kubenswrapper[4732]: I0131 09:01:39.498709 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:39 crc kubenswrapper[4732]: I0131 09:01:39.498734 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:39 crc kubenswrapper[4732]: I0131 09:01:39.498750 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:39Z","lastTransitionTime":"2026-01-31T09:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:39 crc kubenswrapper[4732]: I0131 09:01:39.505208 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-14 16:24:26.892910947 +0000 UTC Jan 31 09:01:39 crc kubenswrapper[4732]: I0131 09:01:39.542097 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:01:39 crc kubenswrapper[4732]: I0131 09:01:39.542143 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:01:39 crc kubenswrapper[4732]: I0131 09:01:39.542284 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:01:39 crc kubenswrapper[4732]: E0131 09:01:39.542316 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:01:39 crc kubenswrapper[4732]: E0131 09:01:39.542400 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:01:39 crc kubenswrapper[4732]: E0131 09:01:39.542827 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:01:39 crc kubenswrapper[4732]: I0131 09:01:39.543138 4732 scope.go:117] "RemoveContainer" containerID="c05af9bd69451273c25ddddf6b11cdd7b8f9b6e3e16d18f1c4a962321905696e" Jan 31 09:01:39 crc kubenswrapper[4732]: I0131 09:01:39.601733 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:39 crc kubenswrapper[4732]: I0131 09:01:39.602112 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:39 crc kubenswrapper[4732]: I0131 09:01:39.602125 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:39 crc kubenswrapper[4732]: I0131 09:01:39.602144 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:39 crc kubenswrapper[4732]: I0131 09:01:39.602156 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:39Z","lastTransitionTime":"2026-01-31T09:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:39 crc kubenswrapper[4732]: I0131 09:01:39.704779 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:39 crc kubenswrapper[4732]: I0131 09:01:39.704825 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:39 crc kubenswrapper[4732]: I0131 09:01:39.704841 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:39 crc kubenswrapper[4732]: I0131 09:01:39.704866 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:39 crc kubenswrapper[4732]: I0131 09:01:39.704887 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:39Z","lastTransitionTime":"2026-01-31T09:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:39 crc kubenswrapper[4732]: I0131 09:01:39.808378 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:39 crc kubenswrapper[4732]: I0131 09:01:39.808420 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:39 crc kubenswrapper[4732]: I0131 09:01:39.808430 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:39 crc kubenswrapper[4732]: I0131 09:01:39.808446 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:39 crc kubenswrapper[4732]: I0131 09:01:39.808455 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:39Z","lastTransitionTime":"2026-01-31T09:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:39 crc kubenswrapper[4732]: I0131 09:01:39.910786 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:39 crc kubenswrapper[4732]: I0131 09:01:39.910832 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:39 crc kubenswrapper[4732]: I0131 09:01:39.910847 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:39 crc kubenswrapper[4732]: I0131 09:01:39.910868 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:39 crc kubenswrapper[4732]: I0131 09:01:39.910882 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:39Z","lastTransitionTime":"2026-01-31T09:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.013562 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.013597 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.013609 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.013627 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.013641 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:40Z","lastTransitionTime":"2026-01-31T09:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.116752 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.116787 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.116799 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.116818 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.116829 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:40Z","lastTransitionTime":"2026-01-31T09:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.122163 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.122200 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.122220 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.122237 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.122250 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:40Z","lastTransitionTime":"2026-01-31T09:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:40 crc kubenswrapper[4732]: E0131 09:01:40.139221 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:01:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:01:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:01:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:01:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2761c117-4c0c-4c53-891e-fc7b8fbd4017\\\",\\\"systemUUID\\\":\\\"5f9a5b0b-6336-4588-8df8-98fcbdc2a984\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:40Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.143751 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.143806 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.143822 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.143847 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.143864 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:40Z","lastTransitionTime":"2026-01-31T09:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:40 crc kubenswrapper[4732]: E0131 09:01:40.164419 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:01:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:01:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:01:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:01:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2761c117-4c0c-4c53-891e-fc7b8fbd4017\\\",\\\"systemUUID\\\":\\\"5f9a5b0b-6336-4588-8df8-98fcbdc2a984\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:40Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.173201 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.173262 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.173274 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.173295 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.173306 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:40Z","lastTransitionTime":"2026-01-31T09:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:40 crc kubenswrapper[4732]: E0131 09:01:40.185326 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:01:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:01:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:01:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:01:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2761c117-4c0c-4c53-891e-fc7b8fbd4017\\\",\\\"systemUUID\\\":\\\"5f9a5b0b-6336-4588-8df8-98fcbdc2a984\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:40Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.189189 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.189219 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.189230 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.189247 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.189256 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:40Z","lastTransitionTime":"2026-01-31T09:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:40 crc kubenswrapper[4732]: E0131 09:01:40.201713 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:01:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:01:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:01:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:01:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2761c117-4c0c-4c53-891e-fc7b8fbd4017\\\",\\\"systemUUID\\\":\\\"5f9a5b0b-6336-4588-8df8-98fcbdc2a984\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:40Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.205613 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.205651 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.205680 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.205701 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.205712 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:40Z","lastTransitionTime":"2026-01-31T09:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:40 crc kubenswrapper[4732]: E0131 09:01:40.217715 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:01:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:01:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:01:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:01:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2761c117-4c0c-4c53-891e-fc7b8fbd4017\\\",\\\"systemUUID\\\":\\\"5f9a5b0b-6336-4588-8df8-98fcbdc2a984\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:40Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:40 crc kubenswrapper[4732]: E0131 09:01:40.217872 4732 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.219449 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.219488 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.219500 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.219518 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.219531 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:40Z","lastTransitionTime":"2026-01-31T09:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.276608 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gchqk"] Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.277079 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gchqk" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.280979 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.281726 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.292861 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c063f211d6d06c3f8b94d316d68da346d077c6e85197754ec7c77ad736fd1217\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:40Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.307205 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d790207-d357-4b47-87bf-5b505e061820\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eaa2b856e1dedd7398d0c513f742fa5925c4748bf44eb15022adbb4b62354c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2bw2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad1f906b34fddd34efd9d099479cd1fe7404da4ab11f3f571f9e3120167505a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2bw2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jnbt8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:40Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.322306 4732 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.322351 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.322365 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.322385 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.322401 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:40Z","lastTransitionTime":"2026-01-31T09:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.323641 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7423f7d2a82ae276cff0f5a2730b0de7abbc5b28a8bd8983d23fe44c2e9ca3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b075971e7f28f40b90caae8b3681dd283860371a6b3f82c92c421a87a27ccb0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:40Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.338612 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gchqk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0313609d-3507-4db5-a190-9dbf59d73e6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-phqth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-phqth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gchqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:40Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.353011 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1add5e75-7e0e-4248-9431-256dc477beeb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7bd726ccae341dfff351d4170276b74d4b6b4f34806b7e1fb23ab517201928d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd0e75b86e540472dde92c0b2c29d3af05e052a4e6a01fd03cd16cce663bd5da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6678cf179a4e13991aa6a082456b3497a4665e0999912b8002fb84a0b6ca2ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55d09d44253b8c77894abc55d80843b79509bf7213c0bf8b4d69059d53e76e5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:40Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.382734 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58b600ed-9e96-4296-bc41-dda5d2205921\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f14a98662959fe8132486d2f648fb443a407e90160ee8c6eac7e2a3d6835b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://771af18934041722ccdbde47137c83e9d93c5e7a07b1e9b26e18354bcd2fda4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07
b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2bea5befd94ee4d3b6501273e8b8365102a35cc07b97740fc893f8d70a9d134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9fd5264b5a2dcec6f8b363208d5654f2d7199660f04be408ee7ab3913d542e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://194a43d5e1409abb031f1b950dfc753454b45008067d784642945ea0dfb087e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://382f1fa043705a0098e27ea34beeee0b88276aa355a7de07bc29b070949b81e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\
\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382f1fa043705a0098e27ea34beeee0b88276aa355a7de07bc29b070949b81e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b16d2732d6d6381045b88446a322622a7b1271f08438cb60e57c05c52ef7faf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b16d2732d6d6381045b88446a322622a7b1271f08438cb60e57c05c52ef7faf8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://273f5652d37e2be568bc388a7f9e96c66d0ce9201678b46736986815bd25d6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://273f5652d37e2be568bc388a7f9e96c66d0ce9201678b46736986815bd25d6bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:40Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.398261 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phqth\" (UniqueName: \"kubernetes.io/projected/0313609d-3507-4db5-a190-9dbf59d73e6e-kube-api-access-phqth\") pod \"ovnkube-control-plane-749d76644c-gchqk\" (UID: \"0313609d-3507-4db5-a190-9dbf59d73e6e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gchqk" Jan 31 09:01:40 crc kubenswrapper[4732]: 
I0131 09:01:40.398325 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0313609d-3507-4db5-a190-9dbf59d73e6e-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-gchqk\" (UID: \"0313609d-3507-4db5-a190-9dbf59d73e6e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gchqk" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.398362 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0313609d-3507-4db5-a190-9dbf59d73e6e-env-overrides\") pod \"ovnkube-control-plane-749d76644c-gchqk\" (UID: \"0313609d-3507-4db5-a190-9dbf59d73e6e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gchqk" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.398414 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0313609d-3507-4db5-a190-9dbf59d73e6e-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-gchqk\" (UID: \"0313609d-3507-4db5-a190-9dbf59d73e6e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gchqk" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.401307 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62650e77e5971d617a77ae7461d51276fc76e3699dc4f86ebbb9b63ab0d86b60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-31T09:01:40Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.415363 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:40Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.425724 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.425749 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.425759 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.425774 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.425793 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:40Z","lastTransitionTime":"2026-01-31T09:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin 
returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.426697 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:40Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.436275 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nsgpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"533741c8-f72a-4834-ad02-d33fc939e529\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec4f6f1a4ba8fc3975dc3ef7e86d5b17b0488de7918fe672040f115f640b8569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjff8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nsgpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:40Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.444092 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bllbs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80d87332-eaea-4007-a03e-a9a0f744563a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a371d8cbedbc18b4953520c3b135e102db92fc57883d31a8f78c510498ca41b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcdpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bllbs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:40Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.461496 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8521ee1e84f540faa7b093ecd0bb7761ae38487a6c7abf52028fe820ea9a4c1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fde05e46badc11b2174b88e2590402436bb669bcbd017ceb4d11d25066b421b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b26417bd7817c984a59723e2231e3a43c554022aef67eacf300adee0166c3db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c79f404e5bc3d6dc83681d68605b1f26565813ccec34c6d36ee9f21686ca4f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://075418297d972e23003b3ce2016f7709d138316c9d08837dc484dadf4d7f4a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://792e702a9fa66ea68366e8f88d976f7e280223815c4b70258566c32faaed57a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b67c56c55214b7fc95fc05a876435bf7a9d6e5de9dfc0a3acc0110e6779b818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cd519944f0ae0ee9920a6a54de3dd40c73a65d0c907f625c57d7648a6d0a715\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPat
h\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb08df78497d5ae0eb39af9a06378ea80914d9b4d7014dd85d930f54c733193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfb08df78497d5ae0eb39af9a06378ea80914d9b4d7014dd85d930f54c733193\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8mtkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:40Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.475640 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1dc686f-fa2d-4cd4-95d4-874df2dc3a1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b5cbd45e357205f8bfc648396c0f684d6788343f687504feb45618d9082e5cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b8ba07937855cb1fd0e1db567f90e08848fca583b7174153918e2d6356d722\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ce42a35726dd1fc6aea0eb2e5baa490de04566cd6d6298185260e896a8def7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c05af9bd69451273c25ddddf6b11cdd7b8f9b6e3e16d18f1c4a962321905696e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c05af9bd69451273c25ddddf6b11cdd7b8f9b6e3e16d18f1c4a962321905696e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0131 09:01:18.226954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:01:18.227544 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1858903392/tls.crt::/tmp/serving-cert-1858903392/tls.key\\\\\\\"\\\\nI0131 09:01:23.404020 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:01:23.413560 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:01:23.413612 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:01:23.413655 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:01:23.413760 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:01:23.423398 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 09:01:23.423434 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:01:23.423441 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:01:23.423448 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:01:23.423452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0131 09:01:23.423448 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 09:01:23.423457 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:01:23.423476 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 09:01:23.426339 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83c2048db4772c1a5f60b21340f773ab5c651045d4dac9edc15e91a19d15dce9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:40Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.487410 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:40Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.499361 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0313609d-3507-4db5-a190-9dbf59d73e6e-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-gchqk\" (UID: \"0313609d-3507-4db5-a190-9dbf59d73e6e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gchqk" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.499399 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phqth\" (UniqueName: \"kubernetes.io/projected/0313609d-3507-4db5-a190-9dbf59d73e6e-kube-api-access-phqth\") pod \"ovnkube-control-plane-749d76644c-gchqk\" (UID: \"0313609d-3507-4db5-a190-9dbf59d73e6e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gchqk" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.499427 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0313609d-3507-4db5-a190-9dbf59d73e6e-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-gchqk\" (UID: \"0313609d-3507-4db5-a190-9dbf59d73e6e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gchqk" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.499465 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0313609d-3507-4db5-a190-9dbf59d73e6e-env-overrides\") pod \"ovnkube-control-plane-749d76644c-gchqk\" (UID: \"0313609d-3507-4db5-a190-9dbf59d73e6e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gchqk" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.499848 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4mxsr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e23192f-14db-41ef-af89-4a76e325d9c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6af50b44afec0e5f0757efda1ff5567845c4482609a82bafcbf74947c657c56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsnx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4mxsr\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:40Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.500062 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0313609d-3507-4db5-a190-9dbf59d73e6e-env-overrides\") pod \"ovnkube-control-plane-749d76644c-gchqk\" (UID: \"0313609d-3507-4db5-a190-9dbf59d73e6e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gchqk" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.500788 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0313609d-3507-4db5-a190-9dbf59d73e6e-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-gchqk\" (UID: \"0313609d-3507-4db5-a190-9dbf59d73e6e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gchqk" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.505520 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 22:10:21.579536349 +0000 UTC Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.513252 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0313609d-3507-4db5-a190-9dbf59d73e6e-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-gchqk\" (UID: \"0313609d-3507-4db5-a190-9dbf59d73e6e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gchqk" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.515809 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phqth\" (UniqueName: \"kubernetes.io/projected/0313609d-3507-4db5-a190-9dbf59d73e6e-kube-api-access-phqth\") pod \"ovnkube-control-plane-749d76644c-gchqk\" (UID: \"0313609d-3507-4db5-a190-9dbf59d73e6e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gchqk" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.528036 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.528064 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.528073 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.528088 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.528099 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:40Z","lastTransitionTime":"2026-01-31T09:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.529150 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t9kqf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2e6e0f4-2302-447f-a5e0-7db3d7b73cb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://038ff30d8a53b94c204f5dc0c72824fcb7f08a423eec2ce05554f23607d4dc7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c71dc89758bff193a85ceeaa97b21ba6c93389ef9cf380bd0bac89bddddf014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c71dc89758bff193a85ceeaa97b21ba6c93389ef9cf380bd0bac89bddddf014\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4374e8014d4e67ab8df14785ed6d1bd78b33567550758297fcc04796ea5c688\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4374e8014d4e67ab8df14785ed6d1bd78b33567550758297fcc04796ea5c688\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85484d53f8180c3bc83391004e8a7c6b432af09f04f597567e8ffe0b9c2e3d58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85484d53f8180c3bc83391004e8a7c6b432af09f04f597567e8ffe0b9c2e3d58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://111b50c40cf72f105ae5c74d668673853a783050a62e89127fa666a6f8d431d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://111b50c40cf72f105ae5c74d668673853a783050a62e89127fa666a6f8d431d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2458247a5e615b929a15e280bb9d5a65c46a991ea2cd83502a8c42b1ca0e299b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2458247a5e615b929a15e280bb9d5a65c46a991ea2cd83502a8c42b1ca0e299b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a6ff3cd5564994b28608dad1cd0848c771db950208888ac8a2213fa5dc3a168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a6ff3cd5564994b28608dad1cd0848c771db950208888ac8a2213fa5dc3a168\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t9kqf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:40Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.593375 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gchqk" Jan 31 09:01:40 crc kubenswrapper[4732]: W0131 09:01:40.610346 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0313609d_3507_4db5_a190_9dbf59d73e6e.slice/crio-9691c933bac02369bc86130d8532c83da0687e2cedb632cfac84d2ae0a8ecdec WatchSource:0}: Error finding container 9691c933bac02369bc86130d8532c83da0687e2cedb632cfac84d2ae0a8ecdec: Status 404 returned error can't find the container with id 9691c933bac02369bc86130d8532c83da0687e2cedb632cfac84d2ae0a8ecdec Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.630436 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.630485 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.630496 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.630517 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.630538 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:40Z","lastTransitionTime":"2026-01-31T09:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.733919 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.733954 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.733963 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.733999 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.734009 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:40Z","lastTransitionTime":"2026-01-31T09:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.821334 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gchqk" event={"ID":"0313609d-3507-4db5-a190-9dbf59d73e6e","Type":"ContainerStarted","Data":"9691c933bac02369bc86130d8532c83da0687e2cedb632cfac84d2ae0a8ecdec"} Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.823416 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8mtkt_82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8/ovnkube-controller/0.log" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.826800 4732 generic.go:334] "Generic (PLEG): container finished" podID="82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8" containerID="4b67c56c55214b7fc95fc05a876435bf7a9d6e5de9dfc0a3acc0110e6779b818" exitCode=1 Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.826864 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" event={"ID":"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8","Type":"ContainerDied","Data":"4b67c56c55214b7fc95fc05a876435bf7a9d6e5de9dfc0a3acc0110e6779b818"} Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.827773 4732 scope.go:117] "RemoveContainer" containerID="4b67c56c55214b7fc95fc05a876435bf7a9d6e5de9dfc0a3acc0110e6779b818" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.828947 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.832996 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c276b6fdf2c47bfa38c4cf312bf6b893388c4291d306738fd6c31738cbd7483b"} Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.833567 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.841613 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.841684 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.841701 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.841723 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.841739 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:40Z","lastTransitionTime":"2026-01-31T09:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.848233 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58b600ed-9e96-4296-bc41-dda5d2205921\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f14a98662959fe8132486d2f648fb443a407e90160ee8c6eac7e2a3d6835b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://771af18934041722ccdbde47137c83e9d93c5e7a07b1e9b26e18354bcd2fda4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2bea5befd94ee4d3b6501273e8b8365102a35cc07b97740fc893f8d70a9d134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9fd5264b5a2dcec6f8b363208d5654f2d7199660f04be408ee7ab3913d542e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://194a43d5e1409abb031f1b950dfc753454b45008067d784642945ea0dfb087e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://382f1fa043705a0098e27ea34beeee0b88276aa355a7de07bc29b070949b81e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382f1fa043705a0098e27ea34beeee0b88276aa355a7de07bc29b070949b81e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b16d2732d6d6381045b88446a322622a7b1271f08438cb60e57c05c52ef7faf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b16d2732d6d6381045b88446a322622a7b1271f08438cb60e57c05c52ef7faf8\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://273f5652d37e2be568bc388a7f9e96c66d0ce9201678b46736986815bd25d6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://273f5652d37e2be568bc388a7f9e96c66d0ce9201678b46736986815bd25d6bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:40Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.867637 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62650e77e5971d617a77ae7461d51276fc76e3699dc4f86ebbb9b63ab0d86b60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:40Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.888054 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7423f7d2a82ae276cff0f5a2730b0de7abbc5b28a8bd8983d23fe44c2e9ca3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b075971e7f28f40b90caae8b3681dd283860371a6b3f82c92c421a87a27ccb0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:40Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.906160 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gchqk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0313609d-3507-4db5-a190-9dbf59d73e6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-phqth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-phqth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gchqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:40Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.919524 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1add5e75-7e0e-4248-9431-256dc477beeb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7bd726ccae341dfff351d4170276b74d4b6b4f34806b7e1fb23ab517201928d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd0e75b86e540472dde92c0b2c29d3af05e052a4e6a01fd03cd16cce663bd5da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6678cf179a4e13991aa6a082456b3497a4665e0999912b8002fb84a0b6ca2ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55d09d44253b8c77894abc55d80843b79509bf7213c0bf8b4d69059d53e76e5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cl
uster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:40Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.934070 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:40Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.944378 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.944419 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.944429 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.944450 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.944461 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:40Z","lastTransitionTime":"2026-01-31T09:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.946279 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nsgpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"533741c8-f72a-4834-ad02-d33fc939e529\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec4f6f1a4ba8fc3975dc3ef7e86d5b17b0488de7918fe672040f115f640b8569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjff8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nsgpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:40Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.957920 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bllbs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80d87332-eaea-4007-a03e-a9a0f744563a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a371d8cbedbc18b4953520c3b135e102db92fc57883d31a8f78c510498ca41b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcdpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bllbs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:40Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.975905 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8521ee1e84f540faa7b093ecd0bb7761ae38487a6c7abf52028fe820ea9a4c1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fde05e46badc11b2174b88e2590402436bb669bcbd017ceb4d11d25066b421b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b26417bd7817c984a59723e2231e3a43c554022aef67eacf300adee0166c3db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c79f404e5bc3d6dc83681d68605b1f26565813ccec34c6d36ee9f21686ca4f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://075418297d972e23003b3ce2016f7709d138316c9d08837dc484dadf4d7f4a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://792e702a9fa66ea68366e8f88d976f7e280223815c4b70258566c32faaed57a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b67c56c55214b7fc95fc05a876435bf7a9d6e5de9dfc0a3acc0110e6779b818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b67c56c55214b7fc95fc05a876435bf7a9d6e5de9dfc0a3acc0110e6779b818\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:01:39Z\\\",\\\"message\\\":\\\"all/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 09:01:39.556996 6044 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0131 09:01:39.557035 6044 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0131 09:01:39.557080 6044 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0131 09:01:39.557086 6044 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0131 09:01:39.557099 6044 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0131 09:01:39.557106 6044 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0131 09:01:39.557121 6044 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0131 09:01:39.557163 6044 factory.go:656] Stopping watch factory\\\\nI0131 09:01:39.557193 6044 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0131 09:01:39.557202 6044 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0131 09:01:39.557208 6044 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0131 09:01:39.557213 6044 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0131 09:01:39.557219 6044 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0131 09:01:39.557225 6044 handler.go:208] Removed *v1.Node event handler 2\\\\nI0131 09:01:39.557232 6044 handler.go:208] Removed *v1.Node 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cd519944f0ae0ee9920a6a54de3dd40c73a65d0c907f625c57d7648a6d0a715\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb08df78497d5ae0eb39af9a06378ea80914d9b4d7014dd85d930f54c733193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20
99482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfb08df78497d5ae0eb39af9a06378ea80914d9b4d7014dd85d930f54c733193\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8mtkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:40Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.987006 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:40Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:40 crc kubenswrapper[4732]: I0131 09:01:40.997101 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:40Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.009830 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4mxsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e23192f-14db-41ef-af89-4a76e325d9c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6af50b44afec0e5f0757efda1ff5567845c4482609a82bafcbf74947c657c56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsnx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4mxsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:41Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.022220 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t9kqf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2e6e0f4-2302-447f-a5e0-7db3d7b73cb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://038ff30d8a53b94c204f5dc0c72824fcb7f08a423eec2ce05554f23607d4dc7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\
"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c71dc89758bff193a85ceeaa97b21ba6c93389ef9cf380bd0bac89bddddf014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c71dc89758bff193a85ceeaa97b21ba6c93389ef9cf380bd0bac89bddddf014\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4374e8014d4e67ab8df14785ed6d1bd78b33567550758297fcc04796ea5c688\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4374e8014d4e67ab8df14785ed6d1bd78b33567550758297fcc04796ea5c688\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85484d53f8180c3bc83391004e8a7c6b432af09f04f597567e8ffe0b9c2e3d58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85484d53f8180c3bc83391004e8a7c6b432af09f04f597567e8ffe0b9c2e3d58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:31Z\\\",\\\"reason\\\":\
\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://111b50c40cf72f105ae5c74d668673853a783050a62e89127fa666a6f8d431d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://111b50c40cf72f105ae5c74d668673853a783050a62e89127fa666a6f8d431d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2458247a5e615b929a15e280bb9d5a65c46a991ea2cd83502a8c42b1ca0e299b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2458247a5e615b929a15e280bb9d5a65c46a991ea2cd83502a8c42b1ca0e299b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a6ff3cd5564994b28608dad1cd0848c771db950208888ac8a2213fa5dc3a168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\
",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a6ff3cd5564994b28608dad1cd0848c771db950208888ac8a2213fa5dc3a168\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t9kqf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:41Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.035111 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1dc686f-fa2d-4cd4-95d4-874df2dc3a1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b5cbd45e357205f8bfc648396c0f684d6788343f687504feb45618d9082e5cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b8ba07937855cb1fd0e1db567f90e08848fca583b7174153918e2d6356d722\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ce42a35726dd1fc6aea0eb2e5baa490de04566cd6d6298185260e896a8def7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c05af9bd69451273c25ddddf6b11cdd7b8f9b6e3e16d18f1c4a962321905696e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c05af9bd69451273c25ddddf6b11cdd7b8f9b6e3e16d18f1c4a962321905696e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0131 09:01:18.226954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:01:18.227544 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1858903392/tls.crt::/tmp/serving-cert-1858903392/tls.key\\\\\\\"\\\\nI0131 09:01:23.404020 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:01:23.413560 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:01:23.413612 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:01:23.413655 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:01:23.413760 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:01:23.423398 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 09:01:23.423434 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:01:23.423441 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:01:23.423448 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:01:23.423452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0131 09:01:23.423448 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 09:01:23.423457 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:01:23.423476 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 09:01:23.426339 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83c2048db4772c1a5f60b21340f773ab5c651045d4dac9edc15e91a19d15dce9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:41Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.045006 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d790207-d357-4b47-87bf-5b505e061820\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eaa2b856e1dedd7398d0c513f742fa5925c4748bf44eb15022adbb4b62354c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2bw2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad1f906b34fddd34efd9d099479cd1fe7404da4ab11f3f571f9e3120167505a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2bw2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jnbt8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:41Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.046521 4732 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.046560 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.046571 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.046590 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.046602 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:41Z","lastTransitionTime":"2026-01-31T09:01:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.058617 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c063f211d6d06c3f8b94d316d68da346d077c6e85197754ec7c77ad736fd1217\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:41Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.073871 4732 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1dc686f-fa2d-4cd4-95d4-874df2dc3a1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b5cbd45e357205f8bfc648396c0f684d6788343f687504feb45618d9082e5cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b8ba07937855cb1fd0e1db567f90e08848fca583b7174153918e2d6356d722\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ce42a35726dd1fc6aea0eb2e5baa490de04566cd6d6298185260e896a8def7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc
/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c276b6fdf2c47bfa38c4cf312bf6b893388c4291d306738fd6c31738cbd7483b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c05af9bd69451273c25ddddf6b11cdd7b8f9b6e3e16d18f1c4a962321905696e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0131 09:01:18.226954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:01:18.227544 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1858903392/tls.crt::/tmp/serving-cert-1858903392/tls.key\\\\\\\"\\\\nI0131 09:01:23.404020 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:01:23.413560 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:01:23.413612 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:01:23.413655 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:01:23.413760 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:01:23.423398 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 09:01:23.423434 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:01:23.423441 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:01:23.423448 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:01:23.423452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0131 09:01:23.423448 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 09:01:23.423457 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:01:23.423476 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 09:01:23.426339 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83c2048db4772c1a5f60b21340f773ab5c651045d4dac9edc15e91a19d15dce9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:41Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.086374 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:41Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.107504 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4mxsr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e23192f-14db-41ef-af89-4a76e325d9c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6af50b44afec0e5f0757efda1ff5567845c4482609a82bafcbf74947c657c56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsnx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4mxsr\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:41Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.139583 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t9kqf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2e6e0f4-2302-447f-a5e0-7db3d7b73cb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://038ff30d8a53b94c204f5dc0c72824fcb7f08a423eec2ce05554f23607d4dc7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c71dc89758bff193a85ceeaa97b21ba6c93389ef9cf380bd0bac89bddddf014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c71dc89758bff193a85ceeaa97b21ba6c93389ef9cf380bd0bac89bddddf014\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4374e8014d4e67ab8df14785ed6d1bd78b33567550758297fcc04796ea5c688\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4374e8014d4e67ab8df14785ed6d1bd78b33567550758297fcc04796ea5c688\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85484d53f8180c3bc83391004e8a7c6b432af09f04f597567e8ffe0b9c2e3d58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85484d53f8180c3bc83391004e8a7c6b432af09f04f597567e8ffe0b9c2e3d58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://111b50c40cf72f105ae5c74d668673853a783050a62e89127fa666a6f8d431d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://111b50c40cf72f105ae5c74d668673853a783050a62e89127fa666a6f8d431d0\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2026-01-31T09:01:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2458247a5e615b929a15e280bb9d5a65c46a991ea2cd83502a8c42b1ca0e299b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2458247a5e615b929a15e280bb9d5a65c46a991ea2cd83502a8c42b1ca0e299b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a6ff3cd5564994b28608dad1cd0848c771db950208888ac8a2213fa5dc3a168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a6ff3cd5564994b28608dad1cd0848c771db950208888ac8a2213fa5dc3a168\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t9kqf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-31T09:01:41Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.148729 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.148778 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.148792 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.148818 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.148830 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:41Z","lastTransitionTime":"2026-01-31T09:01:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.157632 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c063f211d6d06c3f8b94d316d68da346d077c6e85197754ec7c77ad736fd1217\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:41Z is after 
2025-08-24T17:21:41Z" Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.172904 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d790207-d357-4b47-87bf-5b505e061820\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eaa2b856e1dedd7398d0c513f742fa5925c4748bf44eb15022adbb4b62354c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2bw2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad1f906b34fddd34efd9d099479cd1fe7404da4ab11f3f571f9e3120167505a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2bw2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jnbt8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:41Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.183949 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7423f7d2a82ae276cff0f5a2730b0de7abbc5b28a8bd8983d23fe44c2e9ca3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b075971e7f28f40b90caae8b3681dd283860371a6b3f82c92c421a87a27ccb0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:41Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.196470 4732 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gchqk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0313609d-3507-4db5-a190-9dbf59d73e6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-phqth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-phqth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gchqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:41Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.208168 4732 status_manager.go:875] "Failed 
to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1add5e75-7e0e-4248-9431-256dc477beeb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7bd726ccae341dfff351d4170276b74d4b6b4f34806b7e1fb23ab517201928d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd0e75b86e540472dde92c0b2c29d3af05e052a4e6a01fd03cd16cce663bd5da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6678cf179a4e13991aa6a082456b3497a4665e0999912b8002fb84a0b6ca2ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55d09d44253b8c77894abc55
d80843b79509bf7213c0bf8b4d69059d53e76e5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:41Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.229064 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58b600ed-9e96-4296-bc41-dda5d2205921\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f14a98662959fe8132486d2f648fb443a407e90160ee8c6eac7e2a3d6835b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://771af18934041722ccdbd
e47137c83e9d93c5e7a07b1e9b26e18354bcd2fda4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2bea5befd94ee4d3b6501273e8b8365102a35cc07b97740fc893f8d70a9d134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9fd5264b5a2dcec6f8b363208d5654f2d7199660f04be408ee7ab3913d542e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://194a43d5e1409abb031f1b950dfc753454b45008067d784642945ea0dfb087e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://382f1fa043705a0098e27ea34beeee0b88276aa355a7de07bc29b070949b81e2\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382f1fa043705a0098e27ea34beeee0b88276aa355a7de07bc29b070949b81e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b16d2732d6d6381045b88446a322622a7b1271f08438cb60e57c05c52ef7faf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b16d2732d6d6381045b88446a322622a7b1271f08438cb60e57c05c52ef7faf8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://273f5652d37e2be568bc388a7f9e96c66d0ce9201678b46736986815bd25d6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://273f5652d37e2be568bc388a7f9e96c66d0ce9201678b46736986815bd25d6bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:41Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.241797 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62650e77e5971d617a77ae7461d51276fc76e3699dc4f86ebbb9b63ab0d86b60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:41Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.251220 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.251247 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.251258 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.251273 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.251284 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:41Z","lastTransitionTime":"2026-01-31T09:01:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.254328 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:41Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.266171 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:41Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.281006 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nsgpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"533741c8-f72a-4834-ad02-d33fc939e529\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec4f6f1a4ba8fc3975dc3ef7e86d5b17b0488de7918fe672040f115f640b8569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjff8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.
11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nsgpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:41Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.290432 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bllbs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"80d87332-eaea-4007-a03e-a9a0f744563a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a371d8cbedbc18b4953520c3b135e102db92fc57883d31a8f78c510498ca41b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcdpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bllbs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:41Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.309137 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8521ee1e84f540faa7b093ecd0bb7761ae38487a6c7abf52028fe820ea9a4c1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fde05e46badc11b2174b88e2590402436bb669bcbd017ceb4d11d25066b421b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b26417bd7817c984a59723e2231e3a43c554022aef67eacf300adee0166c3db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c79f404e5bc3d6dc83681d68605b1f26565813ccec34c6d36ee9f21686ca4f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://075418297d972e23003b3ce2016f7709d138316c9d08837dc484dadf4d7f4a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://792e702a9fa66ea68366e8f88d976f7e280223815c4b70258566c32faaed57a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b67c56c55214b7fc95fc05a876435bf7a9d6e5de9dfc0a3acc0110e6779b818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b67c56c55214b7fc95fc05a876435bf7a9d6e5de9dfc0a3acc0110e6779b818\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:01:39Z\\\",\\\"message\\\":\\\"all/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 09:01:39.556996 6044 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0131 09:01:39.557035 6044 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0131 09:01:39.557080 6044 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0131 09:01:39.557086 6044 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0131 09:01:39.557099 6044 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0131 09:01:39.557106 6044 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0131 09:01:39.557121 6044 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0131 09:01:39.557163 6044 factory.go:656] Stopping watch factory\\\\nI0131 09:01:39.557193 6044 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0131 09:01:39.557202 6044 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0131 09:01:39.557208 6044 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0131 09:01:39.557213 6044 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0131 09:01:39.557219 6044 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0131 09:01:39.557225 6044 handler.go:208] Removed *v1.Node event handler 2\\\\nI0131 09:01:39.557232 6044 handler.go:208] Removed *v1.Node 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cd519944f0ae0ee9920a6a54de3dd40c73a65d0c907f625c57d7648a6d0a715\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb08df78497d5ae0eb39af9a06378ea80914d9b4d7014dd85d930f54c733193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20
99482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfb08df78497d5ae0eb39af9a06378ea80914d9b4d7014dd85d930f54c733193\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8mtkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:41Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.353503 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.353784 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.353889 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.353971 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.354045 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:41Z","lastTransitionTime":"2026-01-31T09:01:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.456589 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.456654 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.456680 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.456698 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.456711 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:41Z","lastTransitionTime":"2026-01-31T09:01:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.506002 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-08 19:43:38.562283942 +0000 UTC Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.541645 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.541688 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:01:41 crc kubenswrapper[4732]: E0131 09:01:41.541815 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.541844 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:01:41 crc kubenswrapper[4732]: E0131 09:01:41.542004 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:01:41 crc kubenswrapper[4732]: E0131 09:01:41.542156 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.559230 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.559259 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.559270 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.559286 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.559298 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:41Z","lastTransitionTime":"2026-01-31T09:01:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.662464 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.662521 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.662532 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.662555 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.662569 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:41Z","lastTransitionTime":"2026-01-31T09:01:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.734449 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-7fgvm"] Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.735795 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7fgvm" Jan 31 09:01:41 crc kubenswrapper[4732]: E0131 09:01:41.735899 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7fgvm" podUID="3bd29a31-1a47-40da-afc5-6c4423067083" Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.749083 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c063f211d6d06c3f8b94d316d68da346d077c6e85197754ec7c77ad736fd1217\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:41Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.765431 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.765502 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.765523 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.765553 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.765575 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:41Z","lastTransitionTime":"2026-01-31T09:01:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.769823 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d790207-d357-4b47-87bf-5b505e061820\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eaa2b856e1dedd7398d0c513f742fa5925c4748bf44eb15022adbb4b62354c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2bw2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad1f906b34fddd34efd9d099479cd1fe7404da4ab11f3f571f9e3120167505a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2bw2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jnbt8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:41Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.786471 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7423f7d2a82ae276cff0f5a2730b0de7abbc5b28a8bd8983d23fe44c2e9ca3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b075971e7f28f40b90caae8b3681dd283860371a6b3f82c92c421a87a27ccb0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:41Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.802613 
4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gchqk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0313609d-3507-4db5-a190-9dbf59d73e6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-phqth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-phqth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gchqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:41Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:41 crc kubenswrapper[4732]: 
I0131 09:01:41.815157 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3bd29a31-1a47-40da-afc5-6c4423067083-metrics-certs\") pod \"network-metrics-daemon-7fgvm\" (UID: \"3bd29a31-1a47-40da-afc5-6c4423067083\") " pod="openshift-multus/network-metrics-daemon-7fgvm" Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.815231 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47jtm\" (UniqueName: \"kubernetes.io/projected/3bd29a31-1a47-40da-afc5-6c4423067083-kube-api-access-47jtm\") pod \"network-metrics-daemon-7fgvm\" (UID: \"3bd29a31-1a47-40da-afc5-6c4423067083\") " pod="openshift-multus/network-metrics-daemon-7fgvm" Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.826016 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1add5e75-7e0e-4248-9431-256dc477beeb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7bd726ccae341dfff351d4170276b74d4b6b4f34806b7e1fb23ab517201928d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd0e75b86e540472dde92c0b2c29d3af05e052a4e6a01fd03cd16cce663bd5da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://e6678cf179a4e13991aa6a082456b3497a4665e0999912b8002fb84a0b6ca2ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55d09d44253b8c77894abc55d80843b79509bf7213c0bf8b4d69059d53e76e5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:41Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.836727 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gchqk" event={"ID":"0313609d-3507-4db5-a190-9dbf59d73e6e","Type":"ContainerStarted","Data":"c8a2d9e1c8db61a2418e981340a7cb999983d4a10e79977507abdf3bdb471939"} Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.836777 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gchqk" event={"ID":"0313609d-3507-4db5-a190-9dbf59d73e6e","Type":"ContainerStarted","Data":"b303be38f64266f4e019d30e8b988945133fc76a47e305429ed048d68cdeac76"} Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.838235 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8mtkt_82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8/ovnkube-controller/0.log" Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.841087 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" 
event={"ID":"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8","Type":"ContainerStarted","Data":"9455be46c494ee301fbd246b34b851413405354f423be0f5ae070c7c1bb8263a"} Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.841205 4732 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.853491 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58b600ed-9e96-4296-bc41-dda5d2205921\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f14a98662959fe8132486d2f648fb443a407e90160ee8c6eac7e2a3d6835b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://771af18934041722ccdbde47137c83e9d93c5e7a07b1e9b26e18354bcd2fda4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2bea5befd94ee4d3b6501273e8b8365102a35cc07b97740fc893f8d70a9d134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408
f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9fd5264b5a2dcec6f8b363208d5654f2d7199660f04be408ee7ab3913d542e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://194a43d5e1409abb031f1b950dfc753454b45008067d784642945ea0dfb087e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://382f1fa043705a0098e27ea34beeee0b88276aa355a7de07bc29b070949b81e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382f1fa043705a0098e27ea34beeee0b88276aa355a7de07bc29b070949b81e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b16d2732d6d6381045b88446a322622a7b1271f08438cb60e57c05c52ef7faf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b16d2732d6d6381045b88446a322622a7b1271f08438cb60e57c05c52ef7faf8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://273f5652d37e2be568bc388a7f9e96c66d0ce9201678b46736986815bd25d6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://273f5652d37e2be568bc388a7f9e96c66d0ce9201678b46736986815bd25d6bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:41Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.867189 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62650e77e5971d617a77ae7461d51276fc76e3699dc4f86ebbb9b63ab0d86b60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:41Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.868560 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.868592 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.868603 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.868623 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.868633 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:41Z","lastTransitionTime":"2026-01-31T09:01:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.882430 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:41Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.896899 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:41Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.909965 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nsgpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"533741c8-f72a-4834-ad02-d33fc939e529\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec4f6f1a4ba8fc3975dc3ef7e86d5b17b0488de7918fe672040f115f640b8569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjff8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.
11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nsgpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:41Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.915726 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47jtm\" (UniqueName: \"kubernetes.io/projected/3bd29a31-1a47-40da-afc5-6c4423067083-kube-api-access-47jtm\") pod \"network-metrics-daemon-7fgvm\" (UID: \"3bd29a31-1a47-40da-afc5-6c4423067083\") " pod="openshift-multus/network-metrics-daemon-7fgvm" Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.915813 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3bd29a31-1a47-40da-afc5-6c4423067083-metrics-certs\") pod \"network-metrics-daemon-7fgvm\" (UID: \"3bd29a31-1a47-40da-afc5-6c4423067083\") " pod="openshift-multus/network-metrics-daemon-7fgvm" Jan 31 09:01:41 crc kubenswrapper[4732]: E0131 09:01:41.915924 4732 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 31 09:01:41 crc kubenswrapper[4732]: E0131 09:01:41.915969 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3bd29a31-1a47-40da-afc5-6c4423067083-metrics-certs podName:3bd29a31-1a47-40da-afc5-6c4423067083 nodeName:}" failed. No retries permitted until 2026-01-31 09:01:42.415954913 +0000 UTC m=+40.721831117 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3bd29a31-1a47-40da-afc5-6c4423067083-metrics-certs") pod "network-metrics-daemon-7fgvm" (UID: "3bd29a31-1a47-40da-afc5-6c4423067083") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.922455 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bllbs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"80d87332-eaea-4007-a03e-a9a0f744563a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a371d8cbedbc18b4953520c3b135e102db92fc57883d31a8f78c510498ca41b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcdpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bllbs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:41Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.945095 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8521ee1e84f540faa7b093ecd0bb7761ae38487a6c7abf52028fe820ea9a4c1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fde05e46badc11b2174b88e2590402436bb669bcbd017ceb4d11d25066b421b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b26417bd7817c984a59723e2231e3a43c554022aef67eacf300adee0166c3db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c79f404e5bc3d6dc83681d68605b1f26565813ccec34c6d36ee9f21686ca4f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://075418297d972e23003b3ce2016f7709d138316c9d08837dc484dadf4d7f4a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://792e702a9fa66ea68366e8f88d976f7e280223815c4b70258566c32faaed57a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b67c56c55214b7fc95fc05a876435bf7a9d6e5de9dfc0a3acc0110e6779b818\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b67c56c55214b7fc95fc05a876435bf7a9d6e5de9dfc0a3acc0110e6779b818\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:01:39Z\\\",\\\"message\\\":\\\"all/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 09:01:39.556996 6044 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0131 09:01:39.557035 6044 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0131 09:01:39.557080 6044 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0131 09:01:39.557086 6044 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0131 09:01:39.557099 6044 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0131 09:01:39.557106 6044 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0131 09:01:39.557121 6044 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0131 09:01:39.557163 6044 factory.go:656] Stopping watch factory\\\\nI0131 09:01:39.557193 6044 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0131 09:01:39.557202 6044 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0131 09:01:39.557208 6044 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0131 09:01:39.557213 6044 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0131 09:01:39.557219 6044 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0131 09:01:39.557225 6044 handler.go:208] Removed *v1.Node event handler 2\\\\nI0131 09:01:39.557232 6044 handler.go:208] Removed *v1.Node 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cd519944f0ae0ee9920a6a54de3dd40c73a65d0c907f625c57d7648a6d0a715\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb08df78497d5ae0eb39af9a06378ea80914d9b4d7014dd85d930f54c733193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20
99482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfb08df78497d5ae0eb39af9a06378ea80914d9b4d7014dd85d930f54c733193\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8mtkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:41Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.945579 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47jtm\" (UniqueName: \"kubernetes.io/projected/3bd29a31-1a47-40da-afc5-6c4423067083-kube-api-access-47jtm\") pod \"network-metrics-daemon-7fgvm\" (UID: \"3bd29a31-1a47-40da-afc5-6c4423067083\") " pod="openshift-multus/network-metrics-daemon-7fgvm" Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.967369 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1dc686f-fa2d-4cd4-95d4-874df2dc3a1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b5cbd45e357205f8bfc648396c0f684d6788343f687504feb45618d9082e5cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b8ba07937855cb1fd0e1db567f90e08848fca583b7174153918e2d6356d722\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ce42a35726dd1fc6aea0eb2e5baa490de04566cd6d6298185260e896a8def7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c276b6fdf2c47bfa38c4cf312bf6b893388c4291d306738fd6c31738cbd7483b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c05af9bd69451273c25ddddf6b11cdd7b8f9b6e3e16d18f1c4a962321905696e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0131 09:01:18.226954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:01:18.227544 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1858903392/tls.crt::/tmp/serving-cert-1858903392/tls.key\\\\\\\"\\\\nI0131 09:01:23.404020 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:01:23.413560 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:01:23.413612 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:01:23.413655 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:01:23.413760 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:01:23.423398 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 09:01:23.423434 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:01:23.423441 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:01:23.423448 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:01:23.423452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0131 09:01:23.423448 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 09:01:23.423457 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:01:23.423476 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 09:01:23.426339 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83c2048db4772c1a5f60b21340f773ab5c651045d4dac9edc15e91a19d15dce9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:41Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.971395 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.971438 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.971451 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.971469 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.971481 4732 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:41Z","lastTransitionTime":"2026-01-31T09:01:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.984181 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:41Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:41 crc kubenswrapper[4732]: I0131 09:01:41.997844 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4mxsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e23192f-14db-41ef-af89-4a76e325d9c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6af50b44afec0e5f0757efda1ff5567845c4482609a82bafcbf74947c657c56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsnx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4mxsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:41Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.012749 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t9kqf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2e6e0f4-2302-447f-a5e0-7db3d7b73cb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://038ff30d8a53b94c204f5dc0c72824fcb7f08a423eec2ce05554f23607d4dc7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\
"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c71dc89758bff193a85ceeaa97b21ba6c93389ef9cf380bd0bac89bddddf014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c71dc89758bff193a85ceeaa97b21ba6c93389ef9cf380bd0bac89bddddf014\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4374e8014d4e67ab8df14785ed6d1bd78b33567550758297fcc04796ea5c688\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4374e8014d4e67ab8df14785ed6d1bd78b33567550758297fcc04796ea5c688\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85484d53f8180c3bc83391004e8a7c6b432af09f04f597567e8ffe0b9c2e3d58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85484d53f8180c3bc83391004e8a7c6b432af09f04f597567e8ffe0b9c2e3d58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:31Z\\\",\\\"reason\\\":\
\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://111b50c40cf72f105ae5c74d668673853a783050a62e89127fa666a6f8d431d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://111b50c40cf72f105ae5c74d668673853a783050a62e89127fa666a6f8d431d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2458247a5e615b929a15e280bb9d5a65c46a991ea2cd83502a8c42b1ca0e299b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2458247a5e615b929a15e280bb9d5a65c46a991ea2cd83502a8c42b1ca0e299b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a6ff3cd5564994b28608dad1cd0848c771db950208888ac8a2213fa5dc3a168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\
",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a6ff3cd5564994b28608dad1cd0848c771db950208888ac8a2213fa5dc3a168\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t9kqf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:42Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.023332 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7fgvm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bd29a31-1a47-40da-afc5-6c4423067083\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47jtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47jtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7fgvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:42Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.034216 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d790207-d357-4b47-87bf-5b505e061820\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eaa2b856e1dedd7398d0c513f742fa5925c4748bf44eb15022adbb4b62354c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2bw2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad1f906b34fddd34efd9d099479cd1fe7404da4ab11f3f571f9e3120167505a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2bw2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jnbt8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:42Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.045206 4732 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c063f211d6d06c3f8b94d316d68da346d077c6e85197754ec7c77ad736fd1217\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:42Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.069839 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58b600ed-9e96-4296-bc41-dda5d2205921\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f14a98662959fe8132486d2f648fb443a407e90160ee8c6eac7e2a3d6835b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://771af18934041722ccdbde47137c83e9d93c5e7a07b1e9b26e18354bcd2fda4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2bea5befd94ee4d3b6501273e8b8365102a35cc07b97740fc893f8d70a9d134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9fd5264b5a2dcec6f8b363208d5654f2d71996
60f04be408ee7ab3913d542e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://194a43d5e1409abb031f1b950dfc753454b45008067d784642945ea0dfb087e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://382f1fa043705a0098e27ea34beeee0b88276aa355a7de07bc29b070949b81e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382f1fa043705a0098e27ea34beeee0b88276aa355a7de07bc29b070949b81e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b16d2732d6d6381045b88446a322622a7b1271f08438cb60e57c05c52ef7faf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b16d2732d6d6381045b88446a322622a7b1271f08438cb60e57c05c52ef7faf8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://273f5652d37e2be568bc388a7f9e96c66d0ce9201678b46736986815bd25d6bf\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://273f5652d37e2be568bc388a7f9e96c66d0ce9201678b46736986815bd25d6bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:42Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.075439 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.075492 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.075505 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.075528 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.075545 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:42Z","lastTransitionTime":"2026-01-31T09:01:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.086243 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62650e77e5971d617a77ae7461d51276fc76e3699dc4f86ebbb9b63ab0d86b60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:42Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.099640 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7423f7d2a82ae276cff0f5a2730b0de7abbc5b28a8bd8983d23fe44c2e9ca3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b075971e7f28f40b90caae8b3681dd283860371a6b3f82c92c421a87a27ccb0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:42Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.110354 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gchqk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0313609d-3507-4db5-a190-9dbf59d73e6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-phqth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-phqth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gchqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:42Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.126893 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1add5e75-7e0e-4248-9431-256dc477beeb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7bd726ccae341dfff351d4170276b74d4b6b4f34806b7e1fb23ab517201928d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd0e75b86e540472dde92c0b2c29d3af05e052a4e6a01fd03cd16cce663bd5da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6678cf179a4e13991aa6a082456b3497a4665e0999912b8002fb84a0b6ca2ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55d09d44253b8c77894abc55d80843b79509bf7213c0bf8b4d69059d53e76e5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cl
uster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:42Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.141093 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:42Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.152328 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nsgpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"533741c8-f72a-4834-ad02-d33fc939e529\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec4f6f1a4ba8fc3975dc3ef7e86d5b17b0488de7918fe672040f115f640b8569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjff8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nsgpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-01-31T09:01:42Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.164758 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bllbs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"80d87332-eaea-4007-a03e-a9a0f744563a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a371d8cbedbc18b4953520c3b135e102db92fc57883d31a8f78c510498ca41b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcdpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bllbs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:42Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.177921 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.177976 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.177988 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.178009 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.178021 4732 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:42Z","lastTransitionTime":"2026-01-31T09:01:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.186644 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8521ee1e84f540faa7b093ecd0bb7761ae38487a6c7abf52028fe820ea9a4c1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fde05e46badc11b2174b88e2590402436bb669bcbd017ceb4d11d25066b421b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-scri
pt-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b26417bd7817c984a59723e2231e3a43c554022aef67eacf300adee0166c3db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c79f404e5bc3d6dc83681d68605b1f26565813ccec34c6d36ee9f21686ca4f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://075418297d972e23003b3ce2016f7709d138316c9d08837dc484dadf4d7f4a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\
"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://792e702a9fa66ea68366e8f88d976f7e280223815c4b70258566c32faaed57a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9455be46c494ee301fbd246b34b851413405354f423be0f5ae070c7c1bb8263a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b67c56c55214b7fc95fc05a876435bf7a9d6e5de9dfc0a3acc0110e6779b818\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:01:39Z\\\",\\\"message\\\":\\\"all/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 09:01:39.556996 6044 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0131 09:01:39.557035 6044 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0131 09:01:39.557080 6044 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0131 09:01:39.557086 6044 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0131 09:01:39.557099 6044 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0131 09:01:39.557106 6044 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0131 09:01:39.557121 6044 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0131 09:01:39.557163 6044 factory.go:656] Stopping watch factory\\\\nI0131 09:01:39.557193 6044 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0131 09:01:39.557202 6044 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0131 09:01:39.557208 6044 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0131 09:01:39.557213 6044 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0131 09:01:39.557219 6044 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0131 09:01:39.557225 6044 handler.go:208] Removed *v1.Node event handler 2\\\\nI0131 09:01:39.557232 6044 
handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cd519944f0ae0ee9920a6a54de3dd40c73a65d0c907f625c57d7648a6d0a715\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"in
itContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb08df78497d5ae0eb39af9a06378ea80914d9b4d7014dd85d930f54c733193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfb08df78497d5ae0eb39af9a06378ea80914d9b4d7014dd85d930f54c733193\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8mtkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:42Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.201972 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:42Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.219985 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:42Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.237926 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4mxsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e23192f-14db-41ef-af89-4a76e325d9c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6af50b44afec0e5f0757efda1ff5567845c4482609a82bafcbf74947c657c56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsnx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4mxsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:42Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.256601 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t9kqf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2e6e0f4-2302-447f-a5e0-7db3d7b73cb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://038ff30d8a53b94c204f5dc0c72824fcb7f08a423eec2ce05554f23607d4dc7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\
"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c71dc89758bff193a85ceeaa97b21ba6c93389ef9cf380bd0bac89bddddf014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c71dc89758bff193a85ceeaa97b21ba6c93389ef9cf380bd0bac89bddddf014\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4374e8014d4e67ab8df14785ed6d1bd78b33567550758297fcc04796ea5c688\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4374e8014d4e67ab8df14785ed6d1bd78b33567550758297fcc04796ea5c688\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85484d53f8180c3bc83391004e8a7c6b432af09f04f597567e8ffe0b9c2e3d58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85484d53f8180c3bc83391004e8a7c6b432af09f04f597567e8ffe0b9c2e3d58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:31Z\\\",\\\"reason\\\":\
\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://111b50c40cf72f105ae5c74d668673853a783050a62e89127fa666a6f8d431d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://111b50c40cf72f105ae5c74d668673853a783050a62e89127fa666a6f8d431d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2458247a5e615b929a15e280bb9d5a65c46a991ea2cd83502a8c42b1ca0e299b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2458247a5e615b929a15e280bb9d5a65c46a991ea2cd83502a8c42b1ca0e299b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a6ff3cd5564994b28608dad1cd0848c771db950208888ac8a2213fa5dc3a168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\
",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a6ff3cd5564994b28608dad1cd0848c771db950208888ac8a2213fa5dc3a168\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t9kqf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:42Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.268953 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7fgvm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bd29a31-1a47-40da-afc5-6c4423067083\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47jtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47jtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7fgvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:42Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.280862 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.280925 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.280938 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.280958 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.280971 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:42Z","lastTransitionTime":"2026-01-31T09:01:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.285082 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1dc686f-fa2d-4cd4-95d4-874df2dc3a1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b5cbd45e357205f8bfc648396c0f684d6788343f687504feb45618d9082e5cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b8ba07937855cb1fd0e1db567f90e08848fca583b7174153918e2d6356d722\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ce42a35726dd1fc6aea0eb2e5baa490de04566cd6d6298185260e896a8def7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c276b6fdf2c47bfa38c4cf312bf6b893388c4291d306738fd6c31738cbd7483b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c05af9bd69451273c25ddddf6b11cdd7b8f9b6e3e16d18f1c4a962321905696e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0131 09:01:18.226954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:01:18.227544 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1858903392/tls.crt::/tmp/serving-cert-1858903392/tls.key\\\\\\\"\\\\nI0131 09:01:23.404020 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:01:23.413560 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:01:23.413612 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:01:23.413655 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:01:23.413760 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:01:23.423398 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 09:01:23.423434 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:01:23.423441 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:01:23.423448 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:01:23.423452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0131 09:01:23.423448 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 09:01:23.423457 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:01:23.423476 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 09:01:23.426339 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83c2048db4772c1a5f60b21340f773ab5c651045d4dac9edc15e91a19d15dce9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:42Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.385107 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.385176 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.385198 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.385228 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.385254 4732 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:42Z","lastTransitionTime":"2026-01-31T09:01:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.420929 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3bd29a31-1a47-40da-afc5-6c4423067083-metrics-certs\") pod \"network-metrics-daemon-7fgvm\" (UID: \"3bd29a31-1a47-40da-afc5-6c4423067083\") " pod="openshift-multus/network-metrics-daemon-7fgvm" Jan 31 09:01:42 crc kubenswrapper[4732]: E0131 09:01:42.421137 4732 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 31 09:01:42 crc kubenswrapper[4732]: E0131 09:01:42.421245 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3bd29a31-1a47-40da-afc5-6c4423067083-metrics-certs podName:3bd29a31-1a47-40da-afc5-6c4423067083 nodeName:}" failed. No retries permitted until 2026-01-31 09:01:43.421220605 +0000 UTC m=+41.727096809 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3bd29a31-1a47-40da-afc5-6c4423067083-metrics-certs") pod "network-metrics-daemon-7fgvm" (UID: "3bd29a31-1a47-40da-afc5-6c4423067083") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.488134 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.488191 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.488209 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.488232 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.488248 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:42Z","lastTransitionTime":"2026-01-31T09:01:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.506798 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 15:52:12.41884144 +0000 UTC Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.558304 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d790207-d357-4b47-87bf-5b505e061820\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eaa2b856e1dedd7398d0c513f742fa5925c4748bf44eb15022adbb4b62354c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2bw2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad1f906b34fddd34efd9d099479cd1fe7404da4ab11f3f571f9e3120167505a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2bw2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\
" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jnbt8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:42Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.574307 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c063f211d6d06c3f8b94d316d68da346d077c6e85197754ec7c77ad736fd1217\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:42Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.590898 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.590940 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.590955 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.590979 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.590994 4732 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:42Z","lastTransitionTime":"2026-01-31T09:01:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.608274 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58b600ed-9e96-4296-bc41-dda5d2205921\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f14a98662959fe8132486d2f648fb443a407e90160ee8c6eac7e2a3d6835b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://771af18934041722ccdbde47137c83e9d93c5e7a07b1e9b26e18354bcd2fda4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2bea5befd94ee4d3b6501273e8b8365102a35cc07b97740fc893f8d70a9d134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9fd5264b5a2dcec6f8b363208d5654f2d7199660f04be408ee7ab3913d542e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://194a43d5e1409abb031f1b950dfc753454b45008067d784642945ea0dfb087e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://382f1fa043705a0098e27ea34beeee0b88276aa355a7de07bc29b070949b81e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382f1fa043705a0098e27ea34beeee0b88276aa355a7de07bc29b070949b81e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b16d2732d6d6381045b88446a322622a7b1271f08438cb60e57c05c52ef7faf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd
6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b16d2732d6d6381045b88446a322622a7b1271f08438cb60e57c05c52ef7faf8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://273f5652d37e2be568bc388a7f9e96c66d0ce9201678b46736986815bd25d6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://273f5652d37e2be568bc388a7f9e96c66d0ce9201678b46736986815bd25d6bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:42Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.622608 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62650e77e5971d617a77ae7461d51276fc76e3699dc4f86ebbb9b63ab0d86b60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:42Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.636254 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7423f7d2a82ae276cff0f5a2730b0de7abbc5b28a8bd8983d23fe44c2e9ca3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b075971e7f28f40b90caae8b3681dd283860371a6b3f82c92c421a87a27ccb0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:42Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.651687 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gchqk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0313609d-3507-4db5-a190-9dbf59d73e6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-phqth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-phqth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gchqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:42Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.664175 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1add5e75-7e0e-4248-9431-256dc477beeb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7bd726ccae341dfff351d4170276b74d4b6b4f34806b7e1fb23ab517201928d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd0e75b86e540472dde92c0b2c29d3af05e052a4e6a01fd03cd16cce663bd5da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6678cf179a4e13991aa6a082456b3497a4665e0999912b8002fb84a0b6ca2ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55d09d44253b8c77894abc55d80843b79509bf7213c0bf8b4d69059d53e76e5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cl
uster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:42Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.677618 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:42Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.687648 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nsgpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"533741c8-f72a-4834-ad02-d33fc939e529\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec4f6f1a4ba8fc3975dc3ef7e86d5b17b0488de7918fe672040f115f640b8569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjff8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nsgpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-01-31T09:01:42Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.697437 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.697521 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.697638 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.697682 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.697697 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:42Z","lastTransitionTime":"2026-01-31T09:01:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.701325 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bllbs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"80d87332-eaea-4007-a03e-a9a0f744563a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a371d8cbedbc18b4953520c3b135e102db92fc57883d31a8f78c510498ca41b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcdpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"start
Time\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bllbs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:42Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.719828 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8521ee1e84f540faa7b093ecd0bb7761ae38487a6c7abf52028fe820ea9a4c1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fde05e46badc11b2174b88e2590402436bb669bcbd017ceb4d11d25066b421b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-scri
pt-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b26417bd7817c984a59723e2231e3a43c554022aef67eacf300adee0166c3db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c79f404e5bc3d6dc83681d68605b1f26565813ccec34c6d36ee9f21686ca4f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://075418297d972e23003b3ce2016f7709d138316c9d08837dc484dadf4d7f4a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\
"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://792e702a9fa66ea68366e8f88d976f7e280223815c4b70258566c32faaed57a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9455be46c494ee301fbd246b34b851413405354f423be0f5ae070c7c1bb8263a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b67c56c55214b7fc95fc05a876435bf7a9d6e5de9dfc0a3acc0110e6779b818\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:01:39Z\\\",\\\"message\\\":\\\"all/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 09:01:39.556996 6044 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0131 09:01:39.557035 6044 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0131 09:01:39.557080 6044 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0131 09:01:39.557086 6044 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0131 09:01:39.557099 6044 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0131 09:01:39.557106 6044 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0131 09:01:39.557121 6044 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0131 09:01:39.557163 6044 factory.go:656] Stopping watch factory\\\\nI0131 09:01:39.557193 6044 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0131 09:01:39.557202 6044 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0131 09:01:39.557208 6044 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0131 09:01:39.557213 6044 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0131 09:01:39.557219 6044 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0131 09:01:39.557225 6044 handler.go:208] Removed *v1.Node event handler 2\\\\nI0131 09:01:39.557232 6044 
handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cd519944f0ae0ee9920a6a54de3dd40c73a65d0c907f625c57d7648a6d0a715\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"in
itContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb08df78497d5ae0eb39af9a06378ea80914d9b4d7014dd85d930f54c733193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfb08df78497d5ae0eb39af9a06378ea80914d9b4d7014dd85d930f54c733193\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8mtkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:42Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.732246 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:42Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.746400 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:42Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.762937 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4mxsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e23192f-14db-41ef-af89-4a76e325d9c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6af50b44afec0e5f0757efda1ff5567845c4482609a82bafcbf74947c657c56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsnx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4mxsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:42Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.776073 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t9kqf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2e6e0f4-2302-447f-a5e0-7db3d7b73cb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://038ff30d8a53b94c204f5dc0c72824fcb7f08a423eec2ce05554f23607d4dc7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\
"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c71dc89758bff193a85ceeaa97b21ba6c93389ef9cf380bd0bac89bddddf014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c71dc89758bff193a85ceeaa97b21ba6c93389ef9cf380bd0bac89bddddf014\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4374e8014d4e67ab8df14785ed6d1bd78b33567550758297fcc04796ea5c688\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4374e8014d4e67ab8df14785ed6d1bd78b33567550758297fcc04796ea5c688\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85484d53f8180c3bc83391004e8a7c6b432af09f04f597567e8ffe0b9c2e3d58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85484d53f8180c3bc83391004e8a7c6b432af09f04f597567e8ffe0b9c2e3d58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:31Z\\\",\\\"reason\\\":\
\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://111b50c40cf72f105ae5c74d668673853a783050a62e89127fa666a6f8d431d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://111b50c40cf72f105ae5c74d668673853a783050a62e89127fa666a6f8d431d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2458247a5e615b929a15e280bb9d5a65c46a991ea2cd83502a8c42b1ca0e299b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2458247a5e615b929a15e280bb9d5a65c46a991ea2cd83502a8c42b1ca0e299b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a6ff3cd5564994b28608dad1cd0848c771db950208888ac8a2213fa5dc3a168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\
",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a6ff3cd5564994b28608dad1cd0848c771db950208888ac8a2213fa5dc3a168\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t9kqf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:42Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.785078 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7fgvm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bd29a31-1a47-40da-afc5-6c4423067083\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47jtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47jtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7fgvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:42Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.797251 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1dc686f-fa2d-4cd4-95d4-874df2dc3a1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b5cbd45e357205f8bfc648396c0f684d6788343f687504feb45618d9082e5cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b8ba07937855cb1fd0e1db567f90e08848fca583b7174153918e2d6356d722\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ce42a35726dd1fc6aea0eb2e5baa490de04566cd6d6298185260e896a8def7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c276b6fdf2c47bfa38c4cf312bf6b893388c4291d306738fd6c31738cbd7483b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c05af9bd69451273c25ddddf6b11cdd7b8f9b6e3e16d18f1c4a962321905696e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0131 09:01:18.226954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:01:18.227544 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1858903392/tls.crt::/tmp/serving-cert-1858903392/tls.key\\\\\\\"\\\\nI0131 09:01:23.404020 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:01:23.413560 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:01:23.413612 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:01:23.413655 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:01:23.413760 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:01:23.423398 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 09:01:23.423434 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:01:23.423441 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:01:23.423448 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:01:23.423452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0131 09:01:23.423448 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 09:01:23.423457 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:01:23.423476 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 09:01:23.426339 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83c2048db4772c1a5f60b21340f773ab5c651045d4dac9edc15e91a19d15dce9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:42Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.799980 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.800024 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.800035 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.800054 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.800065 4732 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:42Z","lastTransitionTime":"2026-01-31T09:01:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.844402 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8mtkt_82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8/ovnkube-controller/1.log" Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.845109 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8mtkt_82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8/ovnkube-controller/0.log" Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.847488 4732 generic.go:334] "Generic (PLEG): container finished" podID="82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8" containerID="9455be46c494ee301fbd246b34b851413405354f423be0f5ae070c7c1bb8263a" exitCode=1 Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.847539 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" event={"ID":"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8","Type":"ContainerDied","Data":"9455be46c494ee301fbd246b34b851413405354f423be0f5ae070c7c1bb8263a"} Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.847819 4732 scope.go:117] "RemoveContainer" containerID="4b67c56c55214b7fc95fc05a876435bf7a9d6e5de9dfc0a3acc0110e6779b818" Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.848326 4732 scope.go:117] "RemoveContainer" containerID="9455be46c494ee301fbd246b34b851413405354f423be0f5ae070c7c1bb8263a" Jan 31 09:01:42 crc kubenswrapper[4732]: E0131 09:01:42.848575 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-8mtkt_openshift-ovn-kubernetes(82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" podUID="82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8" Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.870155 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:42Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.888297 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:42Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.901284 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nsgpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"533741c8-f72a-4834-ad02-d33fc939e529\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec4f6f1a4ba8fc3975dc3ef7e86d5b17b0488de7918fe672040f115f640b8569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjff8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nsgpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-01-31T09:01:42Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.903424 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.903480 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.903498 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.903524 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.903540 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:42Z","lastTransitionTime":"2026-01-31T09:01:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.914086 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bllbs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"80d87332-eaea-4007-a03e-a9a0f744563a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a371d8cbedbc18b4953520c3b135e102db92fc57883d31a8f78c510498ca41b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcdpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"start
Time\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bllbs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:42Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.943062 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8521ee1e84f540faa7b093ecd0bb7761ae38487a6c7abf52028fe820ea9a4c1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fde05e46badc11b2174b88e2590402436bb669bcbd017ceb4d11d25066b421b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-scri
pt-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b26417bd7817c984a59723e2231e3a43c554022aef67eacf300adee0166c3db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c79f404e5bc3d6dc83681d68605b1f26565813ccec34c6d36ee9f21686ca4f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://075418297d972e23003b3ce2016f7709d138316c9d08837dc484dadf4d7f4a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\
"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://792e702a9fa66ea68366e8f88d976f7e280223815c4b70258566c32faaed57a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9455be46c494ee301fbd246b34b851413405354f423be0f5ae070c7c1bb8263a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b67c56c55214b7fc95fc05a876435bf7a9d6e5de9dfc0a3acc0110e6779b818\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:01:39Z\\\",\\\"message\\\":\\\"all/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 09:01:39.556996 6044 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0131 09:01:39.557035 6044 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0131 09:01:39.557080 6044 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0131 09:01:39.557086 6044 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0131 09:01:39.557099 6044 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0131 09:01:39.557106 6044 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0131 09:01:39.557121 6044 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0131 09:01:39.557163 6044 factory.go:656] Stopping watch factory\\\\nI0131 09:01:39.557193 6044 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0131 09:01:39.557202 6044 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0131 09:01:39.557208 6044 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0131 09:01:39.557213 6044 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0131 09:01:39.557219 6044 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0131 09:01:39.557225 6044 handler.go:208] Removed *v1.Node event handler 2\\\\nI0131 09:01:39.557232 6044 
handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cd519944f0ae0ee9920a6a54de3dd40c73a65d0c907f625c57d7648a6d0a715\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"in
itContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb08df78497d5ae0eb39af9a06378ea80914d9b4d7014dd85d930f54c733193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfb08df78497d5ae0eb39af9a06378ea80914d9b4d7014dd85d930f54c733193\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8mtkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:42Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.958571 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1dc686f-fa2d-4cd4-95d4-874df2dc3a1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b5cbd45e357205f8bfc648396c0f684d6788343f687504feb45618d9082e5cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b8ba07937855cb1fd0e1db567f90e08848fca583b7174153918e2d6356d722\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ce42a35726dd1fc6aea0eb2e5baa490de04566cd6d6298185260e896a8def7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c276b6fdf2c47bfa38c4cf312bf6b893388c4291d306738fd6c31738cbd7483b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c05af9bd69451273c25ddddf6b11cdd7b8f9b6e3e16d18f1c4a962321905696e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0131 09:01:18.226954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:01:18.227544 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1858903392/tls.crt::/tmp/serving-cert-1858903392/tls.key\\\\\\\"\\\\nI0131 09:01:23.404020 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:01:23.413560 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:01:23.413612 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:01:23.413655 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:01:23.413760 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:01:23.423398 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 09:01:23.423434 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:01:23.423441 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:01:23.423448 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:01:23.423452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0131 09:01:23.423448 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 09:01:23.423457 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:01:23.423476 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 09:01:23.426339 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83c2048db4772c1a5f60b21340f773ab5c651045d4dac9edc15e91a19d15dce9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:42Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.971756 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:42Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:42 crc kubenswrapper[4732]: I0131 09:01:42.986140 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4mxsr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e23192f-14db-41ef-af89-4a76e325d9c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6af50b44afec0e5f0757efda1ff5567845c4482609a82bafcbf74947c657c56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsnx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4mxsr\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:42Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:43 crc kubenswrapper[4732]: I0131 09:01:43.003388 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t9kqf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2e6e0f4-2302-447f-a5e0-7db3d7b73cb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://038ff30d8a53b94c204f5dc0c72824fcb7f08a423eec2ce05554f23607d4dc7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c71dc89758bff193a85ceeaa97b21ba6c93389ef9cf380bd0bac89bddddf014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c71dc89758bff193a85ceeaa97b21ba6c93389ef9cf380bd0bac89bddddf014\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4374e8014d4e67ab8df14785ed6d1bd78b33567550758297fcc04796ea5c688\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4374e8014d4e67ab8df14785ed6d1bd78b33567550758297fcc04796ea5c688\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85484d53f8180c3bc83391004e8a7c6b432af09f04f597567e8ffe0b9c2e3d58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85484d53f8180c3bc83391004e8a7c6b432af09f04f597567e8ffe0b9c2e3d58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://111b50c40cf72f105ae5c74d668673853a783050a62e89127fa666a6f8d431d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://111b50c40cf72f105ae5c74d668673853a783050a62e89127fa666a6f8d431d0\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2026-01-31T09:01:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2458247a5e615b929a15e280bb9d5a65c46a991ea2cd83502a8c42b1ca0e299b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2458247a5e615b929a15e280bb9d5a65c46a991ea2cd83502a8c42b1ca0e299b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a6ff3cd5564994b28608dad1cd0848c771db950208888ac8a2213fa5dc3a168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a6ff3cd5564994b28608dad1cd0848c771db950208888ac8a2213fa5dc3a168\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t9kqf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-31T09:01:43Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:43 crc kubenswrapper[4732]: I0131 09:01:43.005978 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:43 crc kubenswrapper[4732]: I0131 09:01:43.006022 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:43 crc kubenswrapper[4732]: I0131 09:01:43.006033 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:43 crc kubenswrapper[4732]: I0131 09:01:43.006050 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:43 crc kubenswrapper[4732]: I0131 09:01:43.006062 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:43Z","lastTransitionTime":"2026-01-31T09:01:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:43 crc kubenswrapper[4732]: I0131 09:01:43.015233 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7fgvm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bd29a31-1a47-40da-afc5-6c4423067083\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47jtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47jtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7fgvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:43Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:43 crc kubenswrapper[4732]: I0131 09:01:43.028586 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c063f211d6d06c3f8b94d316d68da346d077c6e85197754ec7c77ad736fd1217\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:43Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:43 crc kubenswrapper[4732]: I0131 09:01:43.041184 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d790207-d357-4b47-87bf-5b505e061820\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eaa2b856e1dedd7398d0c513f742fa5925c4748bf44eb15022adbb4b62354c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2bw2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad1f906b34fddd34efd9d099479cd1fe7404da4ab11f3f571f9e3120167505a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2bw2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jnbt8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:43Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:43 crc kubenswrapper[4732]: I0131 09:01:43.053180 4732 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7423f7d2a82ae276cff0f5a2730b0de7abbc5b28a8bd8983d23fe44c2e9ca3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b075971e7f28f40b90caae8b3681dd283860371a6b3f82c92c421a87a27ccb0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:43Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:43 crc kubenswrapper[4732]: I0131 09:01:43.063743 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gchqk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0313609d-3507-4db5-a190-9dbf59d73e6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b303be38f64266f4e019d30e8b988945133fc76a47e305429ed048d68cdeac76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-phqth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8a2d9e1c8db61a2418e981340a7cb999983d4a10e79977507abdf3bdb471939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-phqth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gchqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:43Z is after 2025-08-24T17:21:41Z" Jan 31 
09:01:43 crc kubenswrapper[4732]: I0131 09:01:43.075130 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1add5e75-7e0e-4248-9431-256dc477beeb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7bd726ccae341dfff351d4170276b74d4b6b4f34806b7e1fb23ab517201928d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd0e75b86e540472dde92c0b2c29d3af05e052a4e6a01fd03cd16cce663bd5da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6678cf179a4e13991aa6a082456b3497a4665e0999912b8002fb84a0b6ca2ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\
",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55d09d44253b8c77894abc55d80843b79509bf7213c0bf8b4d69059d53e76e5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:43Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:43 crc kubenswrapper[4732]: I0131 09:01:43.094748 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58b600ed-9e96-4296-bc41-dda5d2205921\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f14a98662959fe8132486d2f648fb443a407e90160ee8c6eac7e2a3d6835b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/
\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://771af18934041722ccdbde47137c83e9d93c5e7a07b1e9b26e18354bcd2fda4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2bea5befd94ee4d3b6501273e8b8365102a35cc07b97740fc893f8d70a9d134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9fd5264b5a2dcec6f8b363208d5654f2d7199660f04be408ee7ab3913d542e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://194a43d5e1409abb031f1b950dfc753454b45008067d784642945ea0dfb087e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"
cri-o://382f1fa043705a0098e27ea34beeee0b88276aa355a7de07bc29b070949b81e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382f1fa043705a0098e27ea34beeee0b88276aa355a7de07bc29b070949b81e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b16d2732d6d6381045b88446a322622a7b1271f08438cb60e57c05c52ef7faf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b16d2732d6d6381045b88446a322622a7b1271f08438cb60e57c05c52ef7faf8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://273f5652d37e2be568bc388a7f9e96c66d0ce9201678b46736986815bd25d6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://273f5652d37e2be568bc388a7f9e96c66d0ce9201678b46736986815bd25d6bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:43Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:43 crc kubenswrapper[4732]: I0131 09:01:43.106682 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62650e77e5971d617a77ae7461d51276fc76e3699dc4f86ebbb9b63ab0d86b60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:43Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:43 crc kubenswrapper[4732]: I0131 09:01:43.108197 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:43 crc kubenswrapper[4732]: I0131 09:01:43.108224 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:43 crc kubenswrapper[4732]: I0131 09:01:43.108238 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:43 crc kubenswrapper[4732]: I0131 09:01:43.108257 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:43 crc kubenswrapper[4732]: I0131 09:01:43.108269 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:43Z","lastTransitionTime":"2026-01-31T09:01:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:43 crc kubenswrapper[4732]: I0131 09:01:43.127087 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8521ee1e84f540faa7b093ecd0bb7761ae38487a6c7abf52028fe820ea9a4c1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fde05e46badc11b2174b88e2590402436bb669bcbd017ceb4d11d25066b421b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://6b26417bd7817c984a59723e2231e3a43c554022aef67eacf300adee0166c3db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c79f404e5bc3d6dc83681d68605b1f26565813ccec34c6d36ee9f21686ca4f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://075418297d972e23003b3ce2016f7709d138316c9d08837dc484dadf4d7f4a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://792e702a9fa66ea68366e8f88d976f7e280223815c4b70258566c32faaed57a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9455be46c494ee301fbd246b34b851413405354f423be0f5ae070c7c1bb8263a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b67c56c55214b7fc95fc05a876435bf7a9d6e5de9dfc0a3acc0110e6779b818\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:01:39Z\\\",\\\"message\\\":\\\"all/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 09:01:39.556996 6044 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0131 09:01:39.557035 6044 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0131 09:01:39.557080 6044 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0131 09:01:39.557086 6044 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0131 09:01:39.557099 6044 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0131 09:01:39.557106 6044 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0131 09:01:39.557121 6044 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0131 09:01:39.557163 6044 factory.go:656] Stopping watch factory\\\\nI0131 09:01:39.557193 6044 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0131 09:01:39.557202 6044 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0131 09:01:39.557208 6044 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0131 09:01:39.557213 6044 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0131 09:01:39.557219 6044 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0131 09:01:39.557225 6044 handler.go:208] Removed *v1.Node event handler 2\\\\nI0131 09:01:39.557232 6044 handler.go:208] Removed *v1.Node 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9455be46c494ee301fbd246b34b851413405354f423be0f5ae070c7c1bb8263a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:01:42Z\\\",\\\"message\\\":\\\"1 09:01:42.390171 6216 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 09:01:42.390285 6216 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 09:01:42.390357 6216 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 09:01:42.390398 6216 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 09:01:42.390856 6216 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0131 09:01:42.390942 6216 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0131 09:01:42.391006 6216 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0131 09:01:42.391068 6216 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0131 09:01:42.391132 6216 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0131 09:01:42.391188 6216 factory.go:656] Stopping watch factory\\\\nI0131 09:01:42.391259 6216 ovnkube.go:599] Stopped ovnkube\\\\nI0131 09:01:42.390987 6216 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0131 09:01:42.391113 6216 handler.go:208] Removed *v1.Node event handler 7\\\\nI0131 09:01:42.391233 6216 handler.go:208] Removed *v1.Pod event handler 
6\\\\nI0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cd519944f0ae0ee9920a6a54de3dd40c73a65d0c907f625c57d7648a6d0a715\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb08df78497d5ae0eb39af9a06378ea80914d9b4d7014dd85d930f54c733193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef
0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfb08df78497d5ae0eb39af9a06378ea80914d9b4d7014dd85d930f54c733193\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8mtkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:43Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:43 crc kubenswrapper[4732]: I0131 09:01:43.141627 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:43Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:43 crc kubenswrapper[4732]: I0131 09:01:43.155488 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:43Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:43 crc kubenswrapper[4732]: I0131 09:01:43.170319 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nsgpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"533741c8-f72a-4834-ad02-d33fc939e529\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec4f6f1a4ba8fc3975dc3ef7e86d5b17b0488de7918fe672040f115f640b8569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjff8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nsgpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-01-31T09:01:43Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:43 crc kubenswrapper[4732]: I0131 09:01:43.207286 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bllbs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"80d87332-eaea-4007-a03e-a9a0f744563a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a371d8cbedbc18b4953520c3b135e102db92fc57883d31a8f78c510498ca41b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcdpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bllbs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:43Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:43 crc kubenswrapper[4732]: I0131 09:01:43.211097 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:43 crc kubenswrapper[4732]: I0131 09:01:43.211154 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:43 crc kubenswrapper[4732]: I0131 09:01:43.211168 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:43 crc kubenswrapper[4732]: I0131 09:01:43.211186 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:43 crc kubenswrapper[4732]: I0131 09:01:43.211198 4732 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:43Z","lastTransitionTime":"2026-01-31T09:01:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:43 crc kubenswrapper[4732]: I0131 09:01:43.251876 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7fgvm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bd29a31-1a47-40da-afc5-6c4423067083\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47jtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47jtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7fgvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:43Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:43 crc kubenswrapper[4732]: I0131 09:01:43.293680 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1dc686f-fa2d-4cd4-95d4-874df2dc3a1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b5cbd45e357205f8bfc648396c0f684d6788343f687504feb45618d9082e5cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b8ba07937855cb1fd0e1db567f90e08848fca583b7174153918e2d6356d722\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ce42a35726dd1fc6aea0eb2e5baa490de04566cd6d6298185260e896a8def7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c276b6fdf2c47bfa38c4cf312bf6b893388c4291d306738fd6c31738cbd7483b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c05af9bd69451273c25ddddf6b11cdd7b8f9b6e3e16d18f1c4a962321905696e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0131 09:01:18.226954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:01:18.227544 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1858903392/tls.crt::/tmp/serving-cert-1858903392/tls.key\\\\\\\"\\\\nI0131 09:01:23.404020 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:01:23.413560 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:01:23.413612 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:01:23.413655 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:01:23.413760 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:01:23.423398 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 09:01:23.423434 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:01:23.423441 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:01:23.423448 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:01:23.423452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0131 09:01:23.423448 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 09:01:23.423457 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:01:23.423476 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 09:01:23.426339 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83c2048db4772c1a5f60b21340f773ab5c651045d4dac9edc15e91a19d15dce9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:43Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:43 crc kubenswrapper[4732]: I0131 09:01:43.313446 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:43 crc kubenswrapper[4732]: I0131 09:01:43.313494 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:43 crc kubenswrapper[4732]: I0131 09:01:43.313504 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:43 crc kubenswrapper[4732]: I0131 09:01:43.313523 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:43 crc kubenswrapper[4732]: I0131 09:01:43.313536 4732 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:43Z","lastTransitionTime":"2026-01-31T09:01:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:43 crc kubenswrapper[4732]: I0131 09:01:43.329086 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:43Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:43 crc kubenswrapper[4732]: I0131 09:01:43.370477 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4mxsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e23192f-14db-41ef-af89-4a76e325d9c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6af50b44afec0e5f0757efda1ff5567845c4482609a82bafcbf74947c657c56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsnx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4mxsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:43Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:43 crc kubenswrapper[4732]: I0131 09:01:43.413853 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t9kqf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2e6e0f4-2302-447f-a5e0-7db3d7b73cb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://038ff30d8a53b94c204f5dc0c72824fcb7f08a423eec2ce05554f23607d4dc7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\
"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c71dc89758bff193a85ceeaa97b21ba6c93389ef9cf380bd0bac89bddddf014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c71dc89758bff193a85ceeaa97b21ba6c93389ef9cf380bd0bac89bddddf014\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4374e8014d4e67ab8df14785ed6d1bd78b33567550758297fcc04796ea5c688\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4374e8014d4e67ab8df14785ed6d1bd78b33567550758297fcc04796ea5c688\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85484d53f8180c3bc83391004e8a7c6b432af09f04f597567e8ffe0b9c2e3d58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85484d53f8180c3bc83391004e8a7c6b432af09f04f597567e8ffe0b9c2e3d58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:31Z\\\",\\\"reason\\\":\
\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://111b50c40cf72f105ae5c74d668673853a783050a62e89127fa666a6f8d431d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://111b50c40cf72f105ae5c74d668673853a783050a62e89127fa666a6f8d431d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2458247a5e615b929a15e280bb9d5a65c46a991ea2cd83502a8c42b1ca0e299b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2458247a5e615b929a15e280bb9d5a65c46a991ea2cd83502a8c42b1ca0e299b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a6ff3cd5564994b28608dad1cd0848c771db950208888ac8a2213fa5dc3a168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\
",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a6ff3cd5564994b28608dad1cd0848c771db950208888ac8a2213fa5dc3a168\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t9kqf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:43Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:43 crc kubenswrapper[4732]: I0131 09:01:43.416000 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:43 crc kubenswrapper[4732]: I0131 09:01:43.416036 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:43 crc kubenswrapper[4732]: I0131 09:01:43.416047 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:43 crc kubenswrapper[4732]: I0131 09:01:43.416065 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:43 crc kubenswrapper[4732]: I0131 09:01:43.416078 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:43Z","lastTransitionTime":"2026-01-31T09:01:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:43 crc kubenswrapper[4732]: I0131 09:01:43.431129 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3bd29a31-1a47-40da-afc5-6c4423067083-metrics-certs\") pod \"network-metrics-daemon-7fgvm\" (UID: \"3bd29a31-1a47-40da-afc5-6c4423067083\") " pod="openshift-multus/network-metrics-daemon-7fgvm" Jan 31 09:01:43 crc kubenswrapper[4732]: E0131 09:01:43.431334 4732 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 31 09:01:43 crc kubenswrapper[4732]: E0131 09:01:43.431409 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3bd29a31-1a47-40da-afc5-6c4423067083-metrics-certs podName:3bd29a31-1a47-40da-afc5-6c4423067083 nodeName:}" failed. No retries permitted until 2026-01-31 09:01:45.431386396 +0000 UTC m=+43.737262620 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3bd29a31-1a47-40da-afc5-6c4423067083-metrics-certs") pod "network-metrics-daemon-7fgvm" (UID: "3bd29a31-1a47-40da-afc5-6c4423067083") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 31 09:01:43 crc kubenswrapper[4732]: I0131 09:01:43.454383 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c063f211d6d06c3f8b94d316d68da346d077c6e85197754ec7c77ad736fd1217\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:43Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:43 crc kubenswrapper[4732]: I0131 09:01:43.488498 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d790207-d357-4b47-87bf-5b505e061820\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eaa2b856e1dedd7398d0c513f742fa5925c4748bf44eb15022adbb4b62354c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2bw2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad1f906b34fddd34efd9d099479cd1fe7404da4ab11f3f571f9e3120167505a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2bw2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jnbt8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:43Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:43 crc kubenswrapper[4732]: I0131 09:01:43.507456 4732 certificate_manager.go:356] 
kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 00:45:33.891858799 +0000 UTC Jan 31 09:01:43 crc kubenswrapper[4732]: I0131 09:01:43.519458 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:43 crc kubenswrapper[4732]: I0131 09:01:43.519539 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:43 crc kubenswrapper[4732]: I0131 09:01:43.519548 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:43 crc kubenswrapper[4732]: I0131 09:01:43.519565 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:43 crc kubenswrapper[4732]: I0131 09:01:43.519575 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:43Z","lastTransitionTime":"2026-01-31T09:01:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:43 crc kubenswrapper[4732]: I0131 09:01:43.533837 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62650e77e5971d617a77ae7461d51276fc76e3699dc4f86ebbb9b63ab0d86b60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-01-31T09:01:43Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:43 crc kubenswrapper[4732]: I0131 09:01:43.541868 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:01:43 crc kubenswrapper[4732]: I0131 09:01:43.541936 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7fgvm" Jan 31 09:01:43 crc kubenswrapper[4732]: I0131 09:01:43.542017 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:01:43 crc kubenswrapper[4732]: E0131 09:01:43.542248 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:01:43 crc kubenswrapper[4732]: I0131 09:01:43.542306 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:01:43 crc kubenswrapper[4732]: E0131 09:01:43.542554 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7fgvm" podUID="3bd29a31-1a47-40da-afc5-6c4423067083" Jan 31 09:01:43 crc kubenswrapper[4732]: E0131 09:01:43.542697 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:01:43 crc kubenswrapper[4732]: E0131 09:01:43.542822 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:01:43 crc kubenswrapper[4732]: I0131 09:01:43.570981 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7423f7d2a82ae276cff0f5a2730b0de7abbc5b28a8bd8983d23fe44c2e9ca3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b075971e7f28f40b90caae8b3681dd283860371a6b3f82c92c421a87a27ccb0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:43Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:43 crc kubenswrapper[4732]: I0131 09:01:43.606800 4732 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gchqk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0313609d-3507-4db5-a190-9dbf59d73e6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b303be38f64266f4e019d30e8b988945133fc76a47e305429ed048d68cdeac76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-phqth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8a2d9e1c8db61a2418e981340a7cb999983d4a10e79977507abdf3bdb471939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-phqth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gchqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-31T09:01:43Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:43 crc kubenswrapper[4732]: I0131 09:01:43.622104 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:43 crc kubenswrapper[4732]: I0131 09:01:43.622159 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:43 crc kubenswrapper[4732]: I0131 09:01:43.622171 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:43 crc kubenswrapper[4732]: I0131 09:01:43.622190 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:43 crc kubenswrapper[4732]: I0131 09:01:43.622203 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:43Z","lastTransitionTime":"2026-01-31T09:01:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:43 crc kubenswrapper[4732]: I0131 09:01:43.649703 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1add5e75-7e0e-4248-9431-256dc477beeb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7bd726ccae341dfff351d4170276b74d4b6b4f34806b7e1fb23ab517201928d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd0e75b86e540472dde92c0b2c29d3af05e052a4e6a01fd03cd16cce663bd5da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6678cf179a4e13991aa6a082456b3497a4665e0999912b8002fb84a0b6ca2ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55d09d44253b8c77894abc55d80843b79509bf7213c0bf8b4d69059d53e76e5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:43Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:43 crc kubenswrapper[4732]: I0131 09:01:43.703507 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58b600ed-9e96-4296-bc41-dda5d2205921\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f14a98662959fe8132486d2f648fb443a407e90160ee8c6eac7e2a3d6835b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://771af18934041722ccdbde47137c83e9d93c5e7a07b1e9b26e18354bcd2fda4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2bea5befd94ee4d3b6501273e8b8365102a35cc07b97740fc893f8d70a9d134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9fd5264b5a2dcec6f8b363208d5654f2d71996
60f04be408ee7ab3913d542e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://194a43d5e1409abb031f1b950dfc753454b45008067d784642945ea0dfb087e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://382f1fa043705a0098e27ea34beeee0b88276aa355a7de07bc29b070949b81e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382f1fa043705a0098e27ea34beeee0b88276aa355a7de07bc29b070949b81e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b16d2732d6d6381045b88446a322622a7b1271f08438cb60e57c05c52ef7faf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b16d2732d6d6381045b88446a322622a7b1271f08438cb60e57c05c52ef7faf8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://273f5652d37e2be568bc388a7f9e96c66d0ce9201678b46736986815bd25d6bf\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://273f5652d37e2be568bc388a7f9e96c66d0ce9201678b46736986815bd25d6bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:43Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:43 crc kubenswrapper[4732]: I0131 09:01:43.724364 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:43 crc kubenswrapper[4732]: I0131 09:01:43.724400 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:43 crc kubenswrapper[4732]: I0131 09:01:43.724412 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:43 crc kubenswrapper[4732]: I0131 09:01:43.724431 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:43 crc kubenswrapper[4732]: I0131 09:01:43.724442 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:43Z","lastTransitionTime":"2026-01-31T09:01:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:43 crc kubenswrapper[4732]: I0131 09:01:43.826934 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:43 crc kubenswrapper[4732]: I0131 09:01:43.827020 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:43 crc kubenswrapper[4732]: I0131 09:01:43.827042 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:43 crc kubenswrapper[4732]: I0131 09:01:43.827071 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:43 crc kubenswrapper[4732]: I0131 09:01:43.827091 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:43Z","lastTransitionTime":"2026-01-31T09:01:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:43 crc kubenswrapper[4732]: I0131 09:01:43.853731 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8mtkt_82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8/ovnkube-controller/1.log" Jan 31 09:01:43 crc kubenswrapper[4732]: I0131 09:01:43.930378 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:43 crc kubenswrapper[4732]: I0131 09:01:43.930695 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:43 crc kubenswrapper[4732]: I0131 09:01:43.930729 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:43 crc kubenswrapper[4732]: I0131 09:01:43.930761 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:43 crc kubenswrapper[4732]: I0131 09:01:43.930786 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:43Z","lastTransitionTime":"2026-01-31T09:01:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:44 crc kubenswrapper[4732]: I0131 09:01:44.033715 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:44 crc kubenswrapper[4732]: I0131 09:01:44.033784 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:44 crc kubenswrapper[4732]: I0131 09:01:44.033798 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:44 crc kubenswrapper[4732]: I0131 09:01:44.033816 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:44 crc kubenswrapper[4732]: I0131 09:01:44.033829 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:44Z","lastTransitionTime":"2026-01-31T09:01:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:44 crc kubenswrapper[4732]: I0131 09:01:44.136312 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:44 crc kubenswrapper[4732]: I0131 09:01:44.136351 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:44 crc kubenswrapper[4732]: I0131 09:01:44.136361 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:44 crc kubenswrapper[4732]: I0131 09:01:44.136379 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:44 crc kubenswrapper[4732]: I0131 09:01:44.136388 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:44Z","lastTransitionTime":"2026-01-31T09:01:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:44 crc kubenswrapper[4732]: I0131 09:01:44.239315 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:44 crc kubenswrapper[4732]: I0131 09:01:44.239393 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:44 crc kubenswrapper[4732]: I0131 09:01:44.239410 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:44 crc kubenswrapper[4732]: I0131 09:01:44.239436 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:44 crc kubenswrapper[4732]: I0131 09:01:44.239454 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:44Z","lastTransitionTime":"2026-01-31T09:01:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:44 crc kubenswrapper[4732]: I0131 09:01:44.342261 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:44 crc kubenswrapper[4732]: I0131 09:01:44.342294 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:44 crc kubenswrapper[4732]: I0131 09:01:44.342303 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:44 crc kubenswrapper[4732]: I0131 09:01:44.342319 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:44 crc kubenswrapper[4732]: I0131 09:01:44.342327 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:44Z","lastTransitionTime":"2026-01-31T09:01:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:44 crc kubenswrapper[4732]: I0131 09:01:44.444790 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:44 crc kubenswrapper[4732]: I0131 09:01:44.444864 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:44 crc kubenswrapper[4732]: I0131 09:01:44.444882 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:44 crc kubenswrapper[4732]: I0131 09:01:44.444901 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:44 crc kubenswrapper[4732]: I0131 09:01:44.444915 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:44Z","lastTransitionTime":"2026-01-31T09:01:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:44 crc kubenswrapper[4732]: I0131 09:01:44.508057 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 02:02:34.262369231 +0000 UTC Jan 31 09:01:44 crc kubenswrapper[4732]: I0131 09:01:44.548764 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:44 crc kubenswrapper[4732]: I0131 09:01:44.548833 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:44 crc kubenswrapper[4732]: I0131 09:01:44.548843 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:44 crc kubenswrapper[4732]: I0131 09:01:44.548863 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:44 crc kubenswrapper[4732]: I0131 09:01:44.548880 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:44Z","lastTransitionTime":"2026-01-31T09:01:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:44 crc kubenswrapper[4732]: I0131 09:01:44.653299 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:44 crc kubenswrapper[4732]: I0131 09:01:44.653357 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:44 crc kubenswrapper[4732]: I0131 09:01:44.653372 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:44 crc kubenswrapper[4732]: I0131 09:01:44.653392 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:44 crc kubenswrapper[4732]: I0131 09:01:44.653405 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:44Z","lastTransitionTime":"2026-01-31T09:01:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:44 crc kubenswrapper[4732]: I0131 09:01:44.756110 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:44 crc kubenswrapper[4732]: I0131 09:01:44.756151 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:44 crc kubenswrapper[4732]: I0131 09:01:44.756163 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:44 crc kubenswrapper[4732]: I0131 09:01:44.756179 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:44 crc kubenswrapper[4732]: I0131 09:01:44.756191 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:44Z","lastTransitionTime":"2026-01-31T09:01:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:44 crc kubenswrapper[4732]: I0131 09:01:44.859093 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:44 crc kubenswrapper[4732]: I0131 09:01:44.859138 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:44 crc kubenswrapper[4732]: I0131 09:01:44.859149 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:44 crc kubenswrapper[4732]: I0131 09:01:44.859167 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:44 crc kubenswrapper[4732]: I0131 09:01:44.859178 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:44Z","lastTransitionTime":"2026-01-31T09:01:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:44 crc kubenswrapper[4732]: I0131 09:01:44.962228 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:44 crc kubenswrapper[4732]: I0131 09:01:44.962279 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:44 crc kubenswrapper[4732]: I0131 09:01:44.962294 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:44 crc kubenswrapper[4732]: I0131 09:01:44.962316 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:44 crc kubenswrapper[4732]: I0131 09:01:44.962329 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:44Z","lastTransitionTime":"2026-01-31T09:01:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:45 crc kubenswrapper[4732]: I0131 09:01:45.065913 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:45 crc kubenswrapper[4732]: I0131 09:01:45.065981 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:45 crc kubenswrapper[4732]: I0131 09:01:45.066005 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:45 crc kubenswrapper[4732]: I0131 09:01:45.066036 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:45 crc kubenswrapper[4732]: I0131 09:01:45.066057 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:45Z","lastTransitionTime":"2026-01-31T09:01:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:45 crc kubenswrapper[4732]: I0131 09:01:45.169984 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:45 crc kubenswrapper[4732]: I0131 09:01:45.170038 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:45 crc kubenswrapper[4732]: I0131 09:01:45.170057 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:45 crc kubenswrapper[4732]: I0131 09:01:45.170083 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:45 crc kubenswrapper[4732]: I0131 09:01:45.170101 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:45Z","lastTransitionTime":"2026-01-31T09:01:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:45 crc kubenswrapper[4732]: I0131 09:01:45.272994 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:45 crc kubenswrapper[4732]: I0131 09:01:45.273330 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:45 crc kubenswrapper[4732]: I0131 09:01:45.273340 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:45 crc kubenswrapper[4732]: I0131 09:01:45.273356 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:45 crc kubenswrapper[4732]: I0131 09:01:45.273365 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:45Z","lastTransitionTime":"2026-01-31T09:01:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 31 09:01:45 crc kubenswrapper[4732]: I0131 09:01:45.376251 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 09:01:45 crc kubenswrapper[4732]: I0131 09:01:45.376298 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 09:01:45 crc kubenswrapper[4732]: I0131 09:01:45.376309 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 09:01:45 crc kubenswrapper[4732]: I0131 09:01:45.376325 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 09:01:45 crc kubenswrapper[4732]: I0131 09:01:45.376334 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:45Z","lastTransitionTime":"2026-01-31T09:01:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 31 09:01:45 crc kubenswrapper[4732]: I0131 09:01:45.477582 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3bd29a31-1a47-40da-afc5-6c4423067083-metrics-certs\") pod \"network-metrics-daemon-7fgvm\" (UID: \"3bd29a31-1a47-40da-afc5-6c4423067083\") " pod="openshift-multus/network-metrics-daemon-7fgvm"
Jan 31 09:01:45 crc kubenswrapper[4732]: E0131 09:01:45.477834 4732 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Jan 31 09:01:45 crc kubenswrapper[4732]: E0131 09:01:45.477911 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3bd29a31-1a47-40da-afc5-6c4423067083-metrics-certs podName:3bd29a31-1a47-40da-afc5-6c4423067083 nodeName:}" failed. No retries permitted until 2026-01-31 09:01:49.477885687 +0000 UTC m=+47.783761931 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3bd29a31-1a47-40da-afc5-6c4423067083-metrics-certs") pod "network-metrics-daemon-7fgvm" (UID: "3bd29a31-1a47-40da-afc5-6c4423067083") : object "openshift-multus"/"metrics-daemon-secret" not registered
Jan 31 09:01:45 crc kubenswrapper[4732]: I0131 09:01:45.480203 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 09:01:45 crc kubenswrapper[4732]: I0131 09:01:45.480285 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 09:01:45 crc kubenswrapper[4732]: I0131 09:01:45.480303 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 09:01:45 crc kubenswrapper[4732]: I0131 09:01:45.480330 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 09:01:45 crc kubenswrapper[4732]: I0131 09:01:45.480349 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:45Z","lastTransitionTime":"2026-01-31T09:01:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 31 09:01:45 crc kubenswrapper[4732]: I0131 09:01:45.508301 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 23:44:58.804525144 +0000 UTC
Jan 31 09:01:45 crc kubenswrapper[4732]: I0131 09:01:45.541746 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 31 09:01:45 crc kubenswrapper[4732]: I0131 09:01:45.541827 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 31 09:01:45 crc kubenswrapper[4732]: I0131 09:01:45.541783 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7fgvm"
Jan 31 09:01:45 crc kubenswrapper[4732]: I0131 09:01:45.541756 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 31 09:01:45 crc kubenswrapper[4732]: E0131 09:01:45.541935 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 31 09:01:45 crc kubenswrapper[4732]: E0131 09:01:45.542053 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7fgvm" podUID="3bd29a31-1a47-40da-afc5-6c4423067083"
Jan 31 09:01:45 crc kubenswrapper[4732]: E0131 09:01:45.542154 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 31 09:01:45 crc kubenswrapper[4732]: E0131 09:01:45.542227 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 31 09:01:45 crc kubenswrapper[4732]: I0131 09:01:45.583153 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 09:01:45 crc kubenswrapper[4732]: I0131 09:01:45.583201 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 09:01:45 crc kubenswrapper[4732]: I0131 09:01:45.583234 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 09:01:45 crc kubenswrapper[4732]: I0131 09:01:45.583251 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 09:01:45 crc kubenswrapper[4732]: I0131 09:01:45.583286 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:45Z","lastTransitionTime":"2026-01-31T09:01:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 31 09:01:45 crc kubenswrapper[4732]: I0131 09:01:45.686048 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 09:01:45 crc kubenswrapper[4732]: I0131 09:01:45.686078 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 09:01:45 crc kubenswrapper[4732]: I0131 09:01:45.686087 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 09:01:45 crc kubenswrapper[4732]: I0131 09:01:45.686102 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 09:01:45 crc kubenswrapper[4732]: I0131 09:01:45.686112 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:45Z","lastTransitionTime":"2026-01-31T09:01:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:45 crc kubenswrapper[4732]: I0131 09:01:45.788640 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:45 crc kubenswrapper[4732]: I0131 09:01:45.788708 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:45 crc kubenswrapper[4732]: I0131 09:01:45.788724 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:45 crc kubenswrapper[4732]: I0131 09:01:45.788742 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:45 crc kubenswrapper[4732]: I0131 09:01:45.788754 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:45Z","lastTransitionTime":"2026-01-31T09:01:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:45 crc kubenswrapper[4732]: I0131 09:01:45.891085 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:45 crc kubenswrapper[4732]: I0131 09:01:45.891125 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:45 crc kubenswrapper[4732]: I0131 09:01:45.891137 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:45 crc kubenswrapper[4732]: I0131 09:01:45.891153 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:45 crc kubenswrapper[4732]: I0131 09:01:45.891166 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:45Z","lastTransitionTime":"2026-01-31T09:01:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:45 crc kubenswrapper[4732]: I0131 09:01:45.993450 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:45 crc kubenswrapper[4732]: I0131 09:01:45.993488 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:45 crc kubenswrapper[4732]: I0131 09:01:45.993498 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:45 crc kubenswrapper[4732]: I0131 09:01:45.993513 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:45 crc kubenswrapper[4732]: I0131 09:01:45.993522 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:45Z","lastTransitionTime":"2026-01-31T09:01:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:46 crc kubenswrapper[4732]: I0131 09:01:46.096628 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:46 crc kubenswrapper[4732]: I0131 09:01:46.096702 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:46 crc kubenswrapper[4732]: I0131 09:01:46.096715 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:46 crc kubenswrapper[4732]: I0131 09:01:46.096736 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:46 crc kubenswrapper[4732]: I0131 09:01:46.096748 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:46Z","lastTransitionTime":"2026-01-31T09:01:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:46 crc kubenswrapper[4732]: I0131 09:01:46.204914 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:46 crc kubenswrapper[4732]: I0131 09:01:46.204958 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:46 crc kubenswrapper[4732]: I0131 09:01:46.204969 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:46 crc kubenswrapper[4732]: I0131 09:01:46.204987 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:46 crc kubenswrapper[4732]: I0131 09:01:46.205000 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:46Z","lastTransitionTime":"2026-01-31T09:01:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:46 crc kubenswrapper[4732]: I0131 09:01:46.307800 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:46 crc kubenswrapper[4732]: I0131 09:01:46.307869 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:46 crc kubenswrapper[4732]: I0131 09:01:46.307883 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:46 crc kubenswrapper[4732]: I0131 09:01:46.307907 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:46 crc kubenswrapper[4732]: I0131 09:01:46.307921 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:46Z","lastTransitionTime":"2026-01-31T09:01:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:46 crc kubenswrapper[4732]: I0131 09:01:46.411228 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:46 crc kubenswrapper[4732]: I0131 09:01:46.411298 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:46 crc kubenswrapper[4732]: I0131 09:01:46.411310 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:46 crc kubenswrapper[4732]: I0131 09:01:46.411336 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:46 crc kubenswrapper[4732]: I0131 09:01:46.411349 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:46Z","lastTransitionTime":"2026-01-31T09:01:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:46 crc kubenswrapper[4732]: I0131 09:01:46.509373 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 23:48:00.323788286 +0000 UTC Jan 31 09:01:46 crc kubenswrapper[4732]: I0131 09:01:46.515012 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:46 crc kubenswrapper[4732]: I0131 09:01:46.515076 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:46 crc kubenswrapper[4732]: I0131 09:01:46.515098 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:46 crc kubenswrapper[4732]: I0131 09:01:46.515127 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:46 crc kubenswrapper[4732]: I0131 09:01:46.515171 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:46Z","lastTransitionTime":"2026-01-31T09:01:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:46 crc kubenswrapper[4732]: I0131 09:01:46.618430 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:46 crc kubenswrapper[4732]: I0131 09:01:46.618539 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:46 crc kubenswrapper[4732]: I0131 09:01:46.618556 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:46 crc kubenswrapper[4732]: I0131 09:01:46.618583 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:46 crc kubenswrapper[4732]: I0131 09:01:46.618600 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:46Z","lastTransitionTime":"2026-01-31T09:01:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:46 crc kubenswrapper[4732]: I0131 09:01:46.720418 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:46 crc kubenswrapper[4732]: I0131 09:01:46.720467 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:46 crc kubenswrapper[4732]: I0131 09:01:46.720482 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:46 crc kubenswrapper[4732]: I0131 09:01:46.720499 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:46 crc kubenswrapper[4732]: I0131 09:01:46.720510 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:46Z","lastTransitionTime":"2026-01-31T09:01:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:46 crc kubenswrapper[4732]: I0131 09:01:46.823921 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:46 crc kubenswrapper[4732]: I0131 09:01:46.824001 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:46 crc kubenswrapper[4732]: I0131 09:01:46.824023 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:46 crc kubenswrapper[4732]: I0131 09:01:46.824055 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:46 crc kubenswrapper[4732]: I0131 09:01:46.824082 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:46Z","lastTransitionTime":"2026-01-31T09:01:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:46 crc kubenswrapper[4732]: I0131 09:01:46.927052 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:46 crc kubenswrapper[4732]: I0131 09:01:46.927105 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:46 crc kubenswrapper[4732]: I0131 09:01:46.927116 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:46 crc kubenswrapper[4732]: I0131 09:01:46.927136 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:46 crc kubenswrapper[4732]: I0131 09:01:46.927147 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:46Z","lastTransitionTime":"2026-01-31T09:01:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:47 crc kubenswrapper[4732]: I0131 09:01:47.030322 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:47 crc kubenswrapper[4732]: I0131 09:01:47.030397 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:47 crc kubenswrapper[4732]: I0131 09:01:47.030410 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:47 crc kubenswrapper[4732]: I0131 09:01:47.030428 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:47 crc kubenswrapper[4732]: I0131 09:01:47.030441 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:47Z","lastTransitionTime":"2026-01-31T09:01:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:47 crc kubenswrapper[4732]: I0131 09:01:47.134006 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:47 crc kubenswrapper[4732]: I0131 09:01:47.134064 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:47 crc kubenswrapper[4732]: I0131 09:01:47.134085 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:47 crc kubenswrapper[4732]: I0131 09:01:47.134109 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:47 crc kubenswrapper[4732]: I0131 09:01:47.134125 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:47Z","lastTransitionTime":"2026-01-31T09:01:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:47 crc kubenswrapper[4732]: I0131 09:01:47.238224 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:47 crc kubenswrapper[4732]: I0131 09:01:47.238298 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:47 crc kubenswrapper[4732]: I0131 09:01:47.238310 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:47 crc kubenswrapper[4732]: I0131 09:01:47.238334 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:47 crc kubenswrapper[4732]: I0131 09:01:47.238351 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:47Z","lastTransitionTime":"2026-01-31T09:01:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:47 crc kubenswrapper[4732]: I0131 09:01:47.341949 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:47 crc kubenswrapper[4732]: I0131 09:01:47.341979 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:47 crc kubenswrapper[4732]: I0131 09:01:47.341988 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:47 crc kubenswrapper[4732]: I0131 09:01:47.342002 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:47 crc kubenswrapper[4732]: I0131 09:01:47.342011 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:47Z","lastTransitionTime":"2026-01-31T09:01:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:47 crc kubenswrapper[4732]: I0131 09:01:47.445061 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:47 crc kubenswrapper[4732]: I0131 09:01:47.445127 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:47 crc kubenswrapper[4732]: I0131 09:01:47.445149 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:47 crc kubenswrapper[4732]: I0131 09:01:47.445179 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:47 crc kubenswrapper[4732]: I0131 09:01:47.445200 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:47Z","lastTransitionTime":"2026-01-31T09:01:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 31 09:01:47 crc kubenswrapper[4732]: I0131 09:01:47.510541 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 21:37:00.768084678 +0000 UTC
Jan 31 09:01:47 crc kubenswrapper[4732]: I0131 09:01:47.542251 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7fgvm"
Jan 31 09:01:47 crc kubenswrapper[4732]: I0131 09:01:47.542299 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 31 09:01:47 crc kubenswrapper[4732]: E0131 09:01:47.542455 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7fgvm" podUID="3bd29a31-1a47-40da-afc5-6c4423067083"
Jan 31 09:01:47 crc kubenswrapper[4732]: I0131 09:01:47.542486 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 31 09:01:47 crc kubenswrapper[4732]: E0131 09:01:47.542574 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 31 09:01:47 crc kubenswrapper[4732]: I0131 09:01:47.542257 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 31 09:01:47 crc kubenswrapper[4732]: E0131 09:01:47.542708 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 31 09:01:47 crc kubenswrapper[4732]: E0131 09:01:47.542792 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 31 09:01:47 crc kubenswrapper[4732]: I0131 09:01:47.548411 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 09:01:47 crc kubenswrapper[4732]: I0131 09:01:47.548489 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 09:01:47 crc kubenswrapper[4732]: I0131 09:01:47.548501 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 09:01:47 crc kubenswrapper[4732]: I0131 09:01:47.548567 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 09:01:47 crc kubenswrapper[4732]: I0131 09:01:47.548583 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:47Z","lastTransitionTime":"2026-01-31T09:01:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 31 09:01:47 crc kubenswrapper[4732]: I0131 09:01:47.651931 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 09:01:47 crc kubenswrapper[4732]: I0131 09:01:47.651990 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 09:01:47 crc kubenswrapper[4732]: I0131 09:01:47.652005 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 09:01:47 crc kubenswrapper[4732]: I0131 09:01:47.652026 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 09:01:47 crc kubenswrapper[4732]: I0131 09:01:47.652044 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:47Z","lastTransitionTime":"2026-01-31T09:01:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:47 crc kubenswrapper[4732]: I0131 09:01:47.755070 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:47 crc kubenswrapper[4732]: I0131 09:01:47.755128 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:47 crc kubenswrapper[4732]: I0131 09:01:47.755143 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:47 crc kubenswrapper[4732]: I0131 09:01:47.755161 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:47 crc kubenswrapper[4732]: I0131 09:01:47.755176 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:47Z","lastTransitionTime":"2026-01-31T09:01:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:47 crc kubenswrapper[4732]: I0131 09:01:47.857130 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:47 crc kubenswrapper[4732]: I0131 09:01:47.857171 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:47 crc kubenswrapper[4732]: I0131 09:01:47.857182 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:47 crc kubenswrapper[4732]: I0131 09:01:47.857200 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:47 crc kubenswrapper[4732]: I0131 09:01:47.857212 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:47Z","lastTransitionTime":"2026-01-31T09:01:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:47 crc kubenswrapper[4732]: I0131 09:01:47.960747 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:47 crc kubenswrapper[4732]: I0131 09:01:47.960822 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:47 crc kubenswrapper[4732]: I0131 09:01:47.960840 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:47 crc kubenswrapper[4732]: I0131 09:01:47.960867 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:47 crc kubenswrapper[4732]: I0131 09:01:47.960884 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:47Z","lastTransitionTime":"2026-01-31T09:01:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:48 crc kubenswrapper[4732]: I0131 09:01:48.064096 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:48 crc kubenswrapper[4732]: I0131 09:01:48.064140 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:48 crc kubenswrapper[4732]: I0131 09:01:48.064150 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:48 crc kubenswrapper[4732]: I0131 09:01:48.064165 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:48 crc kubenswrapper[4732]: I0131 09:01:48.064175 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:48Z","lastTransitionTime":"2026-01-31T09:01:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:48 crc kubenswrapper[4732]: I0131 09:01:48.169138 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:48 crc kubenswrapper[4732]: I0131 09:01:48.169201 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:48 crc kubenswrapper[4732]: I0131 09:01:48.169219 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:48 crc kubenswrapper[4732]: I0131 09:01:48.169243 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:48 crc kubenswrapper[4732]: I0131 09:01:48.169261 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:48Z","lastTransitionTime":"2026-01-31T09:01:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:48 crc kubenswrapper[4732]: I0131 09:01:48.272686 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:48 crc kubenswrapper[4732]: I0131 09:01:48.272743 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:48 crc kubenswrapper[4732]: I0131 09:01:48.272760 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:48 crc kubenswrapper[4732]: I0131 09:01:48.272784 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:48 crc kubenswrapper[4732]: I0131 09:01:48.272803 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:48Z","lastTransitionTime":"2026-01-31T09:01:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:48 crc kubenswrapper[4732]: I0131 09:01:48.378167 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:48 crc kubenswrapper[4732]: I0131 09:01:48.378264 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:48 crc kubenswrapper[4732]: I0131 09:01:48.378279 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:48 crc kubenswrapper[4732]: I0131 09:01:48.378299 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:48 crc kubenswrapper[4732]: I0131 09:01:48.378308 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:48Z","lastTransitionTime":"2026-01-31T09:01:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:48 crc kubenswrapper[4732]: I0131 09:01:48.480234 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:48 crc kubenswrapper[4732]: I0131 09:01:48.480277 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:48 crc kubenswrapper[4732]: I0131 09:01:48.480287 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:48 crc kubenswrapper[4732]: I0131 09:01:48.480304 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:48 crc kubenswrapper[4732]: I0131 09:01:48.480316 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:48Z","lastTransitionTime":"2026-01-31T09:01:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:48 crc kubenswrapper[4732]: I0131 09:01:48.511022 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 13:50:05.388631879 +0000 UTC Jan 31 09:01:48 crc kubenswrapper[4732]: I0131 09:01:48.584403 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:48 crc kubenswrapper[4732]: I0131 09:01:48.584444 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:48 crc kubenswrapper[4732]: I0131 09:01:48.584456 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:48 crc kubenswrapper[4732]: I0131 09:01:48.584479 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:48 crc kubenswrapper[4732]: I0131 09:01:48.584490 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:48Z","lastTransitionTime":"2026-01-31T09:01:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:48 crc kubenswrapper[4732]: I0131 09:01:48.687251 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:48 crc kubenswrapper[4732]: I0131 09:01:48.687300 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:48 crc kubenswrapper[4732]: I0131 09:01:48.687311 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:48 crc kubenswrapper[4732]: I0131 09:01:48.687327 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:48 crc kubenswrapper[4732]: I0131 09:01:48.687338 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:48Z","lastTransitionTime":"2026-01-31T09:01:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:48 crc kubenswrapper[4732]: I0131 09:01:48.790418 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:48 crc kubenswrapper[4732]: I0131 09:01:48.790465 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:48 crc kubenswrapper[4732]: I0131 09:01:48.790473 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:48 crc kubenswrapper[4732]: I0131 09:01:48.790492 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:48 crc kubenswrapper[4732]: I0131 09:01:48.790506 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:48Z","lastTransitionTime":"2026-01-31T09:01:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:48 crc kubenswrapper[4732]: I0131 09:01:48.893067 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:48 crc kubenswrapper[4732]: I0131 09:01:48.893122 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:48 crc kubenswrapper[4732]: I0131 09:01:48.893133 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:48 crc kubenswrapper[4732]: I0131 09:01:48.893156 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:48 crc kubenswrapper[4732]: I0131 09:01:48.893171 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:48Z","lastTransitionTime":"2026-01-31T09:01:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:48 crc kubenswrapper[4732]: I0131 09:01:48.995702 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:48 crc kubenswrapper[4732]: I0131 09:01:48.995749 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:48 crc kubenswrapper[4732]: I0131 09:01:48.995765 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:48 crc kubenswrapper[4732]: I0131 09:01:48.995782 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:48 crc kubenswrapper[4732]: I0131 09:01:48.995792 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:48Z","lastTransitionTime":"2026-01-31T09:01:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:49 crc kubenswrapper[4732]: I0131 09:01:49.098854 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:49 crc kubenswrapper[4732]: I0131 09:01:49.098926 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:49 crc kubenswrapper[4732]: I0131 09:01:49.098965 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:49 crc kubenswrapper[4732]: I0131 09:01:49.098996 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:49 crc kubenswrapper[4732]: I0131 09:01:49.099016 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:49Z","lastTransitionTime":"2026-01-31T09:01:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:49 crc kubenswrapper[4732]: I0131 09:01:49.201361 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:49 crc kubenswrapper[4732]: I0131 09:01:49.201404 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:49 crc kubenswrapper[4732]: I0131 09:01:49.201415 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:49 crc kubenswrapper[4732]: I0131 09:01:49.201431 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:49 crc kubenswrapper[4732]: I0131 09:01:49.201442 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:49Z","lastTransitionTime":"2026-01-31T09:01:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:49 crc kubenswrapper[4732]: I0131 09:01:49.304786 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:49 crc kubenswrapper[4732]: I0131 09:01:49.305020 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:49 crc kubenswrapper[4732]: I0131 09:01:49.305032 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:49 crc kubenswrapper[4732]: I0131 09:01:49.305051 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:49 crc kubenswrapper[4732]: I0131 09:01:49.305100 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:49Z","lastTransitionTime":"2026-01-31T09:01:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:49 crc kubenswrapper[4732]: I0131 09:01:49.408075 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:49 crc kubenswrapper[4732]: I0131 09:01:49.408118 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:49 crc kubenswrapper[4732]: I0131 09:01:49.408129 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:49 crc kubenswrapper[4732]: I0131 09:01:49.408146 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:49 crc kubenswrapper[4732]: I0131 09:01:49.408157 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:49Z","lastTransitionTime":"2026-01-31T09:01:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:49 crc kubenswrapper[4732]: I0131 09:01:49.511111 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:49 crc kubenswrapper[4732]: I0131 09:01:49.511154 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:49 crc kubenswrapper[4732]: I0131 09:01:49.511165 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:49 crc kubenswrapper[4732]: I0131 09:01:49.511183 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:49 crc kubenswrapper[4732]: I0131 09:01:49.511196 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:49Z","lastTransitionTime":"2026-01-31T09:01:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:49 crc kubenswrapper[4732]: I0131 09:01:49.511292 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-08 09:05:42.246368226 +0000 UTC Jan 31 09:01:49 crc kubenswrapper[4732]: I0131 09:01:49.521231 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3bd29a31-1a47-40da-afc5-6c4423067083-metrics-certs\") pod \"network-metrics-daemon-7fgvm\" (UID: \"3bd29a31-1a47-40da-afc5-6c4423067083\") " pod="openshift-multus/network-metrics-daemon-7fgvm" Jan 31 09:01:49 crc kubenswrapper[4732]: E0131 09:01:49.521361 4732 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 31 09:01:49 crc kubenswrapper[4732]: E0131 09:01:49.521414 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3bd29a31-1a47-40da-afc5-6c4423067083-metrics-certs podName:3bd29a31-1a47-40da-afc5-6c4423067083 nodeName:}" failed. 
No retries permitted until 2026-01-31 09:01:57.521400263 +0000 UTC m=+55.827276467 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3bd29a31-1a47-40da-afc5-6c4423067083-metrics-certs") pod "network-metrics-daemon-7fgvm" (UID: "3bd29a31-1a47-40da-afc5-6c4423067083") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 31 09:01:49 crc kubenswrapper[4732]: I0131 09:01:49.542085 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:01:49 crc kubenswrapper[4732]: I0131 09:01:49.542085 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:01:49 crc kubenswrapper[4732]: E0131 09:01:49.542323 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:01:49 crc kubenswrapper[4732]: I0131 09:01:49.542109 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7fgvm" Jan 31 09:01:49 crc kubenswrapper[4732]: E0131 09:01:49.542423 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7fgvm" podUID="3bd29a31-1a47-40da-afc5-6c4423067083" Jan 31 09:01:49 crc kubenswrapper[4732]: I0131 09:01:49.542092 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:01:49 crc kubenswrapper[4732]: E0131 09:01:49.542235 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:01:49 crc kubenswrapper[4732]: E0131 09:01:49.542484 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
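Every NotReady heartbeat and sandbox error above reduces to one condition: the kubelet looked in /etc/kubernetes/cni/net.d/ and found no CNI network configuration. A minimal sketch of the same check in stdlib Python, runnable on the node; the directory path is quoted from the log, while the extension filter is an assumption matching common CNI config suffixes:

```python
#!/usr/bin/env python3
# Sketch: reproduce the "no CNI configuration file" check from the log.
# The directory is taken verbatim from the kubelet messages above; the
# extension filter is an assumption (common CNI config suffixes).
import json
import os

CNI_CONF_DIR = "/etc/kubernetes/cni/net.d/"

def cni_configs(conf_dir):
    """Return candidate CNI config files found in conf_dir."""
    try:
        names = os.listdir(conf_dir)
    except FileNotFoundError:
        return []
    return sorted(n for n in names if n.endswith((".conf", ".conflist", ".json")))

if __name__ == "__main__":
    found = cni_configs(CNI_CONF_DIR)
    if not found:
        print(f"no CNI configuration file in {CNI_CONF_DIR} (matches the log)")
    for name in found:
        with open(os.path.join(CNI_CONF_DIR, name)) as f:
            conf = json.load(f)
        print(name, "->", conf.get("name"), conf.get("type") or "conflist")
```

If the directory is empty, the "No sandbox for pod" and "Error syncing pod" entries are expected: no pod network can be set up until the network provider writes a config there.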
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:01:49 crc kubenswrapper[4732]: I0131 09:01:49.613833 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:49 crc kubenswrapper[4732]: I0131 09:01:49.613930 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:49 crc kubenswrapper[4732]: I0131 09:01:49.613946 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:49 crc kubenswrapper[4732]: I0131 09:01:49.613997 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:49 crc kubenswrapper[4732]: I0131 09:01:49.614012 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:49Z","lastTransitionTime":"2026-01-31T09:01:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:49 crc kubenswrapper[4732]: I0131 09:01:49.718645 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:49 crc kubenswrapper[4732]: I0131 09:01:49.718750 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:49 crc kubenswrapper[4732]: I0131 09:01:49.718767 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:49 crc kubenswrapper[4732]: I0131 09:01:49.718792 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:49 crc kubenswrapper[4732]: I0131 09:01:49.718811 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:49Z","lastTransitionTime":"2026-01-31T09:01:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:49 crc kubenswrapper[4732]: I0131 09:01:49.822161 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:49 crc kubenswrapper[4732]: I0131 09:01:49.822200 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:49 crc kubenswrapper[4732]: I0131 09:01:49.822211 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:49 crc kubenswrapper[4732]: I0131 09:01:49.822228 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:49 crc kubenswrapper[4732]: I0131 09:01:49.822240 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:49Z","lastTransitionTime":"2026-01-31T09:01:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:49 crc kubenswrapper[4732]: I0131 09:01:49.926365 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:49 crc kubenswrapper[4732]: I0131 09:01:49.926426 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:49 crc kubenswrapper[4732]: I0131 09:01:49.926445 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:49 crc kubenswrapper[4732]: I0131 09:01:49.926476 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:49 crc kubenswrapper[4732]: I0131 09:01:49.926496 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:49Z","lastTransitionTime":"2026-01-31T09:01:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:50 crc kubenswrapper[4732]: I0131 09:01:50.029116 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:50 crc kubenswrapper[4732]: I0131 09:01:50.029217 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:50 crc kubenswrapper[4732]: I0131 09:01:50.029233 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:50 crc kubenswrapper[4732]: I0131 09:01:50.029250 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:50 crc kubenswrapper[4732]: I0131 09:01:50.029285 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:50Z","lastTransitionTime":"2026-01-31T09:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:50 crc kubenswrapper[4732]: I0131 09:01:50.132262 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:50 crc kubenswrapper[4732]: I0131 09:01:50.132303 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:50 crc kubenswrapper[4732]: I0131 09:01:50.132316 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:50 crc kubenswrapper[4732]: I0131 09:01:50.132334 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:50 crc kubenswrapper[4732]: I0131 09:01:50.132350 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:50Z","lastTransitionTime":"2026-01-31T09:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:50 crc kubenswrapper[4732]: I0131 09:01:50.236137 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:50 crc kubenswrapper[4732]: I0131 09:01:50.236191 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:50 crc kubenswrapper[4732]: I0131 09:01:50.236202 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:50 crc kubenswrapper[4732]: I0131 09:01:50.236220 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:50 crc kubenswrapper[4732]: I0131 09:01:50.236231 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:50Z","lastTransitionTime":"2026-01-31T09:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:50 crc kubenswrapper[4732]: I0131 09:01:50.339154 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:50 crc kubenswrapper[4732]: I0131 09:01:50.339206 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:50 crc kubenswrapper[4732]: I0131 09:01:50.339217 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:50 crc kubenswrapper[4732]: I0131 09:01:50.339238 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:50 crc kubenswrapper[4732]: I0131 09:01:50.339253 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:50Z","lastTransitionTime":"2026-01-31T09:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:50 crc kubenswrapper[4732]: I0131 09:01:50.444126 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:50 crc kubenswrapper[4732]: I0131 09:01:50.444237 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:50 crc kubenswrapper[4732]: I0131 09:01:50.444257 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:50 crc kubenswrapper[4732]: I0131 09:01:50.445474 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:50 crc kubenswrapper[4732]: I0131 09:01:50.445550 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:50Z","lastTransitionTime":"2026-01-31T09:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:50 crc kubenswrapper[4732]: I0131 09:01:50.483579 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:50 crc kubenswrapper[4732]: I0131 09:01:50.483650 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:50 crc kubenswrapper[4732]: I0131 09:01:50.483694 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:50 crc kubenswrapper[4732]: I0131 09:01:50.483715 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:50 crc kubenswrapper[4732]: I0131 09:01:50.483727 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:50Z","lastTransitionTime":"2026-01-31T09:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:50 crc kubenswrapper[4732]: E0131 09:01:50.496412 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:01:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:01:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:01:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:01:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2761c117-4c0c-4c53-891e-fc7b8fbd4017\\\",\\\"systemUUID\\\":\\\"5f9a5b0b-6336-4588-8df8-98fcbdc2a984\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:50Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:50 crc kubenswrapper[4732]: I0131 09:01:50.501043 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:50 crc kubenswrapper[4732]: I0131 09:01:50.501374 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 31 09:01:50 crc kubenswrapper[4732]: I0131 09:01:50.501689 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:50 crc kubenswrapper[4732]: I0131 09:01:50.501823 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:50 crc kubenswrapper[4732]: I0131 09:01:50.501985 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:50Z","lastTransitionTime":"2026-01-31T09:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:50 crc kubenswrapper[4732]: I0131 09:01:50.511405 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 09:44:55.737386301 +0000 UTC Jan 31 09:01:50 crc kubenswrapper[4732]: E0131 09:01:50.515013 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:01:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:01:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:01:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:01:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2761c117-4c0c-4c53-891e-fc7b8fbd4017\\\",\\\"systemUUID\\\":\\\"5f9a5b0b-6336-4588-8df8-98fcbdc2a984\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:50Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:50 crc kubenswrapper[4732]: I0131 09:01:50.519026 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:50 crc kubenswrapper[4732]: I0131 09:01:50.519058 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
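The status patch never reaches storage: the API server must first call the node.network-node-identity.openshift.io admission webhook at https://127.0.0.1:9743, and that endpoint's serving certificate expired on 2025-08-24, long before the node's clock time of 2026-01-31. A hedged sketch of confirming the validity window from the node, assuming the third-party cryptography package is installed; host and port are taken from the webhook URL in the error:

```python
#!/usr/bin/env python3
# Sketch: read the webhook's serving certificate and compare its validity
# window to the current time, mirroring the x509 error above. Host and port
# come from the webhook URL in the log; the 'cryptography' package is an
# assumed dependency (pip install cryptography).
import datetime
import ssl

from cryptography import x509

HOST, PORT = "127.0.0.1", 9743

# Without a CA bundle, get_server_certificate disables verification, so an
# expired certificate can still be retrieved for inspection.
pem = ssl.get_server_certificate((HOST, PORT))
cert = x509.load_pem_x509_certificate(pem.encode())

now = datetime.datetime.utcnow()
print("notBefore:", cert.not_valid_before)
print("notAfter: ", cert.not_valid_after)
if now > cert.not_valid_after:
    print(f"certificate has expired: current time {now:%Y-%m-%dT%H:%M:%SZ} "
          f"is after {cert.not_valid_after:%Y-%m-%dT%H:%M:%SZ}")
```

Skipping verification is what lets this sketch read a certificate that the kubelet's own TLS handshake, which does verify, has to reject.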
event="NodeHasNoDiskPressure" Jan 31 09:01:50 crc kubenswrapper[4732]: I0131 09:01:50.519069 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:50 crc kubenswrapper[4732]: I0131 09:01:50.519085 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:50 crc kubenswrapper[4732]: I0131 09:01:50.519094 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:50Z","lastTransitionTime":"2026-01-31T09:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:50 crc kubenswrapper[4732]: E0131 09:01:50.531745 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:01:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:01:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:01:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:01:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2761c117-4c0c-4c53-891e-fc7b8fbd4017\\\",\\\"systemUUID\\\":\\\"5f9a5b0b-6336-4588-8df8-98fcbdc2a984\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:50Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:50 crc kubenswrapper[4732]: I0131 09:01:50.535429 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:50 crc kubenswrapper[4732]: I0131 09:01:50.535461 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 31 09:01:50 crc kubenswrapper[4732]: I0131 09:01:50.535468 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:50 crc kubenswrapper[4732]: I0131 09:01:50.535481 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:50 crc kubenswrapper[4732]: I0131 09:01:50.535490 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:50Z","lastTransitionTime":"2026-01-31T09:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:50 crc kubenswrapper[4732]: E0131 09:01:50.547969 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:01:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:01:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:01:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:01:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2761c117-4c0c-4c53-891e-fc7b8fbd4017\\\",\\\"systemUUID\\\":\\\"5f9a5b0b-6336-4588-8df8-98fcbdc2a984\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:50Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:50 crc kubenswrapper[4732]: I0131 09:01:50.551253 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:50 crc kubenswrapper[4732]: I0131 09:01:50.551287 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 31 09:01:50 crc kubenswrapper[4732]: I0131 09:01:50.551296 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:50 crc kubenswrapper[4732]: I0131 09:01:50.551311 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:50 crc kubenswrapper[4732]: I0131 09:01:50.551323 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:50Z","lastTransitionTime":"2026-01-31T09:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:50 crc kubenswrapper[4732]: E0131 09:01:50.563827 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:01:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:01:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:01:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:01:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2761c117-4c0c-4c53-891e-fc7b8fbd4017\\\",\\\"systemUUID\\\":\\\"5f9a5b0b-6336-4588-8df8-98fcbdc2a984\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:50Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:50 crc kubenswrapper[4732]: E0131 09:01:50.563952 4732 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 31 09:01:50 crc kubenswrapper[4732]: I0131 09:01:50.565730 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Jan 31 09:01:50 crc kubenswrapper[4732]: I0131 09:01:50.565800 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:50 crc kubenswrapper[4732]: I0131 09:01:50.565812 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:50 crc kubenswrapper[4732]: I0131 09:01:50.565848 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:50 crc kubenswrapper[4732]: I0131 09:01:50.565861 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:50Z","lastTransitionTime":"2026-01-31T09:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:50 crc kubenswrapper[4732]: I0131 09:01:50.668182 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:50 crc kubenswrapper[4732]: I0131 09:01:50.668246 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:50 crc kubenswrapper[4732]: I0131 09:01:50.668268 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:50 crc kubenswrapper[4732]: I0131 09:01:50.668298 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:50 crc kubenswrapper[4732]: I0131 09:01:50.668339 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:50Z","lastTransitionTime":"2026-01-31T09:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:50 crc kubenswrapper[4732]: I0131 09:01:50.772137 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:50 crc kubenswrapper[4732]: I0131 09:01:50.772219 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:50 crc kubenswrapper[4732]: I0131 09:01:50.772231 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:50 crc kubenswrapper[4732]: I0131 09:01:50.772252 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:50 crc kubenswrapper[4732]: I0131 09:01:50.772264 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:50Z","lastTransitionTime":"2026-01-31T09:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:50 crc kubenswrapper[4732]: I0131 09:01:50.876135 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:50 crc kubenswrapper[4732]: I0131 09:01:50.876191 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:50 crc kubenswrapper[4732]: I0131 09:01:50.876206 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:50 crc kubenswrapper[4732]: I0131 09:01:50.876228 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:50 crc kubenswrapper[4732]: I0131 09:01:50.876243 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:50Z","lastTransitionTime":"2026-01-31T09:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:50 crc kubenswrapper[4732]: I0131 09:01:50.978932 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:50 crc kubenswrapper[4732]: I0131 09:01:50.978990 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:50 crc kubenswrapper[4732]: I0131 09:01:50.979004 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:50 crc kubenswrapper[4732]: I0131 09:01:50.979020 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:50 crc kubenswrapper[4732]: I0131 09:01:50.979050 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:50Z","lastTransitionTime":"2026-01-31T09:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:51 crc kubenswrapper[4732]: I0131 09:01:51.082534 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:51 crc kubenswrapper[4732]: I0131 09:01:51.082611 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:51 crc kubenswrapper[4732]: I0131 09:01:51.082632 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:51 crc kubenswrapper[4732]: I0131 09:01:51.082697 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:51 crc kubenswrapper[4732]: I0131 09:01:51.082739 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:51Z","lastTransitionTime":"2026-01-31T09:01:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:51 crc kubenswrapper[4732]: I0131 09:01:51.185398 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:51 crc kubenswrapper[4732]: I0131 09:01:51.185462 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:51 crc kubenswrapper[4732]: I0131 09:01:51.185480 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:51 crc kubenswrapper[4732]: I0131 09:01:51.185506 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:51 crc kubenswrapper[4732]: I0131 09:01:51.185525 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:51Z","lastTransitionTime":"2026-01-31T09:01:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:51 crc kubenswrapper[4732]: I0131 09:01:51.288324 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:51 crc kubenswrapper[4732]: I0131 09:01:51.288470 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:51 crc kubenswrapper[4732]: I0131 09:01:51.288486 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:51 crc kubenswrapper[4732]: I0131 09:01:51.288504 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:51 crc kubenswrapper[4732]: I0131 09:01:51.288516 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:51Z","lastTransitionTime":"2026-01-31T09:01:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:51 crc kubenswrapper[4732]: I0131 09:01:51.392027 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:51 crc kubenswrapper[4732]: I0131 09:01:51.392094 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:51 crc kubenswrapper[4732]: I0131 09:01:51.392103 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:51 crc kubenswrapper[4732]: I0131 09:01:51.392123 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:51 crc kubenswrapper[4732]: I0131 09:01:51.392134 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:51Z","lastTransitionTime":"2026-01-31T09:01:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:51 crc kubenswrapper[4732]: I0131 09:01:51.496039 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:51 crc kubenswrapper[4732]: I0131 09:01:51.496100 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:51 crc kubenswrapper[4732]: I0131 09:01:51.496117 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:51 crc kubenswrapper[4732]: I0131 09:01:51.496138 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:51 crc kubenswrapper[4732]: I0131 09:01:51.496155 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:51Z","lastTransitionTime":"2026-01-31T09:01:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:51 crc kubenswrapper[4732]: I0131 09:01:51.511628 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-08 00:55:50.110015678 +0000 UTC Jan 31 09:01:51 crc kubenswrapper[4732]: I0131 09:01:51.542302 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:01:51 crc kubenswrapper[4732]: I0131 09:01:51.542384 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:01:51 crc kubenswrapper[4732]: E0131 09:01:51.542470 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:01:51 crc kubenswrapper[4732]: I0131 09:01:51.542496 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:01:51 crc kubenswrapper[4732]: I0131 09:01:51.542504 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7fgvm" Jan 31 09:01:51 crc kubenswrapper[4732]: E0131 09:01:51.542627 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:01:51 crc kubenswrapper[4732]: E0131 09:01:51.542767 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:01:51 crc kubenswrapper[4732]: E0131 09:01:51.543184 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7fgvm" podUID="3bd29a31-1a47-40da-afc5-6c4423067083" Jan 31 09:01:51 crc kubenswrapper[4732]: I0131 09:01:51.598521 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:51 crc kubenswrapper[4732]: I0131 09:01:51.598573 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:51 crc kubenswrapper[4732]: I0131 09:01:51.598585 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:51 crc kubenswrapper[4732]: I0131 09:01:51.598607 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:51 crc kubenswrapper[4732]: I0131 09:01:51.598622 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:51Z","lastTransitionTime":"2026-01-31T09:01:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:51 crc kubenswrapper[4732]: I0131 09:01:51.701193 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:51 crc kubenswrapper[4732]: I0131 09:01:51.701266 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:51 crc kubenswrapper[4732]: I0131 09:01:51.701277 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:51 crc kubenswrapper[4732]: I0131 09:01:51.701296 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:51 crc kubenswrapper[4732]: I0131 09:01:51.701308 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:51Z","lastTransitionTime":"2026-01-31T09:01:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:51 crc kubenswrapper[4732]: I0131 09:01:51.804380 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:51 crc kubenswrapper[4732]: I0131 09:01:51.804440 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:51 crc kubenswrapper[4732]: I0131 09:01:51.804456 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:51 crc kubenswrapper[4732]: I0131 09:01:51.804480 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:51 crc kubenswrapper[4732]: I0131 09:01:51.804497 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:51Z","lastTransitionTime":"2026-01-31T09:01:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:51 crc kubenswrapper[4732]: I0131 09:01:51.907214 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:51 crc kubenswrapper[4732]: I0131 09:01:51.907259 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:51 crc kubenswrapper[4732]: I0131 09:01:51.907287 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:51 crc kubenswrapper[4732]: I0131 09:01:51.907307 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:51 crc kubenswrapper[4732]: I0131 09:01:51.907317 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:51Z","lastTransitionTime":"2026-01-31T09:01:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:51 crc kubenswrapper[4732]: I0131 09:01:51.920926 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:01:51 crc kubenswrapper[4732]: I0131 09:01:51.934492 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1add5e75-7e0e-4248-9431-256dc477beeb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7bd726ccae341dfff351d4170276b74d4b6b4f34806b7e1fb23ab517201928d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd0e75b86e540472dde92c0b2c29d3af05e052a4e6a01fd03cd16cce663bd5da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6678cf179a4e13991aa6a082456b3497a4665e0999912b8002fb84a0b6ca2ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\
":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55d09d44253b8c77894abc55d80843b79509bf7213c0bf8b4d69059d53e76e5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:51Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:51 crc kubenswrapper[4732]: I0131 09:01:51.965134 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58b600ed-9e96-4296-bc41-dda5d2205921\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f14a98662959fe8132486d2f648fb443a407e90160ee8c6eac7e2a3d6835b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://771af18934041722ccdbde47137c83e9d93c5e7a07b1e9b26e18354bcd2fda4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2bea5befd94ee4d3b6501273e8b8365102a35cc07b97740fc893f8d70a9d134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9fd5264b5a2dcec6f8b363208d5654f2d71996
60f04be408ee7ab3913d542e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://194a43d5e1409abb031f1b950dfc753454b45008067d784642945ea0dfb087e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://382f1fa043705a0098e27ea34beeee0b88276aa355a7de07bc29b070949b81e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382f1fa043705a0098e27ea34beeee0b88276aa355a7de07bc29b070949b81e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b16d2732d6d6381045b88446a322622a7b1271f08438cb60e57c05c52ef7faf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b16d2732d6d6381045b88446a322622a7b1271f08438cb60e57c05c52ef7faf8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://273f5652d37e2be568bc388a7f9e96c66d0ce9201678b46736986815bd25d6bf\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://273f5652d37e2be568bc388a7f9e96c66d0ce9201678b46736986815bd25d6bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:51Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:51 crc kubenswrapper[4732]: I0131 09:01:51.983695 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62650e77e5971d617a77ae7461d51276fc76e3699dc4f86ebbb9b63ab0d86b60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:51Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.002353 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7423f7d2a82ae276cff0f5a2730b0de7abbc5b28a8bd8983d23fe44c2e9ca3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b075971e7f28f40b90caae8b3681dd283860371a6b3f82c92c421a87a27ccb0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-01-31T09:01:51Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.010509 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.010572 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.010587 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.010607 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.010621 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:52Z","lastTransitionTime":"2026-01-31T09:01:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.017923 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gchqk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0313609d-3507-4db5-a190-9dbf59d73e6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b303be38f64266f4e019d30e8b988945133fc76a47e305429ed048d68cdeac76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-phqth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8a2d9e1c8db61a2418e981340a7cb999983d4a10e79977507abdf3bdb471939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-phqth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gchqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:52Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.033723 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:52Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.048512 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:52Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.061431 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nsgpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"533741c8-f72a-4834-ad02-d33fc939e529\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec4f6f1a4ba8fc3975dc3ef7e86d5b17b0488de7918fe672040f115f640b8569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjff8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nsgpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-01-31T09:01:52Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.073891 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bllbs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"80d87332-eaea-4007-a03e-a9a0f744563a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a371d8cbedbc18b4953520c3b135e102db92fc57883d31a8f78c510498ca41b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcdpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bllbs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:52Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.096188 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8521ee1e84f540faa7b093ecd0bb7761ae38487a6c7abf52028fe820ea9a4c1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fde05e46badc11b2174b88e2590402436bb669bcbd017ceb4d11d25066b421b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b26417bd7817c984a59723e2231e3a43c554022aef67eacf300adee0166c3db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c79f404e5bc3d6dc83681d68605b1f26565813ccec34c6d36ee9f21686ca4f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://075418297d972e23003b3ce2016f7709d138316c9d08837dc484dadf4d7f4a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://792e702a9fa66ea68366e8f88d976f7e280223815c4b70258566c32faaed57a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9455be46c494ee301fbd246b34b851413405354f423be0f5ae070c7c1bb8263a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b67c56c55214b7fc95fc05a876435bf7a9d6e5de9dfc0a3acc0110e6779b818\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:01:39Z\\\",\\\"message\\\":\\\"all/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 09:01:39.556996 6044 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0131 09:01:39.557035 6044 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0131 09:01:39.557080 6044 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0131 09:01:39.557086 6044 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0131 09:01:39.557099 6044 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0131 09:01:39.557106 6044 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0131 09:01:39.557121 6044 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0131 09:01:39.557163 6044 factory.go:656] Stopping watch factory\\\\nI0131 09:01:39.557193 6044 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0131 09:01:39.557202 6044 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0131 09:01:39.557208 6044 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0131 09:01:39.557213 6044 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0131 09:01:39.557219 6044 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0131 09:01:39.557225 6044 handler.go:208] Removed *v1.Node event handler 2\\\\nI0131 09:01:39.557232 6044 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9455be46c494ee301fbd246b34b851413405354f423be0f5ae070c7c1bb8263a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:01:42Z\\\",\\\"message\\\":\\\"1 09:01:42.390171 6216 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 09:01:42.390285 6216 reflector.go:311] Stopping 
reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 09:01:42.390357 6216 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 09:01:42.390398 6216 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 09:01:42.390856 6216 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0131 09:01:42.390942 6216 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0131 09:01:42.391006 6216 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0131 09:01:42.391068 6216 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0131 09:01:42.391132 6216 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0131 09:01:42.391188 6216 factory.go:656] Stopping watch factory\\\\nI0131 09:01:42.391259 6216 ovnkube.go:599] Stopped ovnkube\\\\nI0131 09:01:42.390987 6216 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0131 09:01:42.391113 6216 handler.go:208] Removed *v1.Node event handler 7\\\\nI0131 09:01:42.391233 6216 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cd519944f0ae0ee9920a6a54de3dd40c73a65d0c907f625c57d7648a6d0a715\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb08df78497d5ae0eb39af9a06378ea80914d9b4d7014dd85d930f54c733193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfb08df78497d5ae0eb39af9a06378ea80914d9b4d7014dd85d930f54c733193\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8mtkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:52Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.114325 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1dc686f-fa2d-4cd4-95d4-874df2dc3a1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b5cbd45e357205f8bfc648396c0f684d6788343f687504feb45618d9082e5cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b8ba07937855cb1fd0e1db567f90e08848fca583b7174153918e2d6356d722\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ce42a35726dd1fc6aea0eb2e5baa490de04566cd6d6298185260e896a8def7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c276b6fdf2c47bfa38c4cf312bf6b893388c4291d306738fd6c31738cbd7483b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c05af9bd69451273c25ddddf6b11cdd7b8f9b6e3e16d18f1c4a962321905696e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0131 09:01:18.226954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:01:18.227544 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1858903392/tls.crt::/tmp/serving-cert-1858903392/tls.key\\\\\\\"\\\\nI0131 09:01:23.404020 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:01:23.413560 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:01:23.413612 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:01:23.413655 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:01:23.413760 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:01:23.423398 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 09:01:23.423434 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:01:23.423441 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:01:23.423448 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:01:23.423452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0131 09:01:23.423448 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 09:01:23.423457 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:01:23.423476 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 09:01:23.426339 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83c2048db4772c1a5f60b21340f773ab5c651045d4dac9edc15e91a19d15dce9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:52Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.114707 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.114734 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.114745 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.114763 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.114774 4732 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:52Z","lastTransitionTime":"2026-01-31T09:01:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.127014 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:52Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.137380 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4mxsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e23192f-14db-41ef-af89-4a76e325d9c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6af50b44afec0e5f0757efda1ff5567845c4482609a82bafcbf74947c657c56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsnx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4mxsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:52Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.149640 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t9kqf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2e6e0f4-2302-447f-a5e0-7db3d7b73cb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://038ff30d8a53b94c204f5dc0c72824fcb7f08a423eec2ce05554f23607d4dc7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\
"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c71dc89758bff193a85ceeaa97b21ba6c93389ef9cf380bd0bac89bddddf014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c71dc89758bff193a85ceeaa97b21ba6c93389ef9cf380bd0bac89bddddf014\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4374e8014d4e67ab8df14785ed6d1bd78b33567550758297fcc04796ea5c688\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4374e8014d4e67ab8df14785ed6d1bd78b33567550758297fcc04796ea5c688\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85484d53f8180c3bc83391004e8a7c6b432af09f04f597567e8ffe0b9c2e3d58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85484d53f8180c3bc83391004e8a7c6b432af09f04f597567e8ffe0b9c2e3d58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:31Z\\\",\\\"reason\\\":\
\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://111b50c40cf72f105ae5c74d668673853a783050a62e89127fa666a6f8d431d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://111b50c40cf72f105ae5c74d668673853a783050a62e89127fa666a6f8d431d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2458247a5e615b929a15e280bb9d5a65c46a991ea2cd83502a8c42b1ca0e299b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2458247a5e615b929a15e280bb9d5a65c46a991ea2cd83502a8c42b1ca0e299b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a6ff3cd5564994b28608dad1cd0848c771db950208888ac8a2213fa5dc3a168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\
",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a6ff3cd5564994b28608dad1cd0848c771db950208888ac8a2213fa5dc3a168\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t9kqf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:52Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.158379 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7fgvm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bd29a31-1a47-40da-afc5-6c4423067083\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47jtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47jtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7fgvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:52Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.172579 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c063f211d6d06c3f8b94d316d68da346d077c6e85197754ec7c77ad736fd1217\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:52Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.183807 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d790207-d357-4b47-87bf-5b505e061820\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eaa2b856e1dedd7398d0c513f742fa5925c4748bf44eb15022adbb4b62354c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2bw2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad1f906b34fddd34efd9d099479cd1fe7404da4ab11f3f571f9e3120167505a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2bw2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jnbt8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:52Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.217736 4732 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.217780 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.217790 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.217805 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.217814 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:52Z","lastTransitionTime":"2026-01-31T09:01:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.320584 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.320626 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.320635 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.320650 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.320659 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:52Z","lastTransitionTime":"2026-01-31T09:01:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.423380 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.423425 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.423434 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.423455 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.423472 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:52Z","lastTransitionTime":"2026-01-31T09:01:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.512572 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 03:46:07.409834742 +0000 UTC Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.526477 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.526526 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.526543 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.526568 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.526585 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:52Z","lastTransitionTime":"2026-01-31T09:01:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.574305 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58b600ed-9e96-4296-bc41-dda5d2205921\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f14a98662959fe8132486d2f648fb443a407e90160ee8c6eac7e2a3d6835b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://771af18934041722ccdbde47137c83e9d
93c5e7a07b1e9b26e18354bcd2fda4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2bea5befd94ee4d3b6501273e8b8365102a35cc07b97740fc893f8d70a9d134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9fd5264b5a2dcec6f8b363208d5654f2d7199660f04be408ee7ab3913d542e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://194a43d5e1409abb031f1b950dfc753454b45008067d784642945ea0dfb087e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://382f1fa043705a0098e27ea34beeee0b88276aa355a7de07bc29b070949b81e2\\\",\\\"image\\\":\\\"quay.io/o
penshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382f1fa043705a0098e27ea34beeee0b88276aa355a7de07bc29b070949b81e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b16d2732d6d6381045b88446a322622a7b1271f08438cb60e57c05c52ef7faf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b16d2732d6d6381045b88446a322622a7b1271f08438cb60e57c05c52ef7faf8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://273f5652d37e2be568bc388a7f9e96c66d0ce9201678b46736986815bd25d6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://273f5652d37e2be568bc388a7f9e96c66d0ce9201678b46736986815bd25d6bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:52Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.593363 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62650e77e5971d617a77ae7461d51276fc76e3699dc4f86ebbb9b63ab0d86b60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:52Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.611743 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7423f7d2a82ae276cff0f5a2730b0de7abbc5b28a8bd8983d23fe44c2e9ca3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b075971e7f28f40b90caae8b3681dd283860371a6b3f82c92c421a87a27ccb0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:52Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.626694 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gchqk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0313609d-3507-4db5-a190-9dbf59d73e6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b303be38f64266f4e019d30e8b988945133fc76a47e305429ed048d68cdeac76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-phqth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8a2d9e1c8db61a2418e981340a7cb999983d4a10e79977507abdf3bdb471939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-phqth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gchqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:52Z is after 2025-08-24T17:21:41Z" Jan 31 
09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.629730 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.629847 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.629892 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.629920 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.629936 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:52Z","lastTransitionTime":"2026-01-31T09:01:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.637028 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.640428 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1add5e75-7e0e-4248-9431-256dc477beeb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7bd726ccae341dfff351d4170276b74d4b6b4f34806b7e1fb23ab517201928d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd0e75b86e540472dde92c0b2c29d3af05e052a4e6a01fd03cd16cce663bd5da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6678cf179a4e13991aa6a082456b3497a4665e0999912b8002fb84a0b6ca2ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55d09d44253b8c77894abc55d80843b79509bf7213c0bf8b4d69059d53e76e5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:52Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.647099 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.657257 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:52Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.670467 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nsgpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"533741c8-f72a-4834-ad02-d33fc939e529\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec4f6f1a4ba8fc3975dc3ef7e86d5b17b0488de7918fe672040f115f640b8569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjff8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nsgpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:52Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.685380 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bllbs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80d87332-eaea-4007-a03e-a9a0f744563a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a371d8cbedbc18b4953520c3b135e102db92fc57883d31a8f78c510498ca41b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcdpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bllbs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:52Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.705444 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8521ee1e84f540faa7b093ecd0bb7761ae38487a6c7abf52028fe820ea9a4c1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fde05e46badc11b2174b88e2590402436bb669bcbd017ceb4d11d25066b421b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b26417bd7817c984a59723e2231e3a43c554022aef67eacf300adee0166c3db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c79f404e5bc3d6dc83681d68605b1f26565813ccec34c6d36ee9f21686ca4f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://075418297d972e23003b3ce2016f7709d138316c9d08837dc484dadf4d7f4a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://792e702a9fa66ea68366e8f88d976f7e280223815c4b70258566c32faaed57a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9455be46c494ee301fbd246b34b851413405354f423be0f5ae070c7c1bb8263a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b67c56c55214b7fc95fc05a876435bf7a9d6e5de9dfc0a3acc0110e6779b818\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:01:39Z\\\",\\\"message\\\":\\\"all/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 09:01:39.556996 6044 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0131 09:01:39.557035 6044 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0131 09:01:39.557080 6044 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0131 09:01:39.557086 6044 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0131 09:01:39.557099 6044 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0131 09:01:39.557106 6044 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0131 09:01:39.557121 6044 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0131 09:01:39.557163 6044 factory.go:656] Stopping watch factory\\\\nI0131 09:01:39.557193 6044 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0131 09:01:39.557202 6044 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0131 09:01:39.557208 6044 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0131 09:01:39.557213 6044 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0131 09:01:39.557219 6044 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0131 09:01:39.557225 6044 handler.go:208] Removed *v1.Node event handler 2\\\\nI0131 09:01:39.557232 6044 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9455be46c494ee301fbd246b34b851413405354f423be0f5ae070c7c1bb8263a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:01:42Z\\\",\\\"message\\\":\\\"1 09:01:42.390171 6216 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 09:01:42.390285 6216 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 09:01:42.390357 6216 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 09:01:42.390398 6216 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 09:01:42.390856 6216 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0131 09:01:42.390942 6216 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0131 09:01:42.391006 6216 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0131 09:01:42.391068 6216 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0131 09:01:42.391132 6216 handler.go:190] Sending *v1.Pod 
event handler 6 for removal\\\\nI0131 09:01:42.391188 6216 factory.go:656] Stopping watch factory\\\\nI0131 09:01:42.391259 6216 ovnkube.go:599] Stopped ovnkube\\\\nI0131 09:01:42.390987 6216 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0131 09:01:42.391113 6216 handler.go:208] Removed *v1.Node event handler 7\\\\nI0131 09:01:42.391233 6216 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cd519944f0ae0ee9920a6a54de3dd40c73a65d0c907f625c57d7648a6d0a715\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\
\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb08df78497d5ae0eb39af9a06378ea80914d9b4d7014dd85d930f54c733193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfb08df78497d5ae0eb39af9a06378ea80914d9b4d7014dd85d930f54c733193\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8mtkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:52Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.720435 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:52Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.732653 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.732736 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.732880 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.732938 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.732961 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:52Z","lastTransitionTime":"2026-01-31T09:01:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.735294 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:52Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.750655 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4mxsr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e23192f-14db-41ef-af89-4a76e325d9c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6af50b44afec0e5f0757efda1ff5567845c4482609a82bafcbf74947c657c56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsnx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4mxsr\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:52Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.765397 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t9kqf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2e6e0f4-2302-447f-a5e0-7db3d7b73cb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://038ff30d8a53b94c204f5dc0c72824fcb7f08a423eec2ce05554f23607d4dc7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c71dc89758bff193a85ceeaa97b21ba6c93389ef9cf380bd0bac89bddddf014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c71dc89758bff193a85ceeaa97b21ba6c93389ef9cf380bd0bac89bddddf014\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4374e8014d4e67ab8df14785ed6d1bd78b33567550758297fcc04796ea5c688\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4374e8014d4e67ab8df14785ed6d1bd78b33567550758297fcc04796ea5c688\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85484d53f8180c3bc83391004e8a7c6b432af09f04f597567e8ffe0b9c2e3d58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85484d53f8180c3bc83391004e8a7c6b432af09f04f597567e8ffe0b9c2e3d58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://111b50c40cf72f105ae5c74d668673853a783050a62e89127fa666a6f8d431d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://111b50c40cf72f105ae5c74d668673853a783050a62e89127fa666a6f8d431d0\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2026-01-31T09:01:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2458247a5e615b929a15e280bb9d5a65c46a991ea2cd83502a8c42b1ca0e299b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2458247a5e615b929a15e280bb9d5a65c46a991ea2cd83502a8c42b1ca0e299b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a6ff3cd5564994b28608dad1cd0848c771db950208888ac8a2213fa5dc3a168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a6ff3cd5564994b28608dad1cd0848c771db950208888ac8a2213fa5dc3a168\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t9kqf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-31T09:01:52Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.775580 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7fgvm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bd29a31-1a47-40da-afc5-6c4423067083\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47jtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47jtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7fgvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:52Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.788410 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1dc686f-fa2d-4cd4-95d4-874df2dc3a1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b5cbd45e357205f8bfc648396c0f684d6788343f687504feb45618d9082e5cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b8ba07937855cb1fd0e1db567f90e08848fca583b7174153918e2d6356d722\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ce42a35726dd1fc6aea0eb2e5baa490de04566cd6d6298185260e896a8def7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c276b6fdf2c47bfa38c4cf312bf6b893388c4291d306738fd6c31738cbd7483b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserve
r-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c05af9bd69451273c25ddddf6b11cdd7b8f9b6e3e16d18f1c4a962321905696e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0131 09:01:18.226954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:01:18.227544 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1858903392/tls.crt::/tmp/serving-cert-1858903392/tls.key\\\\\\\"\\\\nI0131 09:01:23.404020 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:01:23.413560 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:01:23.413612 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:01:23.413655 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:01:23.413760 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:01:23.423398 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 09:01:23.423434 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:01:23.423441 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:01:23.423448 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:01:23.423452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0131 09:01:23.423448 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 09:01:23.423457 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:01:23.423476 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 09:01:23.426339 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83c2048db4772c1a5f60b21340f773ab5c651045d4dac9edc15e91a19d15dce9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:52Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.798535 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d790207-d357-4b47-87bf-5b505e061820\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eaa2b856e1dedd7398d0c513f742fa5925c4748bf44eb15022adbb4b62354c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2bw2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad1f906b34fddd34efd9d099479cd1fe7404da4ab11f3f571f9e3120167505a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2bw2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jnbt8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:52Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.811012 4732 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c063f211d6d06c3f8b94d316d68da346d077c6e85197754ec7c77ad736fd1217\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:52Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.823420 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c063f211d6d06c3f8b94d316d68da346d077c6e85197754ec7c77ad736fd1217\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:52Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.834159 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d790207-d357-4b47-87bf-5b505e061820\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eaa2b856e1dedd7398d0c513f742fa5925c4748bf44eb15022adbb4b62354c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2bw2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad1f906b34fddd34efd9d099479cd1fe7404da4ab11f3f571f9e3120167505a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2bw2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jnbt8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:52Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.835791 4732 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.835858 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.835872 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.835895 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.835909 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:52Z","lastTransitionTime":"2026-01-31T09:01:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.845820 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1add5e75-7e0e-4248-9431-256dc477beeb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7bd726ccae341dfff351d4170276b74d4b6b4f34806b7e1fb23ab517201928d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd0e75b86e540472dde92c0b2c29d3af05e052a4e6a01fd03cd16cce663bd5da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\
\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6678cf179a4e13991aa6a082456b3497a4665e0999912b8002fb84a0b6ca2ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55d09d44253b8c77894abc55d80843b79509bf7213c0bf8b4d69059d53e76e5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:52Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.856574 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bb621da-40b8-4a07-a7bd-06800007bc59\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dadc6987ef6ef2593fa8aa0ec3903dc3cfea907f73fb68dca2c141362577bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e897d26ac7b103c21a9cb176f1a90a11098a9a2f6a4cd28f697a90ee7c9f6f9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a31769db7f40442ff410b053055e413b11d7dba7d48dfce853ca38e7b9f7595e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://407516efcd2436b964fddea3bdc778826cde289422139bc4577c9ba8c0c43675\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://407516efcd2436b964fddea3bdc778826cde289422139bc4577c9ba8c0c43675\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:52Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.877306 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58b600ed-9e96-4296-bc41-dda5d2205921\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f14a98662959fe8132486d2f648fb443a407e90160ee8c6eac7e2a3d6835b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://771af18934041722ccdbde47137c83e9d93c5e7a07b1e9b26e18354bcd2fda4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2bea5befd94ee4d3b6501273e8b8365102a35cc07b97740fc893f8d70a9d134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9fd5264b5a2dcec6f8b363208d5654f2d7199660f04be408ee7ab3913d542e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://194a43d5e1409abb031f1b950dfc753454b45008067d784642945ea0dfb087e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://382f1fa043705a0098e27ea34beeee0b88276aa355a7de07bc29b070949b81e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be
8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382f1fa043705a0098e27ea34beeee0b88276aa355a7de07bc29b070949b81e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b16d2732d6d6381045b88446a322622a7b1271f08438cb60e57c05c52ef7faf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b16d2732d6d6381045b88446a322622a7b1271f08438cb60e57c05c52ef7faf8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://273f5652d37e2be568bc388a7f9e96c66d0ce9201678b46736986815bd25d6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://273f5652d37e2be568bc388a7f9e96c66d0ce9201678b46736986815bd25d6bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:52Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.888696 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62650e77e5971d617a77ae7461d51276fc76e3699dc4f86ebbb9b63ab0d86b60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:52Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.900803 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7423f7d2a82ae276cff0f5a2730b0de7abbc5b28a8bd8983d23fe44c2e9ca3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b075971e7f28f40b90caae8b3681dd283860371a6b3f82c92c421a87a27ccb0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:52Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.912113 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gchqk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0313609d-3507-4db5-a190-9dbf59d73e6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b303be38f64266f4e019d30e8b988945133fc76a47e305429ed048d68cdeac76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-phqth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8a2d9e1c8db61a2418e981340a7cb999983d4a10e79977507abdf3bdb471939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-phqth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gchqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:52Z is after 2025-08-24T17:21:41Z" Jan 31 
09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.924193 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:52Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.935157 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:52Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.938784 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.938848 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.938862 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.938878 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.938888 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:52Z","lastTransitionTime":"2026-01-31T09:01:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.944774 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nsgpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"533741c8-f72a-4834-ad02-d33fc939e529\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec4f6f1a4ba8fc3975dc3ef7e86d5b17b0488de7918fe672040f115f640b8569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjff8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nsgpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:52Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.955460 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bllbs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80d87332-eaea-4007-a03e-a9a0f744563a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a371d8cbedbc18b4953520c3b135e102db92fc57883d31a8f78c510498ca41b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcdpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bllbs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:52Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.975060 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8521ee1e84f540faa7b093ecd0bb7761ae38487a6c7abf52028fe820ea9a4c1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fde05e46badc11b2174b88e2590402436bb669bcbd017ceb4d11d25066b421b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b26417bd7817c984a59723e2231e3a43c554022aef67eacf300adee0166c3db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c79f404e5bc3d6dc83681d68605b1f26565813ccec34c6d36ee9f21686ca4f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://075418297d972e23003b3ce2016f7709d138316c9d08837dc484dadf4d7f4a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://792e702a9fa66ea68366e8f88d976f7e280223815c4b70258566c32faaed57a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9455be46c494ee301fbd246b34b851413405354f423be0f5ae070c7c1bb8263a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4b67c56c55214b7fc95fc05a876435bf7a9d6e5de9dfc0a3acc0110e6779b818\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:01:39Z\\\",\\\"message\\\":\\\"all/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 09:01:39.556996 6044 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0131 09:01:39.557035 6044 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0131 09:01:39.557080 6044 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0131 09:01:39.557086 6044 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0131 09:01:39.557099 6044 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0131 09:01:39.557106 6044 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0131 09:01:39.557121 6044 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0131 09:01:39.557163 6044 factory.go:656] Stopping watch factory\\\\nI0131 09:01:39.557193 6044 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0131 09:01:39.557202 6044 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0131 09:01:39.557208 6044 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0131 09:01:39.557213 6044 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0131 09:01:39.557219 6044 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0131 09:01:39.557225 6044 handler.go:208] Removed *v1.Node event handler 2\\\\nI0131 09:01:39.557232 6044 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9455be46c494ee301fbd246b34b851413405354f423be0f5ae070c7c1bb8263a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:01:42Z\\\",\\\"message\\\":\\\"1 09:01:42.390171 6216 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 09:01:42.390285 6216 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 09:01:42.390357 6216 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 09:01:42.390398 6216 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 09:01:42.390856 6216 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0131 09:01:42.390942 6216 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0131 09:01:42.391006 6216 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0131 09:01:42.391068 6216 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0131 09:01:42.391132 6216 handler.go:190] Sending *v1.Pod 
event handler 6 for removal\\\\nI0131 09:01:42.391188 6216 factory.go:656] Stopping watch factory\\\\nI0131 09:01:42.391259 6216 ovnkube.go:599] Stopped ovnkube\\\\nI0131 09:01:42.390987 6216 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0131 09:01:42.391113 6216 handler.go:208] Removed *v1.Node event handler 7\\\\nI0131 09:01:42.391233 6216 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cd519944f0ae0ee9920a6a54de3dd40c73a65d0c907f625c57d7648a6d0a715\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\
\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb08df78497d5ae0eb39af9a06378ea80914d9b4d7014dd85d930f54c733193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfb08df78497d5ae0eb39af9a06378ea80914d9b4d7014dd85d930f54c733193\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8mtkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:52Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.988419 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1dc686f-fa2d-4cd4-95d4-874df2dc3a1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b5cbd45e357205f8bfc648396c0f684d6788343f687504feb45618d9082e5cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b8ba07937855cb1fd0e1db567f90e08848fca583b7174153918e2d6356d722\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ce42a35726dd1fc6aea0eb2e5baa490de04566cd6d6298185260e896a8def7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c276b6fdf2c47bfa38c4cf312bf6b893388c4291d306738fd6c31738cbd7483b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c05af9bd69451273c25ddddf6b11cdd7b8f9b6e3e16d18f1c4a962321905696e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0131 09:01:18.226954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:01:18.227544 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1858903392/tls.crt::/tmp/serving-cert-1858903392/tls.key\\\\\\\"\\\\nI0131 09:01:23.404020 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:01:23.413560 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:01:23.413612 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:01:23.413655 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:01:23.413760 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:01:23.423398 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 09:01:23.423434 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:01:23.423441 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:01:23.423448 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:01:23.423452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0131 09:01:23.423448 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 09:01:23.423457 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:01:23.423476 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 09:01:23.426339 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83c2048db4772c1a5f60b21340f773ab5c651045d4dac9edc15e91a19d15dce9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:52Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:52 crc kubenswrapper[4732]: I0131 09:01:52.999137 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:52Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:53 crc kubenswrapper[4732]: I0131 09:01:53.009919 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4mxsr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e23192f-14db-41ef-af89-4a76e325d9c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6af50b44afec0e5f0757efda1ff5567845c4482609a82bafcbf74947c657c56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsnx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4mxsr\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:53Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:53 crc kubenswrapper[4732]: I0131 09:01:53.022555 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t9kqf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2e6e0f4-2302-447f-a5e0-7db3d7b73cb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://038ff30d8a53b94c204f5dc0c72824fcb7f08a423eec2ce05554f23607d4dc7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c71dc89758bff193a85ceeaa97b21ba6c93389ef9cf380bd0bac89bddddf014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c71dc89758bff193a85ceeaa97b21ba6c93389ef9cf380bd0bac89bddddf014\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4374e8014d4e67ab8df14785ed6d1bd78b33567550758297fcc04796ea5c688\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4374e8014d4e67ab8df14785ed6d1bd78b33567550758297fcc04796ea5c688\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85484d53f8180c3bc83391004e8a7c6b432af09f04f597567e8ffe0b9c2e3d58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85484d53f8180c3bc83391004e8a7c6b432af09f04f597567e8ffe0b9c2e3d58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://111b50c40cf72f105ae5c74d668673853a783050a62e89127fa666a6f8d431d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://111b50c40cf72f105ae5c74d668673853a783050a62e89127fa666a6f8d431d0\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2026-01-31T09:01:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2458247a5e615b929a15e280bb9d5a65c46a991ea2cd83502a8c42b1ca0e299b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2458247a5e615b929a15e280bb9d5a65c46a991ea2cd83502a8c42b1ca0e299b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a6ff3cd5564994b28608dad1cd0848c771db950208888ac8a2213fa5dc3a168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a6ff3cd5564994b28608dad1cd0848c771db950208888ac8a2213fa5dc3a168\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t9kqf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-31T09:01:53Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:53 crc kubenswrapper[4732]: I0131 09:01:53.038539 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7fgvm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bd29a31-1a47-40da-afc5-6c4423067083\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47jtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47jtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7fgvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:53Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:53 crc kubenswrapper[4732]: I0131 09:01:53.041361 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:53 crc 
kubenswrapper[4732]: I0131 09:01:53.041430 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:53 crc kubenswrapper[4732]: I0131 09:01:53.041452 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:53 crc kubenswrapper[4732]: I0131 09:01:53.041481 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:53 crc kubenswrapper[4732]: I0131 09:01:53.041501 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:53Z","lastTransitionTime":"2026-01-31T09:01:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:53 crc kubenswrapper[4732]: I0131 09:01:53.144343 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:53 crc kubenswrapper[4732]: I0131 09:01:53.144410 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:53 crc kubenswrapper[4732]: I0131 09:01:53.144421 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:53 crc kubenswrapper[4732]: I0131 09:01:53.144441 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:53 crc kubenswrapper[4732]: I0131 09:01:53.144462 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:53Z","lastTransitionTime":"2026-01-31T09:01:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:53 crc kubenswrapper[4732]: I0131 09:01:53.248151 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:53 crc kubenswrapper[4732]: I0131 09:01:53.248239 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:53 crc kubenswrapper[4732]: I0131 09:01:53.248274 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:53 crc kubenswrapper[4732]: I0131 09:01:53.248304 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:53 crc kubenswrapper[4732]: I0131 09:01:53.248326 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:53Z","lastTransitionTime":"2026-01-31T09:01:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:53 crc kubenswrapper[4732]: I0131 09:01:53.350880 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:53 crc kubenswrapper[4732]: I0131 09:01:53.350988 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:53 crc kubenswrapper[4732]: I0131 09:01:53.351713 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:53 crc kubenswrapper[4732]: I0131 09:01:53.351798 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:53 crc kubenswrapper[4732]: I0131 09:01:53.351813 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:53Z","lastTransitionTime":"2026-01-31T09:01:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:53 crc kubenswrapper[4732]: I0131 09:01:53.454641 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:53 crc kubenswrapper[4732]: I0131 09:01:53.454715 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:53 crc kubenswrapper[4732]: I0131 09:01:53.454727 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:53 crc kubenswrapper[4732]: I0131 09:01:53.454746 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:53 crc kubenswrapper[4732]: I0131 09:01:53.454759 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:53Z","lastTransitionTime":"2026-01-31T09:01:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:53 crc kubenswrapper[4732]: I0131 09:01:53.513348 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 13:05:17.022091752 +0000 UTC Jan 31 09:01:53 crc kubenswrapper[4732]: I0131 09:01:53.541742 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:01:53 crc kubenswrapper[4732]: I0131 09:01:53.541792 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:01:53 crc kubenswrapper[4732]: I0131 09:01:53.541809 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7fgvm" Jan 31 09:01:53 crc kubenswrapper[4732]: I0131 09:01:53.541747 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:01:53 crc kubenswrapper[4732]: E0131 09:01:53.541918 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:01:53 crc kubenswrapper[4732]: E0131 09:01:53.542023 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:01:53 crc kubenswrapper[4732]: E0131 09:01:53.542217 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7fgvm" podUID="3bd29a31-1a47-40da-afc5-6c4423067083" Jan 31 09:01:53 crc kubenswrapper[4732]: E0131 09:01:53.542374 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:01:53 crc kubenswrapper[4732]: I0131 09:01:53.557525 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:53 crc kubenswrapper[4732]: I0131 09:01:53.557570 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:53 crc kubenswrapper[4732]: I0131 09:01:53.557581 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:53 crc kubenswrapper[4732]: I0131 09:01:53.557601 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:53 crc kubenswrapper[4732]: I0131 09:01:53.557613 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:53Z","lastTransitionTime":"2026-01-31T09:01:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:53 crc kubenswrapper[4732]: I0131 09:01:53.660962 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:53 crc kubenswrapper[4732]: I0131 09:01:53.661012 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:53 crc kubenswrapper[4732]: I0131 09:01:53.661024 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:53 crc kubenswrapper[4732]: I0131 09:01:53.661043 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:53 crc kubenswrapper[4732]: I0131 09:01:53.661058 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:53Z","lastTransitionTime":"2026-01-31T09:01:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:53 crc kubenswrapper[4732]: I0131 09:01:53.765251 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:53 crc kubenswrapper[4732]: I0131 09:01:53.765301 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:53 crc kubenswrapper[4732]: I0131 09:01:53.765311 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:53 crc kubenswrapper[4732]: I0131 09:01:53.765328 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:53 crc kubenswrapper[4732]: I0131 09:01:53.765340 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:53Z","lastTransitionTime":"2026-01-31T09:01:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:53 crc kubenswrapper[4732]: I0131 09:01:53.868566 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:53 crc kubenswrapper[4732]: I0131 09:01:53.868633 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:53 crc kubenswrapper[4732]: I0131 09:01:53.868648 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:53 crc kubenswrapper[4732]: I0131 09:01:53.868690 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:53 crc kubenswrapper[4732]: I0131 09:01:53.868707 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:53Z","lastTransitionTime":"2026-01-31T09:01:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:53 crc kubenswrapper[4732]: I0131 09:01:53.972077 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:53 crc kubenswrapper[4732]: I0131 09:01:53.972126 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:53 crc kubenswrapper[4732]: I0131 09:01:53.972138 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:53 crc kubenswrapper[4732]: I0131 09:01:53.972160 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:53 crc kubenswrapper[4732]: I0131 09:01:53.972173 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:53Z","lastTransitionTime":"2026-01-31T09:01:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:54 crc kubenswrapper[4732]: I0131 09:01:54.074553 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:54 crc kubenswrapper[4732]: I0131 09:01:54.074588 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:54 crc kubenswrapper[4732]: I0131 09:01:54.074597 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:54 crc kubenswrapper[4732]: I0131 09:01:54.074613 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:54 crc kubenswrapper[4732]: I0131 09:01:54.074625 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:54Z","lastTransitionTime":"2026-01-31T09:01:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:54 crc kubenswrapper[4732]: I0131 09:01:54.177298 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:54 crc kubenswrapper[4732]: I0131 09:01:54.177329 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:54 crc kubenswrapper[4732]: I0131 09:01:54.177337 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:54 crc kubenswrapper[4732]: I0131 09:01:54.177354 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:54 crc kubenswrapper[4732]: I0131 09:01:54.177363 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:54Z","lastTransitionTime":"2026-01-31T09:01:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:54 crc kubenswrapper[4732]: I0131 09:01:54.280267 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:54 crc kubenswrapper[4732]: I0131 09:01:54.280299 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:54 crc kubenswrapper[4732]: I0131 09:01:54.280309 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:54 crc kubenswrapper[4732]: I0131 09:01:54.280323 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:54 crc kubenswrapper[4732]: I0131 09:01:54.280332 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:54Z","lastTransitionTime":"2026-01-31T09:01:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:54 crc kubenswrapper[4732]: I0131 09:01:54.382867 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:54 crc kubenswrapper[4732]: I0131 09:01:54.382914 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:54 crc kubenswrapper[4732]: I0131 09:01:54.382930 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:54 crc kubenswrapper[4732]: I0131 09:01:54.382949 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:54 crc kubenswrapper[4732]: I0131 09:01:54.382964 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:54Z","lastTransitionTime":"2026-01-31T09:01:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:54 crc kubenswrapper[4732]: I0131 09:01:54.485959 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:54 crc kubenswrapper[4732]: I0131 09:01:54.486009 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:54 crc kubenswrapper[4732]: I0131 09:01:54.486020 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:54 crc kubenswrapper[4732]: I0131 09:01:54.486038 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:54 crc kubenswrapper[4732]: I0131 09:01:54.486050 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:54Z","lastTransitionTime":"2026-01-31T09:01:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 31 09:01:54 crc kubenswrapper[4732]: I0131 09:01:54.514319 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 09:37:45.134939385 +0000 UTC
Jan 31 09:01:54 crc kubenswrapper[4732]: I0131 09:01:54.588947 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 09:01:54 crc kubenswrapper[4732]: I0131 09:01:54.588992 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 09:01:54 crc kubenswrapper[4732]: I0131 09:01:54.589005 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 09:01:54 crc kubenswrapper[4732]: I0131 09:01:54.589024 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 09:01:54 crc kubenswrapper[4732]: I0131 09:01:54.589035 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:54Z","lastTransitionTime":"2026-01-31T09:01:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 31 09:01:54 crc kubenswrapper[4732]: I0131 09:01:54.691733 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 09:01:54 crc kubenswrapper[4732]: I0131 09:01:54.691772 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 09:01:54 crc kubenswrapper[4732]: I0131 09:01:54.691782 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 09:01:54 crc kubenswrapper[4732]: I0131 09:01:54.691799 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 09:01:54 crc kubenswrapper[4732]: I0131 09:01:54.691812 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:54Z","lastTransitionTime":"2026-01-31T09:01:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 31 09:01:54 crc kubenswrapper[4732]: I0131 09:01:54.794158 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 09:01:54 crc kubenswrapper[4732]: I0131 09:01:54.794204 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 09:01:54 crc kubenswrapper[4732]: I0131 09:01:54.794215 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 09:01:54 crc kubenswrapper[4732]: I0131 09:01:54.794235 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 09:01:54 crc kubenswrapper[4732]: I0131 09:01:54.794248 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:54Z","lastTransitionTime":"2026-01-31T09:01:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 31 09:01:54 crc kubenswrapper[4732]: I0131 09:01:54.896508 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 09:01:54 crc kubenswrapper[4732]: I0131 09:01:54.896564 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 09:01:54 crc kubenswrapper[4732]: I0131 09:01:54.896581 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 09:01:54 crc kubenswrapper[4732]: I0131 09:01:54.896604 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 09:01:54 crc kubenswrapper[4732]: I0131 09:01:54.896621 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:54Z","lastTransitionTime":"2026-01-31T09:01:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 31 09:01:54 crc kubenswrapper[4732]: I0131 09:01:54.999598 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 09:01:54 crc kubenswrapper[4732]: I0131 09:01:54.999648 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 09:01:54 crc kubenswrapper[4732]: I0131 09:01:54.999679 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 09:01:54 crc kubenswrapper[4732]: I0131 09:01:54.999698 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 09:01:54 crc kubenswrapper[4732]: I0131 09:01:54.999709 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:54Z","lastTransitionTime":"2026-01-31T09:01:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 31 09:01:55 crc kubenswrapper[4732]: I0131 09:01:55.103310 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 09:01:55 crc kubenswrapper[4732]: I0131 09:01:55.103373 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 09:01:55 crc kubenswrapper[4732]: I0131 09:01:55.103388 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 09:01:55 crc kubenswrapper[4732]: I0131 09:01:55.103412 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 09:01:55 crc kubenswrapper[4732]: I0131 09:01:55.103427 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:55Z","lastTransitionTime":"2026-01-31T09:01:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 31 09:01:55 crc kubenswrapper[4732]: I0131 09:01:55.206139 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 09:01:55 crc kubenswrapper[4732]: I0131 09:01:55.206179 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 09:01:55 crc kubenswrapper[4732]: I0131 09:01:55.206188 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 09:01:55 crc kubenswrapper[4732]: I0131 09:01:55.206203 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 09:01:55 crc kubenswrapper[4732]: I0131 09:01:55.206213 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:55Z","lastTransitionTime":"2026-01-31T09:01:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 31 09:01:55 crc kubenswrapper[4732]: I0131 09:01:55.308627 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 09:01:55 crc kubenswrapper[4732]: I0131 09:01:55.308682 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 09:01:55 crc kubenswrapper[4732]: I0131 09:01:55.308695 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 09:01:55 crc kubenswrapper[4732]: I0131 09:01:55.308711 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 09:01:55 crc kubenswrapper[4732]: I0131 09:01:55.308722 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:55Z","lastTransitionTime":"2026-01-31T09:01:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 31 09:01:55 crc kubenswrapper[4732]: I0131 09:01:55.385208 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 31 09:01:55 crc kubenswrapper[4732]: I0131 09:01:55.385347 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 31 09:01:55 crc kubenswrapper[4732]: E0131 09:01:55.385400 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:02:27.385363088 +0000 UTC m=+85.691239292 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 31 09:01:55 crc kubenswrapper[4732]: E0131 09:01:55.385439 4732 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Jan 31 09:01:55 crc kubenswrapper[4732]: I0131 09:01:55.385506 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 31 09:01:55 crc kubenswrapper[4732]: E0131 09:01:55.385512 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-31 09:02:27.385492713 +0000 UTC m=+85.691368997 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Jan 31 09:01:55 crc kubenswrapper[4732]: E0131 09:01:55.385644 4732 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Jan 31 09:01:55 crc kubenswrapper[4732]: E0131 09:01:55.385707 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-31 09:02:27.38569601 +0000 UTC m=+85.691572214 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Jan 31 09:01:55 crc kubenswrapper[4732]: I0131 09:01:55.410871 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 09:01:55 crc kubenswrapper[4732]: I0131 09:01:55.410944 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 09:01:55 crc kubenswrapper[4732]: I0131 09:01:55.410970 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 09:01:55 crc kubenswrapper[4732]: I0131 09:01:55.410998 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 09:01:55 crc kubenswrapper[4732]: I0131 09:01:55.411016 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:55Z","lastTransitionTime":"2026-01-31T09:01:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 31 09:01:55 crc kubenswrapper[4732]: I0131 09:01:55.487055 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 31 09:01:55 crc kubenswrapper[4732]: I0131 09:01:55.487132 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 31 09:01:55 crc kubenswrapper[4732]: E0131 09:01:55.487353 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Jan 31 09:01:55 crc kubenswrapper[4732]: E0131 09:01:55.487376 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Jan 31 09:01:55 crc kubenswrapper[4732]: E0131 09:01:55.487391 4732 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Jan 31 09:01:55 crc kubenswrapper[4732]: E0131 09:01:55.487468 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-31 09:02:27.487446551 +0000 UTC m=+85.793322755 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Jan 31 09:01:55 crc kubenswrapper[4732]: E0131 09:01:55.487486 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Jan 31 09:01:55 crc kubenswrapper[4732]: E0131 09:01:55.487544 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Jan 31 09:01:55 crc kubenswrapper[4732]: E0131 09:01:55.487567 4732 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Jan 31 09:01:55 crc kubenswrapper[4732]: E0131 09:01:55.487708 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-31 09:02:27.487637848 +0000 UTC m=+85.793514092 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Jan 31 09:01:55 crc kubenswrapper[4732]: I0131 09:01:55.514380 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 09:01:55 crc kubenswrapper[4732]: I0131 09:01:55.514419 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 09:01:55 crc kubenswrapper[4732]: I0131 09:01:55.514432 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 09:01:55 crc kubenswrapper[4732]: I0131 09:01:55.514449 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 09:01:55 crc kubenswrapper[4732]: I0131 09:01:55.514459 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:55Z","lastTransitionTime":"2026-01-31T09:01:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 31 09:01:55 crc kubenswrapper[4732]: I0131 09:01:55.514477 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 13:47:52.671812301 +0000 UTC
Jan 31 09:01:55 crc kubenswrapper[4732]: I0131 09:01:55.541972 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7fgvm"
Jan 31 09:01:55 crc kubenswrapper[4732]: E0131 09:01:55.542097 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7fgvm" podUID="3bd29a31-1a47-40da-afc5-6c4423067083"
Jan 31 09:01:55 crc kubenswrapper[4732]: I0131 09:01:55.542192 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 31 09:01:55 crc kubenswrapper[4732]: I0131 09:01:55.542252 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 31 09:01:55 crc kubenswrapper[4732]: I0131 09:01:55.542207 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 31 09:01:55 crc kubenswrapper[4732]: E0131 09:01:55.542379 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 31 09:01:55 crc kubenswrapper[4732]: E0131 09:01:55.542523 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 31 09:01:55 crc kubenswrapper[4732]: E0131 09:01:55.542626 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 31 09:01:55 crc kubenswrapper[4732]: I0131 09:01:55.543692 4732 scope.go:117] "RemoveContainer" containerID="9455be46c494ee301fbd246b34b851413405354f423be0f5ae070c7c1bb8263a"
Jan 31 09:01:55 crc kubenswrapper[4732]: I0131 09:01:55.559128 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t9kqf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2e6e0f4-2302-447f-a5e0-7db3d7b73cb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://038ff30d8a53b94c204f5dc0c72824fcb7f08a423eec2ce05554f23607d4dc7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c71dc89758bff193a85ceeaa97b21ba6c93389ef9cf380bd0bac89bddddf014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c71dc89758bff193a85ceeaa97b21ba6c93389ef9cf380bd0bac89bddddf014\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4374e8014d4e67ab8df14785ed6d1bd78b33567550758297fcc04796ea5c688\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4374e8014d4e67ab8df14785ed6d1bd78b33567550758297fcc04796ea5c688\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85484d53f8180c3bc83391004e8a7c6b432af09f04f597567e8ffe0b9c2e3d58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85484d53f8180c3bc83391004e8a7c6b432af09f04f597567e8ffe0b9c2e3d58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://111b50c40cf72f105ae5c74d668673853a783050a62e89127fa666a6f8d431d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://111b50c40cf72f105ae5c74d668673853a783050a62e89127fa666a6f8d431d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2458247a5e615b929a15e280bb9d5a65c46a991ea2cd83502a8c42b1ca0e299b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2458247a5e615b929a15e280bb9d5a65c46a991ea2cd83502a8c42b1ca0e299b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a6ff3cd5564994b28608dad1cd0848c771db950208888ac8a2213fa5dc3a168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a6ff3cd5564994b28608dad1cd0848c771db950208888ac8a2213fa5dc3a168\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t9kqf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:55Z is after 2025-08-24T17:21:41Z"
Jan 31 09:01:55 crc kubenswrapper[4732]: I0131 09:01:55.571815 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7fgvm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bd29a31-1a47-40da-afc5-6c4423067083\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47jtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47jtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7fgvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:55Z is after 2025-08-24T17:21:41Z"
Jan 31 09:01:55 crc kubenswrapper[4732]: I0131 09:01:55.588309 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1dc686f-fa2d-4cd4-95d4-874df2dc3a1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b5cbd45e357205f8bfc648396c0f684d6788343f687504feb45618d9082e5cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b8ba07937855cb1fd0e1db567f90e08848fca583b7174153918e2d6356d722\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ce42a35726dd1fc6aea0eb2e5baa490de04566cd6d6298185260e896a8def7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c276b6fdf2c47bfa38c4cf312bf6b893388c4291d306738fd6c31738cbd7483b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c05af9bd69451273c25ddddf6b11cdd7b8f9b6e3e16d18f1c4a962321905696e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0131 09:01:18.226954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:01:18.227544 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1858903392/tls.crt::/tmp/serving-cert-1858903392/tls.key\\\\\\\"\\\\nI0131 09:01:23.404020 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:01:23.413560 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:01:23.413612 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:01:23.413655 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:01:23.413760 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:01:23.423398 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 09:01:23.423434 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:01:23.423441 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:01:23.423448 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:01:23.423452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0131 09:01:23.423448 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 09:01:23.423457 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:01:23.423476 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 09:01:23.426339 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83c2048db4772c1a5f60b21340f773ab5c651045d4dac9edc15e91a19d15dce9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:55Z is after 2025-08-24T17:21:41Z"
Jan 31 09:01:55 crc kubenswrapper[4732]: I0131 09:01:55.602903 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:55Z is after 2025-08-24T17:21:41Z"
Jan 31 09:01:55 crc kubenswrapper[4732]: I0131 09:01:55.615379 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4mxsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e23192f-14db-41ef-af89-4a76e325d9c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6af50b44afec0e5f0757efda1ff5567845c4482609a82bafcbf74947c657c56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsnx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4mxsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:55Z is after 2025-08-24T17:21:41Z"
Jan 31 09:01:55 crc kubenswrapper[4732]: I0131 09:01:55.619354 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 09:01:55 crc kubenswrapper[4732]: I0131 09:01:55.619414 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 09:01:55 crc kubenswrapper[4732]: I0131 09:01:55.619425 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 09:01:55 crc kubenswrapper[4732]: I0131 09:01:55.619439 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 09:01:55 crc kubenswrapper[4732]: I0131 09:01:55.619448 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:55Z","lastTransitionTime":"2026-01-31T09:01:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 31 09:01:55 crc kubenswrapper[4732]: I0131 09:01:55.629710 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c063f211d6d06c3f8b94d316d68da346d077c6e85197754ec7c77ad736fd1217\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:55Z is after 2025-08-24T17:21:41Z"
Jan 31 09:01:55 crc kubenswrapper[4732]: I0131 09:01:55.644260 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d790207-d357-4b47-87bf-5b505e061820\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eaa2b856e1dedd7398d0c513f742fa5925c4748bf44eb15022adbb4b62354c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2bw2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad1f906b34fddd34efd9d099479cd1fe7404da4ab11f3f571f9e3120167505a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2bw2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jnbt8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:55Z is after 2025-08-24T17:21:41Z"
Jan 31 09:01:55 crc kubenswrapper[4732]: I0131 09:01:55.663835 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58b600ed-9e96-4296-bc41-dda5d2205921\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f14a98662959fe8132486d2f648fb443a407e90160ee8c6eac7e2a3d6835b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://771af18934041722ccdbde47137c83e9d93c5e7a07b1e9b26e18354bcd2fda4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2bea5befd94ee4d3b6501273e8b8365102a35cc07b97740fc893f8d70a9d134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9fd5264b5a2dcec6f8b363208d5654f2d7199660f04be408ee7ab3913d542e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://194a43d5e1409abb031f1b950dfc753454b45008067d784642945ea0dfb087e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://382f1fa043705a0098e27ea34beeee0b88276aa355a7de07bc29b070949b81e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382f1fa043705a0098e27ea34beeee0b88276aa355a7de07bc29b070949b81e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b16d2732d6d6381045b88446a322622a7b1271f08438cb60e57c05c52ef7faf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b16d2732d6d6381045b88446a322622a7b1271f08438cb60e57c05c52ef7faf8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://273f5652d37e2be568bc388a7f9e96c66d0ce9201678b46736986815bd25d6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://273f5652d37e2be568bc388a7f9e96c66d0ce9201678b46736986815bd25d6bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:55Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:55 crc kubenswrapper[4732]: I0131 09:01:55.677398 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62650e77e5971d617a77ae7461d51276fc76e3699dc4f86ebbb9b63ab0d86b60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:55Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:55 crc kubenswrapper[4732]: I0131 09:01:55.691071 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7423f7d2a82ae276cff0f5a2730b0de7abbc5b28a8bd8983d23fe44c2e9ca3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b075971e7f28f40b90caae8b3681dd283860371a6b3f82c92c421a87a27ccb0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:55Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:55 crc kubenswrapper[4732]: I0131 09:01:55.704204 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gchqk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0313609d-3507-4db5-a190-9dbf59d73e6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b303be38f64266f4e019d30e8b988945133fc76a47e305429ed048d68cdeac76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-phqth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8a2d9e1c8db61a2418e981340a7cb999983d4a10e79977507abdf3bdb471939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-phqth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gchqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:55Z is after 2025-08-24T17:21:41Z" Jan 31 
09:01:55 crc kubenswrapper[4732]: I0131 09:01:55.717984 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1add5e75-7e0e-4248-9431-256dc477beeb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7bd726ccae341dfff351d4170276b74d4b6b4f34806b7e1fb23ab517201928d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd0e75b86e540472dde92c0b2c29d3af05e052a4e6a01fd03cd16cce663bd5da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6678cf179a4e13991aa6a082456b3497a4665e0999912b8002fb84a0b6ca2ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\
",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55d09d44253b8c77894abc55d80843b79509bf7213c0bf8b4d69059d53e76e5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:55Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:55 crc kubenswrapper[4732]: I0131 09:01:55.721691 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:55 crc kubenswrapper[4732]: I0131 09:01:55.721727 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:55 crc kubenswrapper[4732]: I0131 09:01:55.721739 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:55 crc kubenswrapper[4732]: I0131 09:01:55.721755 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:55 crc kubenswrapper[4732]: I0131 09:01:55.721766 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:55Z","lastTransitionTime":"2026-01-31T09:01:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:55 crc kubenswrapper[4732]: I0131 09:01:55.731212 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bb621da-40b8-4a07-a7bd-06800007bc59\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dadc6987ef6ef2593fa8aa0ec3903dc3cfea907f73fb68dca2c141362577bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e897d26ac7b103c21a9cb176f1a90a11098a9a2f6a4cd28f697a90ee7c9f6f9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a31769db7f40442ff410b053055e413b11d7dba7d48dfce853ca38e7b9f7595e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"
cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://407516efcd2436b964fddea3bdc778826cde289422139bc4577c9ba8c0c43675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://407516efcd2436b964fddea3bdc778826cde289422139bc4577c9ba8c0c43675\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:55Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:55 crc kubenswrapper[4732]: I0131 09:01:55.743294 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bllbs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"80d87332-eaea-4007-a03e-a9a0f744563a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a371d8cbedbc18b4953520c3b135e102db92fc57883d31a8f78c510498ca41b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcdpz\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bllbs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:55Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:55 crc kubenswrapper[4732]: I0131 09:01:55.762721 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8521ee1e84f540faa7b093ecd0bb7761ae38487a6c7abf52028fe820ea9a4c1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fde05e46badc11b2174b88e2590402436bb669bcbd017ceb4d11d25066b421b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b26417bd7817c984a59723e2231e3a43c554022aef67eacf300adee0166c3db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c79f404e5bc3d6dc83681d68605b1f26565813ccec34c6d36ee9f21686ca4f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://075418297d972e23003b3ce2016f7709d138316c9d08837dc484dadf4d7f4a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://792e702a9fa66ea68366e8f88d976f7e280223815c4b70258566c32faaed57a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9455be46c494ee301fbd246b34b851413405354f
423be0f5ae070c7c1bb8263a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9455be46c494ee301fbd246b34b851413405354f423be0f5ae070c7c1bb8263a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:01:42Z\\\",\\\"message\\\":\\\"1 09:01:42.390171 6216 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 09:01:42.390285 6216 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 09:01:42.390357 6216 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 09:01:42.390398 6216 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 09:01:42.390856 6216 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0131 09:01:42.390942 6216 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0131 09:01:42.391006 6216 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0131 09:01:42.391068 6216 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0131 09:01:42.391132 6216 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0131 09:01:42.391188 6216 factory.go:656] Stopping watch factory\\\\nI0131 09:01:42.391259 6216 ovnkube.go:599] Stopped ovnkube\\\\nI0131 09:01:42.390987 6216 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0131 09:01:42.391113 6216 handler.go:208] Removed *v1.Node event handler 7\\\\nI0131 09:01:42.391233 6216 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-8mtkt_openshift-ovn-kubernetes(82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cd519944f0ae0ee9920a6a54de3dd40c73a65d0c907f625c57d7648a6d0a715\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb08df78497d5ae0eb39af9a06378ea80914d9b4d7014dd85d930f54c733193\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfb08df78497d5ae0eb39af9a06378ea80914d9b4d7014dd85d930f54c733193\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8mtkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:55Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:55 crc kubenswrapper[4732]: I0131 09:01:55.780008 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:55Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:55 crc kubenswrapper[4732]: I0131 09:01:55.791196 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:55Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:55 crc kubenswrapper[4732]: I0131 09:01:55.800391 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nsgpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"533741c8-f72a-4834-ad02-d33fc939e529\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec4f6f1a4ba8fc3975dc3ef7e86d5b17b0488de7918fe672040f115f640b8569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjff8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nsgpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-01-31T09:01:55Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:55 crc kubenswrapper[4732]: I0131 09:01:55.823652 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:55 crc kubenswrapper[4732]: I0131 09:01:55.823702 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:55 crc kubenswrapper[4732]: I0131 09:01:55.823712 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:55 crc kubenswrapper[4732]: I0131 09:01:55.823728 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:55 crc kubenswrapper[4732]: I0131 09:01:55.823737 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:55Z","lastTransitionTime":"2026-01-31T09:01:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:55 crc kubenswrapper[4732]: I0131 09:01:55.893827 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8mtkt_82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8/ovnkube-controller/1.log" Jan 31 09:01:55 crc kubenswrapper[4732]: I0131 09:01:55.896638 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" event={"ID":"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8","Type":"ContainerStarted","Data":"da2d4b479ec06e475664a1f5f1c4c052f87ad374f5616ea477c7a147ca896a16"} Jan 31 09:01:55 crc kubenswrapper[4732]: I0131 09:01:55.896816 4732 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 31 09:01:55 crc kubenswrapper[4732]: I0131 09:01:55.908860 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1add5e75-7e0e-4248-9431-256dc477beeb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7bd726ccae341dfff351d4170276b74d4b6b4f34806b7e1fb23ab517201928d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd0e75b86e540472dde92c0b2c29d3af05e052a4e6a01fd03cd16cce663bd5da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6678cf179a4e13991aa6a082456b3497a4665e0999912b8002fb84a0b6ca2ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55d09d44253b8c77894abc55d80843b79509bf7213c0bf8b4d69059d53e76e5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:55Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:55 crc kubenswrapper[4732]: I0131 09:01:55.918413 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bb621da-40b8-4a07-a7bd-06800007bc59\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dadc6987ef6ef2593fa8aa0ec3903dc3cfea907f73fb68dca2c141362577bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e897d26ac7b103c21a9cb176f1a90a11098a9a2f6a4cd28f697a90ee7c9f6f9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a31769db7f40442ff410b053055e413b11d7dba7d48dfce853ca38e7b9f7595e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://407516efcd2436b964fddea3bdc778826cde289422139bc4577c9ba8c0c43675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://407516efcd2436b964fddea3bdc778826cde289422139bc4577c9ba8c0c43675\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:55Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:55 crc kubenswrapper[4732]: I0131 09:01:55.925647 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:55 crc kubenswrapper[4732]: I0131 09:01:55.925684 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:55 crc kubenswrapper[4732]: I0131 09:01:55.925693 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:55 crc kubenswrapper[4732]: I0131 09:01:55.925708 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:55 crc kubenswrapper[4732]: I0131 
09:01:55.925717 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:55Z","lastTransitionTime":"2026-01-31T09:01:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:55 crc kubenswrapper[4732]: I0131 09:01:55.937144 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58b600ed-9e96-4296-bc41-dda5d2205921\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f14a98662959fe8132486d2f648fb443a407e90160ee8c6eac7e2a3d6835b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://771af18934041722ccdbde47137c83e9d93c5e7a07b1e9b26e18354bcd2fda4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2bea5befd94ee4d3b6501273e8b8365102a35cc07b97740fc893f8d70a9d134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f4
2928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9fd5264b5a2dcec6f8b363208d5654f2d7199660f04be408ee7ab3913d542e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://194a43d5e1409abb031f1b950dfc753454b45008067d784642945ea0dfb087e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://382f1fa043705a0098e27ea34beeee0b88276aa355a7de07bc29b070949b81e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382f1fa043705a0098e27ea34beeee0b88276aa355a7de07bc29b070949b81e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b16d2732d6d6381045b88446a322622a7b1271f08438cb60e57c05c52ef7faf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b16d2732d6d6381045b88446a322622a7b1271f08438cb60e57c05c52ef7faf8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://273f5652d37e2be568bc388a7f9e96c66d0ce9201678b46736986815bd25d6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://273f5652d37e2be568bc388a7f9e96c66d0ce9201678b46736986815bd25d6bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:55Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:55 crc kubenswrapper[4732]: I0131 09:01:55.954100 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62650e77e5971d617a77ae7461d51276fc76e3699dc4f86ebbb9b63ab0d86b60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:55Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:55 crc kubenswrapper[4732]: I0131 09:01:55.969426 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7423f7d2a82ae276cff0f5a2730b0de7abbc5b28a8bd8983d23fe44c2e9ca3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b075971e7f28f40b90caae8b3681dd283860371a6b3f82c92c421a87a27ccb0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:55Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:55 crc kubenswrapper[4732]: I0131 09:01:55.984062 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gchqk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0313609d-3507-4db5-a190-9dbf59d73e6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b303be38f64266f4e019d30e8b988945133fc76a47e305429ed048d68cdeac76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-phqth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8a2d9e1c8db61a2418e981340a7cb999983d4a10e79977507abdf3bdb471939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-phqth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gchqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:55Z is after 2025-08-24T17:21:41Z" Jan 31 
09:01:56 crc kubenswrapper[4732]: I0131 09:01:56.000846 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:55Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:56 crc kubenswrapper[4732]: I0131 09:01:56.018773 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:56Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:56 crc kubenswrapper[4732]: I0131 09:01:56.027880 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:56 crc kubenswrapper[4732]: I0131 09:01:56.027941 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:56 crc kubenswrapper[4732]: I0131 09:01:56.027957 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:56 crc kubenswrapper[4732]: I0131 09:01:56.027984 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:56 crc kubenswrapper[4732]: I0131 09:01:56.028001 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:56Z","lastTransitionTime":"2026-01-31T09:01:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:56 crc kubenswrapper[4732]: I0131 09:01:56.035996 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nsgpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"533741c8-f72a-4834-ad02-d33fc939e529\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec4f6f1a4ba8fc3975dc3ef7e86d5b17b0488de7918fe672040f115f640b8569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjff8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nsgpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:56Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:56 crc kubenswrapper[4732]: I0131 09:01:56.055295 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bllbs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80d87332-eaea-4007-a03e-a9a0f744563a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a371d8cbedbc18b4953520c3b135e102db92fc57883d31a8f78c510498ca41b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcdpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bllbs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:56Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:56 crc kubenswrapper[4732]: I0131 09:01:56.081394 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8521ee1e84f540faa7b093ecd0bb7761ae38487a6c7abf52028fe820ea9a4c1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fde05e46badc11b2174b88e2590402436bb669bcbd017ceb4d11d25066b421b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b26417bd7817c984a59723e2231e3a43c554022aef67eacf300adee0166c3db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c79f404e5bc3d6dc83681d68605b1f26565813ccec34c6d36ee9f21686ca4f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://075418297d972e23003b3ce2016f7709d138316c9d08837dc484dadf4d7f4a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://792e702a9fa66ea68366e8f88d976f7e280223815c4b70258566c32faaed57a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da2d4b479ec06e475664a1f5f1c4c052f87ad374f5616ea477c7a147ca896a16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9455be46c494ee301fbd246b34b851413405354f423be0f5ae070c7c1bb8263a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:01:42Z\\\",\\\"message\\\":\\\"1 09:01:42.390171 6216 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 09:01:42.390285 6216 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 09:01:42.390357 6216 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 09:01:42.390398 6216 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 09:01:42.390856 6216 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0131 09:01:42.390942 6216 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0131 09:01:42.391006 6216 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0131 09:01:42.391068 6216 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0131 09:01:42.391132 6216 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0131 09:01:42.391188 6216 factory.go:656] Stopping watch factory\\\\nI0131 09:01:42.391259 6216 ovnkube.go:599] Stopped ovnkube\\\\nI0131 09:01:42.390987 6216 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0131 09:01:42.391113 6216 handler.go:208] Removed *v1.Node event handler 7\\\\nI0131 09:01:42.391233 6216 handler.go:208] Removed *v1.Pod event handler 
6\\\\nI0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cd519944f0ae0ee9920a6a54de3dd40c73a65d0c907f625c57d7648a6d0a715\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\
\\"containerID\\\":\\\"cri-o://cfb08df78497d5ae0eb39af9a06378ea80914d9b4d7014dd85d930f54c733193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfb08df78497d5ae0eb39af9a06378ea80914d9b4d7014dd85d930f54c733193\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8mtkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:56Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:56 crc kubenswrapper[4732]: I0131 09:01:56.099123 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1dc686f-fa2d-4cd4-95d4-874df2dc3a1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b5cbd45e357205f8bfc648396c0f684d6788343f687504feb45618d9082e5cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b8ba07937855cb1fd0e1db567f90e08848fca583b7174153918e2d6356d722\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ce42a35726dd1fc6aea0eb2e5baa490de04566cd6d6298185260e896a8def7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c276b6fdf2c47bfa38c4cf312bf6b893388c4291d306738fd6c31738cbd7483b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c05af9bd69451273c25ddddf6b11cdd7b8f9b6e3e16d18f1c4a962321905696e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0131 09:01:18.226954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:01:18.227544 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1858903392/tls.crt::/tmp/serving-cert-1858903392/tls.key\\\\\\\"\\\\nI0131 09:01:23.404020 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:01:23.413560 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:01:23.413612 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:01:23.413655 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:01:23.413760 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:01:23.423398 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 09:01:23.423434 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:01:23.423441 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:01:23.423448 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:01:23.423452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0131 09:01:23.423448 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 09:01:23.423457 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:01:23.423476 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 09:01:23.426339 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83c2048db4772c1a5f60b21340f773ab5c651045d4dac9edc15e91a19d15dce9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:56Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:56 crc kubenswrapper[4732]: I0131 09:01:56.111261 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:56Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:56 crc kubenswrapper[4732]: I0131 09:01:56.123616 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4mxsr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e23192f-14db-41ef-af89-4a76e325d9c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6af50b44afec0e5f0757efda1ff5567845c4482609a82bafcbf74947c657c56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsnx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4mxsr\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:56Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:56 crc kubenswrapper[4732]: I0131 09:01:56.133736 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:56 crc kubenswrapper[4732]: I0131 09:01:56.133784 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:56 crc kubenswrapper[4732]: I0131 09:01:56.133799 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:56 crc kubenswrapper[4732]: I0131 09:01:56.133818 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:56 crc kubenswrapper[4732]: I0131 09:01:56.133831 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:56Z","lastTransitionTime":"2026-01-31T09:01:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:56 crc kubenswrapper[4732]: I0131 09:01:56.138508 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t9kqf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2e6e0f4-2302-447f-a5e0-7db3d7b73cb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://038ff30d8a53b94c204f5dc0c72824fcb7f08a423eec2ce05554f23607d4dc7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerSta
tuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c71dc89758bff193a85ceeaa97b21ba6c93389ef9cf380bd0bac89bddddf014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c71dc89758bff193a85ceeaa97b21ba6c93389ef9cf380bd0bac89bddddf014\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4374e8014d4e67ab8df14785ed6d1bd78b33567550758297fcc04796ea5c688\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4374e8014d4e67ab8df14785ed6d1bd78b33567550758297fcc04796ea5c688\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85484d53f8180c3bc83391004e8a7c6b432af09f04f597567e8ffe0b9c2e3d58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85484d53f8180c3bc83391004e8a7c6b432af09f04f597567e8ffe0b9c2e3d58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:
01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://111b50c40cf72f105ae5c74d668673853a783050a62e89127fa666a6f8d431d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://111b50c40cf72f105ae5c74d668673853a783050a62e89127fa666a6f8d431d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2458247a5e615b929a15e280bb9d5a65c46a991ea2cd83502a8c42b1ca0e299b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2458247a5e615b929a15e280bb9d5a65c46a991ea2cd83502a8c42b1ca0e299b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a6ff3cd5564994b28608dad1cd0848c771db950208888ac8a2213fa5dc3a168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a6ff3cd5564994b28608dad1cd0848c771db950208888ac8a2213fa5dc3a168\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t9kqf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:56Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:56 crc kubenswrapper[4732]: I0131 09:01:56.148682 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7fgvm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bd29a31-1a47-40da-afc5-6c4423067083\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47jtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47jtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7fgvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:56Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:56 crc kubenswrapper[4732]: I0131 09:01:56.161161 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c063f211d6d06c3f8b94d316d68da346d077c6e85197754ec7c77ad736fd1217\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:56Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:56 crc kubenswrapper[4732]: I0131 09:01:56.171410 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d790207-d357-4b47-87bf-5b505e061820\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eaa2b856e1dedd7398d0c513f742fa5925c4748bf44eb15022adbb4b62354c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2bw2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad1f906b34fddd34efd9d099479cd1fe7404da4ab11f3f571f9e3120167505a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2bw2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jnbt8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:56Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:56 crc kubenswrapper[4732]: I0131 09:01:56.236684 4732 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:56 crc kubenswrapper[4732]: I0131 09:01:56.236820 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:56 crc kubenswrapper[4732]: I0131 09:01:56.236831 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:56 crc kubenswrapper[4732]: I0131 09:01:56.236848 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:56 crc kubenswrapper[4732]: I0131 09:01:56.236860 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:56Z","lastTransitionTime":"2026-01-31T09:01:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:56 crc kubenswrapper[4732]: I0131 09:01:56.339368 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:56 crc kubenswrapper[4732]: I0131 09:01:56.339403 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:56 crc kubenswrapper[4732]: I0131 09:01:56.339413 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:56 crc kubenswrapper[4732]: I0131 09:01:56.339428 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:56 crc kubenswrapper[4732]: I0131 09:01:56.339439 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:56Z","lastTransitionTime":"2026-01-31T09:01:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:56 crc kubenswrapper[4732]: I0131 09:01:56.442332 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:56 crc kubenswrapper[4732]: I0131 09:01:56.442374 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:56 crc kubenswrapper[4732]: I0131 09:01:56.442384 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:56 crc kubenswrapper[4732]: I0131 09:01:56.442404 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:56 crc kubenswrapper[4732]: I0131 09:01:56.442416 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:56Z","lastTransitionTime":"2026-01-31T09:01:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:56 crc kubenswrapper[4732]: I0131 09:01:56.514706 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 13:45:12.625613424 +0000 UTC Jan 31 09:01:56 crc kubenswrapper[4732]: I0131 09:01:56.545343 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:56 crc kubenswrapper[4732]: I0131 09:01:56.545406 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:56 crc kubenswrapper[4732]: I0131 09:01:56.545428 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:56 crc kubenswrapper[4732]: I0131 09:01:56.545459 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:56 crc kubenswrapper[4732]: I0131 09:01:56.545480 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:56Z","lastTransitionTime":"2026-01-31T09:01:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:56 crc kubenswrapper[4732]: I0131 09:01:56.648501 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:56 crc kubenswrapper[4732]: I0131 09:01:56.648546 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:56 crc kubenswrapper[4732]: I0131 09:01:56.648554 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:56 crc kubenswrapper[4732]: I0131 09:01:56.648571 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:56 crc kubenswrapper[4732]: I0131 09:01:56.648580 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:56Z","lastTransitionTime":"2026-01-31T09:01:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:56 crc kubenswrapper[4732]: I0131 09:01:56.751751 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:56 crc kubenswrapper[4732]: I0131 09:01:56.751820 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:56 crc kubenswrapper[4732]: I0131 09:01:56.751848 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:56 crc kubenswrapper[4732]: I0131 09:01:56.751876 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:56 crc kubenswrapper[4732]: I0131 09:01:56.751893 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:56Z","lastTransitionTime":"2026-01-31T09:01:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:56 crc kubenswrapper[4732]: I0131 09:01:56.854995 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:56 crc kubenswrapper[4732]: I0131 09:01:56.855077 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:56 crc kubenswrapper[4732]: I0131 09:01:56.855096 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:56 crc kubenswrapper[4732]: I0131 09:01:56.855140 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:56 crc kubenswrapper[4732]: I0131 09:01:56.855180 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:56Z","lastTransitionTime":"2026-01-31T09:01:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:56 crc kubenswrapper[4732]: I0131 09:01:56.903094 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8mtkt_82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8/ovnkube-controller/2.log" Jan 31 09:01:56 crc kubenswrapper[4732]: I0131 09:01:56.903822 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8mtkt_82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8/ovnkube-controller/1.log" Jan 31 09:01:56 crc kubenswrapper[4732]: I0131 09:01:56.907392 4732 generic.go:334] "Generic (PLEG): container finished" podID="82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8" containerID="da2d4b479ec06e475664a1f5f1c4c052f87ad374f5616ea477c7a147ca896a16" exitCode=1 Jan 31 09:01:56 crc kubenswrapper[4732]: I0131 09:01:56.907456 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" event={"ID":"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8","Type":"ContainerDied","Data":"da2d4b479ec06e475664a1f5f1c4c052f87ad374f5616ea477c7a147ca896a16"} Jan 31 09:01:56 crc kubenswrapper[4732]: I0131 09:01:56.907514 4732 scope.go:117] "RemoveContainer" containerID="9455be46c494ee301fbd246b34b851413405354f423be0f5ae070c7c1bb8263a" Jan 31 09:01:56 crc kubenswrapper[4732]: I0131 09:01:56.908897 4732 scope.go:117] "RemoveContainer" containerID="da2d4b479ec06e475664a1f5f1c4c052f87ad374f5616ea477c7a147ca896a16" Jan 31 09:01:56 crc kubenswrapper[4732]: E0131 09:01:56.909251 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-8mtkt_openshift-ovn-kubernetes(82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" podUID="82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8" Jan 31 09:01:56 crc kubenswrapper[4732]: I0131 09:01:56.939535 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4mxsr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e23192f-14db-41ef-af89-4a76e325d9c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6af50b44afec0e5f0757efda1ff5567845c4482609a82bafcbf74947c657c56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsnx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4mxsr\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:56Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:56 crc kubenswrapper[4732]: I0131 09:01:56.957007 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t9kqf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2e6e0f4-2302-447f-a5e0-7db3d7b73cb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://038ff30d8a53b94c204f5dc0c72824fcb7f08a423eec2ce05554f23607d4dc7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c71dc89758bff193a85ceeaa97b21ba6c93389ef9cf380bd0bac89bddddf014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c71dc89758bff193a85ceeaa97b21ba6c93389ef9cf380bd0bac89bddddf014\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4374e8014d4e67ab8df14785ed6d1bd78b33567550758297fcc04796ea5c688\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4374e8014d4e67ab8df14785ed6d1bd78b33567550758297fcc04796ea5c688\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85484d53f8180c3bc83391004e8a7c6b432af09f04f597567e8ffe0b9c2e3d58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85484d53f8180c3bc83391004e8a7c6b432af09f04f597567e8ffe0b9c2e3d58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://111b50c40cf72f105ae5c74d668673853a783050a62e89127fa666a6f8d431d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://111b50c40cf72f105ae5c74d668673853a783050a62e89127fa666a6f8d431d0\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2026-01-31T09:01:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2458247a5e615b929a15e280bb9d5a65c46a991ea2cd83502a8c42b1ca0e299b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2458247a5e615b929a15e280bb9d5a65c46a991ea2cd83502a8c42b1ca0e299b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a6ff3cd5564994b28608dad1cd0848c771db950208888ac8a2213fa5dc3a168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a6ff3cd5564994b28608dad1cd0848c771db950208888ac8a2213fa5dc3a168\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t9kqf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-31T09:01:56Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:56 crc kubenswrapper[4732]: I0131 09:01:56.964630 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:56 crc kubenswrapper[4732]: I0131 09:01:56.964705 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:56 crc kubenswrapper[4732]: I0131 09:01:56.964721 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:56 crc kubenswrapper[4732]: I0131 09:01:56.964744 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:56 crc kubenswrapper[4732]: I0131 09:01:56.964760 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:56Z","lastTransitionTime":"2026-01-31T09:01:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:56 crc kubenswrapper[4732]: I0131 09:01:56.976866 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7fgvm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bd29a31-1a47-40da-afc5-6c4423067083\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47jtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47jtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7fgvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:56Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:56 crc kubenswrapper[4732]: I0131 09:01:56.993047 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1dc686f-fa2d-4cd4-95d4-874df2dc3a1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b5cbd45e357205f8bfc648396c0f684d6788343f687504feb45618d9082e5cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b8ba07937855cb1fd0e1db567f90e08848fca583b7174153918e2d6356d722\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ce42a35726dd1fc6aea0eb2e5baa490de04566cd6d6298185260e896a8def7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c276b6fdf2c47bfa38c4cf312bf6b893388c4291d306738fd6c31738cbd7483b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c05af9bd69451273c25ddddf6b11cdd7b8f9b6e3e16d18f1c4a962321905696e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0131 09:01:18.226954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:01:18.227544 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1858903392/tls.crt::/tmp/serving-cert-1858903392/tls.key\\\\\\\"\\\\nI0131 09:01:23.404020 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:01:23.413560 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:01:23.413612 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:01:23.413655 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:01:23.413760 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:01:23.423398 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 09:01:23.423434 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:01:23.423441 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:01:23.423448 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:01:23.423452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0131 09:01:23.423448 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 09:01:23.423457 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:01:23.423476 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 09:01:23.426339 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83c2048db4772c1a5f60b21340f773ab5c651045d4dac9edc15e91a19d15dce9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:56Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:57 crc kubenswrapper[4732]: I0131 09:01:57.007251 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:57Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:57 crc kubenswrapper[4732]: I0131 09:01:57.018528 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c063f211d6d06c3f8b94d316d68da346d077c6e85197754ec7c77ad736fd1217\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:57Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:57 crc kubenswrapper[4732]: I0131 09:01:57.029493 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d790207-d357-4b47-87bf-5b505e061820\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eaa2b856e1dedd7398d0c513f742fa5925c4748bf44eb15022adbb4b62354c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2bw2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad1f906b34fddd34efd9d099479cd1fe7404da4ab11f3f571f9e3120167505a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2bw2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jnbt8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:57Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:57 crc kubenswrapper[4732]: I0131 09:01:57.042320 4732 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bb621da-40b8-4a07-a7bd-06800007bc59\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dadc6987ef6ef2593fa8aa0ec3903dc3cfea907f73fb68dca2c141362577bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e897d26ac7b103c21a9cb176f1a90a11098a9a2f6a4cd28f697a90ee7c9f6f9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a31769db7f40442ff410b053055e413b11d7dba7d48dfce853ca38e7b9f7595e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":
[{\\\"containerID\\\":\\\"cri-o://407516efcd2436b964fddea3bdc778826cde289422139bc4577c9ba8c0c43675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://407516efcd2436b964fddea3bdc778826cde289422139bc4577c9ba8c0c43675\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:57Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:57 crc kubenswrapper[4732]: I0131 09:01:57.060759 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58b600ed-9e96-4296-bc41-dda5d2205921\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f14a98662959fe8132486d2f648fb443a407e90160ee8c6eac7e2a3d6835b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://771af18934041722ccdbde47137c83e9d93c5e7a07b1e9b26e18354bcd2fda4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0
7b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2bea5befd94ee4d3b6501273e8b8365102a35cc07b97740fc893f8d70a9d134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9fd5264b5a2dcec6f8b363208d5654f2d7199660f04be408ee7ab3913d542e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://194a43d5e1409abb031f1b950dfc753454b45008067d784642945ea0dfb087e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://382f1fa043705a0098e27ea34beeee0b88276aa355a7de07bc29b070949b81e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca
\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382f1fa043705a0098e27ea34beeee0b88276aa355a7de07bc29b070949b81e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b16d2732d6d6381045b88446a322622a7b1271f08438cb60e57c05c52ef7faf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b16d2732d6d6381045b88446a322622a7b1271f08438cb60e57c05c52ef7faf8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://273f5652d37e2be568bc388a7f9e96c66d0ce9201678b46736986815bd25d6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://273f5652d37e2be568bc388a7f9e96c66d0ce9201678b46736986815bd25d6bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:57Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:57 crc kubenswrapper[4732]: I0131 09:01:57.067343 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:57 crc kubenswrapper[4732]: I0131 09:01:57.067398 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:57 crc kubenswrapper[4732]: I0131 09:01:57.067413 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Jan 31 09:01:57 crc kubenswrapper[4732]: I0131 09:01:57.067431 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:57 crc kubenswrapper[4732]: I0131 09:01:57.067444 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:57Z","lastTransitionTime":"2026-01-31T09:01:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:57 crc kubenswrapper[4732]: I0131 09:01:57.080560 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62650e77e5971d617a77ae7461d51276fc76e3699dc4f86ebbb9b63ab0d86b60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:57Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:57 crc kubenswrapper[4732]: I0131 09:01:57.096273 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7423f7d2a82ae276cff0f5a2730b0de7abbc5b28a8bd8983d23fe44c2e9ca3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b075971e7f28f40b90caae8b3681dd283860371a6b3f82c92c421a87a27ccb0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:57Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:57 crc kubenswrapper[4732]: I0131 09:01:57.109595 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gchqk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0313609d-3507-4db5-a190-9dbf59d73e6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b303be38f64266f4e019d30e8b988945133fc76a47e305429ed048d68cdeac76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-phqth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8a2d9e1c8db61a2418e981340a7cb999983d4a10e79977507abdf3bdb471939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-phqth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gchqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:57Z is after 2025-08-24T17:21:41Z" Jan 31 
09:01:57 crc kubenswrapper[4732]: I0131 09:01:57.124185 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1add5e75-7e0e-4248-9431-256dc477beeb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7bd726ccae341dfff351d4170276b74d4b6b4f34806b7e1fb23ab517201928d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd0e75b86e540472dde92c0b2c29d3af05e052a4e6a01fd03cd16cce663bd5da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6678cf179a4e13991aa6a082456b3497a4665e0999912b8002fb84a0b6ca2ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\
",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55d09d44253b8c77894abc55d80843b79509bf7213c0bf8b4d69059d53e76e5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:57Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:57 crc kubenswrapper[4732]: I0131 09:01:57.135763 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nsgpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"533741c8-f72a-4834-ad02-d33fc939e529\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec4f6f1a4ba8fc3975dc3ef7e86d5b17b0488de7918fe672040f115f640b8569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjff8\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nsgpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:57Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:57 crc kubenswrapper[4732]: I0131 09:01:57.145497 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bllbs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"80d87332-eaea-4007-a03e-a9a0f744563a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a371d8cbedbc18b4953520c3b135e102db92fc57883d31a8f78c510498ca41b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcdpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bllbs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:57Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:57 crc kubenswrapper[4732]: I0131 09:01:57.164716 4732 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8521ee1e84f540faa7b093ecd0bb7761ae38487a6c7abf52028fe820ea9a4c1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fde05e46badc11b2174b88e2590402436bb669bcbd017ceb4d11d25066b421b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b26417bd7817c984a59723e2231e3a43c554022aef67eacf300adee0166c3db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36c
dd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c79f404e5bc3d6dc83681d68605b1f26565813ccec34c6d36ee9f21686ca4f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://075418297d972e23003b3ce2016f7709d138316c9d08837dc484dadf4d7f4a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://792e702a9fa66ea68366e8f88d976f7e280223815c4b70258566c32faaed57a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-con
troller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da2d4b479ec06e475664a1f5f1c4c052f87ad374f5616ea477c7a147ca896a16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9455be46c494ee301fbd246b34b851413405354f423be0f5ae070c7c1bb8263a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:01:42Z\\\",\\\"message\\\":\\\"1 09:01:42.390171 6216 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 09:01:42.390285 6216 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 09:01:42.390357 6216 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 09:01:42.390398 6216 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 09:01:42.390856 6216 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0131 09:01:42.390942 6216 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0131 09:01:42.391006 6216 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0131 09:01:42.391068 6216 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0131 09:01:42.391132 6216 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0131 09:01:42.391188 6216 factory.go:656] Stopping watch factory\\\\nI0131 09:01:42.391259 6216 ovnkube.go:599] Stopped ovnkube\\\\nI0131 09:01:42.390987 6216 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0131 09:01:42.391113 6216 handler.go:208] Removed *v1.Node event handler 7\\\\nI0131 09:01:42.391233 6216 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:41Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da2d4b479ec06e475664a1f5f1c4c052f87ad374f5616ea477c7a147ca896a16\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:01:56Z\\\",\\\"message\\\":\\\"31T09:01:56Z is after 2025-08-24T17:21:41Z]\\\\nI0131 09:01:56.451406 6413 services_controller.go:451] 
Built service openshift-service-ca-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-service-ca-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-service-ca-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.40\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0131 09:01:56.451418 6413 services_controller.go:452] Built service openshift-operator-lifecycle-manager/package-server-manager-metrics per-node LB for network=default: []services.LB{}\\\\nI0131 09:01:56.451426 6413 services_controller.go:452] Built service openshift-service-ca-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI0131 09:01:56.451433 6413 services_controller.go:453] Built service openshift-service-ca-operator/metrics template LB for network=default: []services.LB{}\\\\nI0131 09:01:56.451434 6413 ser\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cd519944f0ae0ee9920a6a54
de3dd40c73a65d0c907f625c57d7648a6d0a715\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb08df78497d5ae0eb39af9a06378ea80914d9b4d7014dd85d930f54c733193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfb08df78497d5ae0eb39af9a06378ea80914d9b4d7014dd85d930f54c733193\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8mtkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:57Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:57 crc kubenswrapper[4732]: I0131 09:01:57.170621 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:57 crc kubenswrapper[4732]: I0131 09:01:57.170700 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:57 crc kubenswrapper[4732]: I0131 09:01:57.170711 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:57 crc kubenswrapper[4732]: I0131 09:01:57.170728 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:57 crc kubenswrapper[4732]: 
I0131 09:01:57.170739 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:57Z","lastTransitionTime":"2026-01-31T09:01:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:57 crc kubenswrapper[4732]: I0131 09:01:57.189389 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:57Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:57 crc kubenswrapper[4732]: I0131 09:01:57.201845 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:57Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:57 crc kubenswrapper[4732]: I0131 09:01:57.274645 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:57 crc kubenswrapper[4732]: I0131 09:01:57.274776 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:57 crc kubenswrapper[4732]: I0131 09:01:57.274798 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:57 crc kubenswrapper[4732]: I0131 09:01:57.274825 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:57 crc kubenswrapper[4732]: I0131 09:01:57.274844 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:57Z","lastTransitionTime":"2026-01-31T09:01:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:57 crc kubenswrapper[4732]: I0131 09:01:57.376988 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:57 crc kubenswrapper[4732]: I0131 09:01:57.377307 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:57 crc kubenswrapper[4732]: I0131 09:01:57.377461 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:57 crc kubenswrapper[4732]: I0131 09:01:57.377594 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:57 crc kubenswrapper[4732]: I0131 09:01:57.377817 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:57Z","lastTransitionTime":"2026-01-31T09:01:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:57 crc kubenswrapper[4732]: I0131 09:01:57.480751 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:57 crc kubenswrapper[4732]: I0131 09:01:57.481730 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:57 crc kubenswrapper[4732]: I0131 09:01:57.481767 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:57 crc kubenswrapper[4732]: I0131 09:01:57.481784 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:57 crc kubenswrapper[4732]: I0131 09:01:57.481795 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:57Z","lastTransitionTime":"2026-01-31T09:01:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:57 crc kubenswrapper[4732]: I0131 09:01:57.515497 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 20:52:15.579391554 +0000 UTC Jan 31 09:01:57 crc kubenswrapper[4732]: I0131 09:01:57.541842 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:01:57 crc kubenswrapper[4732]: I0131 09:01:57.541892 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7fgvm" Jan 31 09:01:57 crc kubenswrapper[4732]: E0131 09:01:57.542004 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:01:57 crc kubenswrapper[4732]: I0131 09:01:57.541862 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:01:57 crc kubenswrapper[4732]: I0131 09:01:57.542093 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:01:57 crc kubenswrapper[4732]: E0131 09:01:57.542213 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7fgvm" podUID="3bd29a31-1a47-40da-afc5-6c4423067083" Jan 31 09:01:57 crc kubenswrapper[4732]: E0131 09:01:57.542331 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:01:57 crc kubenswrapper[4732]: E0131 09:01:57.542466 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:01:57 crc kubenswrapper[4732]: I0131 09:01:57.584328 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:57 crc kubenswrapper[4732]: I0131 09:01:57.584379 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:57 crc kubenswrapper[4732]: I0131 09:01:57.584391 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:57 crc kubenswrapper[4732]: I0131 09:01:57.584409 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:57 crc kubenswrapper[4732]: I0131 09:01:57.584422 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:57Z","lastTransitionTime":"2026-01-31T09:01:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:57 crc kubenswrapper[4732]: I0131 09:01:57.610258 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3bd29a31-1a47-40da-afc5-6c4423067083-metrics-certs\") pod \"network-metrics-daemon-7fgvm\" (UID: \"3bd29a31-1a47-40da-afc5-6c4423067083\") " pod="openshift-multus/network-metrics-daemon-7fgvm" Jan 31 09:01:57 crc kubenswrapper[4732]: E0131 09:01:57.610474 4732 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 31 09:01:57 crc kubenswrapper[4732]: E0131 09:01:57.610648 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3bd29a31-1a47-40da-afc5-6c4423067083-metrics-certs podName:3bd29a31-1a47-40da-afc5-6c4423067083 nodeName:}" failed. No retries permitted until 2026-01-31 09:02:13.61061626 +0000 UTC m=+71.916492504 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3bd29a31-1a47-40da-afc5-6c4423067083-metrics-certs") pod "network-metrics-daemon-7fgvm" (UID: "3bd29a31-1a47-40da-afc5-6c4423067083") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 31 09:01:57 crc kubenswrapper[4732]: I0131 09:01:57.687771 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:57 crc kubenswrapper[4732]: I0131 09:01:57.687810 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:57 crc kubenswrapper[4732]: I0131 09:01:57.687821 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:57 crc kubenswrapper[4732]: I0131 09:01:57.687839 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:57 crc kubenswrapper[4732]: I0131 09:01:57.687850 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:57Z","lastTransitionTime":"2026-01-31T09:01:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:57 crc kubenswrapper[4732]: I0131 09:01:57.790268 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:57 crc kubenswrapper[4732]: I0131 09:01:57.790701 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:57 crc kubenswrapper[4732]: I0131 09:01:57.790924 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:57 crc kubenswrapper[4732]: I0131 09:01:57.791082 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:57 crc kubenswrapper[4732]: I0131 09:01:57.791207 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:57Z","lastTransitionTime":"2026-01-31T09:01:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:57 crc kubenswrapper[4732]: I0131 09:01:57.894613 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:57 crc kubenswrapper[4732]: I0131 09:01:57.894659 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:57 crc kubenswrapper[4732]: I0131 09:01:57.894712 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:57 crc kubenswrapper[4732]: I0131 09:01:57.894738 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:57 crc kubenswrapper[4732]: I0131 09:01:57.894754 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:57Z","lastTransitionTime":"2026-01-31T09:01:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:57 crc kubenswrapper[4732]: I0131 09:01:57.912861 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8mtkt_82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8/ovnkube-controller/2.log" Jan 31 09:01:57 crc kubenswrapper[4732]: I0131 09:01:57.998003 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:57 crc kubenswrapper[4732]: I0131 09:01:57.998056 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:57 crc kubenswrapper[4732]: I0131 09:01:57.998068 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:57 crc kubenswrapper[4732]: I0131 09:01:57.998088 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:57 crc kubenswrapper[4732]: I0131 09:01:57.998101 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:57Z","lastTransitionTime":"2026-01-31T09:01:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:58 crc kubenswrapper[4732]: I0131 09:01:58.101276 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:58 crc kubenswrapper[4732]: I0131 09:01:58.101328 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:58 crc kubenswrapper[4732]: I0131 09:01:58.101340 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:58 crc kubenswrapper[4732]: I0131 09:01:58.101359 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:58 crc kubenswrapper[4732]: I0131 09:01:58.101373 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:58Z","lastTransitionTime":"2026-01-31T09:01:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:58 crc kubenswrapper[4732]: I0131 09:01:58.204563 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:58 crc kubenswrapper[4732]: I0131 09:01:58.204631 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:58 crc kubenswrapper[4732]: I0131 09:01:58.204648 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:58 crc kubenswrapper[4732]: I0131 09:01:58.204720 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:58 crc kubenswrapper[4732]: I0131 09:01:58.204739 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:58Z","lastTransitionTime":"2026-01-31T09:01:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:58 crc kubenswrapper[4732]: I0131 09:01:58.308604 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:58 crc kubenswrapper[4732]: I0131 09:01:58.308937 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:58 crc kubenswrapper[4732]: I0131 09:01:58.308956 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:58 crc kubenswrapper[4732]: I0131 09:01:58.308979 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:58 crc kubenswrapper[4732]: I0131 09:01:58.308997 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:58Z","lastTransitionTime":"2026-01-31T09:01:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:58 crc kubenswrapper[4732]: I0131 09:01:58.412889 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:58 crc kubenswrapper[4732]: I0131 09:01:58.412960 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:58 crc kubenswrapper[4732]: I0131 09:01:58.412977 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:58 crc kubenswrapper[4732]: I0131 09:01:58.413007 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:58 crc kubenswrapper[4732]: I0131 09:01:58.413034 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:58Z","lastTransitionTime":"2026-01-31T09:01:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:58 crc kubenswrapper[4732]: I0131 09:01:58.489086 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" Jan 31 09:01:58 crc kubenswrapper[4732]: I0131 09:01:58.490823 4732 scope.go:117] "RemoveContainer" containerID="da2d4b479ec06e475664a1f5f1c4c052f87ad374f5616ea477c7a147ca896a16" Jan 31 09:01:58 crc kubenswrapper[4732]: E0131 09:01:58.491149 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-8mtkt_openshift-ovn-kubernetes(82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" podUID="82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8" Jan 31 09:01:58 crc kubenswrapper[4732]: I0131 09:01:58.509942 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7423f7d2a82ae276cff0f5a2730b0de7abbc5b28a8bd8983d23fe44c2e9ca3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b075971e7f28f40b90caae8b3681dd283860371a6b3f82c92c421a87a27ccb0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:58Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:58 crc kubenswrapper[4732]: I0131 09:01:58.515758 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 02:38:37.735384677 +0000 UTC Jan 31 09:01:58 crc kubenswrapper[4732]: I0131 09:01:58.516087 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:58 crc kubenswrapper[4732]: I0131 
09:01:58.516123 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:58 crc kubenswrapper[4732]: I0131 09:01:58.516138 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:58 crc kubenswrapper[4732]: I0131 09:01:58.516157 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:58 crc kubenswrapper[4732]: I0131 09:01:58.516170 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:58Z","lastTransitionTime":"2026-01-31T09:01:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:58 crc kubenswrapper[4732]: I0131 09:01:58.530626 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gchqk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0313609d-3507-4db5-a190-9dbf59d73e6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b303be38f64266f4e019d30e8b988945133fc76a47e305429ed048d68cdeac76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-phqth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8a2d9e1c8db61a2418e981340a7cb999983d4a10e79977507abdf3bdb471939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster
-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-phqth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gchqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:58Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:58 crc kubenswrapper[4732]: I0131 09:01:58.552488 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1add5e75-7e0e-4248-9431-256dc477beeb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7bd726ccae341dfff351d4170276b74d4b6b4f34806b7e1fb23ab517201928d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd0e75b86e540472dde92c0b2c29d3af05e052a4e6a01fd03cd16cce663bd5da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6678cf179a4e13991aa6a082456b3497a4665e0999912b8002fb84a0b6ca2ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55d09d44253b8c77894abc55d80843b79509bf7213c0bf8b4d69059d53e76e5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:58Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:58 crc kubenswrapper[4732]: I0131 09:01:58.567977 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bb621da-40b8-4a07-a7bd-06800007bc59\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dadc6987ef6ef2593fa8aa0ec3903dc3cfea907f73fb68dca2c141362577bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e897d26ac7b103c21a9cb176f1a90a11098a9a2f6a4cd28f697a90ee7c9f6f9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a31769db7f40442ff410b053055e413b11d7dba7d48dfce853ca38e7b9f7595e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://407516efcd2436b964fddea3bdc778826cde289422139bc4577c9ba8c0c43675\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://407516efcd2436b964fddea3bdc778826cde289422139bc4577c9ba8c0c43675\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:58Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:58 crc kubenswrapper[4732]: I0131 09:01:58.594885 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58b600ed-9e96-4296-bc41-dda5d2205921\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f14a98662959fe8132486d2f648fb443a407e90160ee8c6eac7e2a3d6835b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://771af18934041722ccdbde47137c83e9d93c5e7a07b1e9b26e18354bcd2fda4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2bea5befd94ee4d3b6501273e8b8365102a35cc07b97740fc893f8d70a9d134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9fd5264b5a2dcec6f8b363208d5654f2d7199660f04be408ee7ab3913d542e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://194a43d5e1409abb031f1b950dfc753454b45008067d784642945ea0dfb087e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://382f1fa043705a0098e27ea34beeee0b88276aa355a7de07bc29b070949b81e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be
8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382f1fa043705a0098e27ea34beeee0b88276aa355a7de07bc29b070949b81e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b16d2732d6d6381045b88446a322622a7b1271f08438cb60e57c05c52ef7faf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b16d2732d6d6381045b88446a322622a7b1271f08438cb60e57c05c52ef7faf8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://273f5652d37e2be568bc388a7f9e96c66d0ce9201678b46736986815bd25d6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://273f5652d37e2be568bc388a7f9e96c66d0ce9201678b46736986815bd25d6bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:58Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:58 crc kubenswrapper[4732]: I0131 09:01:58.612374 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62650e77e5971d617a77ae7461d51276fc76e3699dc4f86ebbb9b63ab0d86b60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:58Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:58 crc kubenswrapper[4732]: I0131 09:01:58.618977 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:58 crc kubenswrapper[4732]: I0131 09:01:58.619020 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:58 crc kubenswrapper[4732]: I0131 09:01:58.619029 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:58 crc kubenswrapper[4732]: I0131 09:01:58.619046 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:58 crc kubenswrapper[4732]: I0131 09:01:58.619058 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:58Z","lastTransitionTime":"2026-01-31T09:01:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:58 crc kubenswrapper[4732]: I0131 09:01:58.627424 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:58Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:58 crc kubenswrapper[4732]: I0131 09:01:58.640978 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:58Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:58 crc kubenswrapper[4732]: I0131 09:01:58.650621 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nsgpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"533741c8-f72a-4834-ad02-d33fc939e529\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec4f6f1a4ba8fc3975dc3ef7e86d5b17b0488de7918fe672040f115f640b8569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjff8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.
11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nsgpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:58Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:58 crc kubenswrapper[4732]: I0131 09:01:58.660456 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bllbs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"80d87332-eaea-4007-a03e-a9a0f744563a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a371d8cbedbc18b4953520c3b135e102db92fc57883d31a8f78c510498ca41b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcdpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bllbs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:58Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:58 crc kubenswrapper[4732]: I0131 09:01:58.678816 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8521ee1e84f540faa7b093ecd0bb7761ae38487a6c7abf52028fe820ea9a4c1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fde05e46badc11b2174b88e2590402436bb669bcbd017ceb4d11d25066b421b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b26417bd7817c984a59723e2231e3a43c554022aef67eacf300adee0166c3db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c79f404e5bc3d6dc83681d68605b1f26565813ccec34c6d36ee9f21686ca4f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://075418297d972e23003b3ce2016f7709d138316c9d08837dc484dadf4d7f4a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://792e702a9fa66ea68366e8f88d976f7e280223815c4b70258566c32faaed57a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da2d4b479ec06e475664a1f5f1c4c052f87ad374f5616ea477c7a147ca896a16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da2d4b479ec06e475664a1f5f1c4c052f87ad374f5616ea477c7a147ca896a16\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:01:56Z\\\",\\\"message\\\":\\\"31T09:01:56Z is after 2025-08-24T17:21:41Z]\\\\nI0131 09:01:56.451406 6413 services_controller.go:451] Built service openshift-service-ca-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-service-ca-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-service-ca-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.40\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0131 09:01:56.451418 6413 services_controller.go:452] Built service openshift-operator-lifecycle-manager/package-server-manager-metrics per-node LB for network=default: []services.LB{}\\\\nI0131 09:01:56.451426 6413 services_controller.go:452] Built service openshift-service-ca-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI0131 09:01:56.451433 6413 services_controller.go:453] Built service openshift-service-ca-operator/metrics template LB for network=default: []services.LB{}\\\\nI0131 09:01:56.451434 6413 ser\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-8mtkt_openshift-ovn-kubernetes(82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cd519944f0ae0ee9920a6a54de3dd40c73a65d0c907f625c57d7648a6d0a715\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb08df78497d5ae0eb39af9a06378ea80914d9b4d7014dd85d930f54c733193\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfb08df78497d5ae0eb39af9a06378ea80914d9b4d7014dd85d930f54c733193\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8mtkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:58Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:58 crc kubenswrapper[4732]: I0131 09:01:58.693487 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1dc686f-fa2d-4cd4-95d4-874df2dc3a1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b5cbd45e357205f8bfc648396c0f684d6788343f687504feb45618d9082e5cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b8ba07937855cb1fd0e1db567f90e08848fca583b7174153918e2d6356d722\\\",\\\"i
mage\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ce42a35726dd1fc6aea0eb2e5baa490de04566cd6d6298185260e896a8def7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c276b6fdf2c47bfa38c4cf312bf6b893388c4291d306738fd6c31738cbd7483b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c05af9bd69451273c25ddddf6b11cdd7b8f9b6e3e16d18f1c4a962321905696e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0131 09:01:18.226954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:01:18.227544 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1858903392/tls.crt::/tmp/serving-cert-1858903392/tls.key\\\\\\\"\\\\nI0131 09:01:23.404020 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:01:23.413560 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:01:23.413612 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:01:23.413655 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:01:23.413760 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:01:23.423398 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 09:01:23.423434 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:01:23.423441 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:01:23.423448 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:01:23.423452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0131 09:01:23.423448 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 09:01:23.423457 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:01:23.423476 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 09:01:23.426339 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83c2048db4772c1a5f60b21340f773ab5c651045d4dac9edc15e91a19d15dce9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:58Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:58 crc kubenswrapper[4732]: I0131 09:01:58.705420 4732 status_manager.go:875] "Failed to update status 
for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:58Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:58 crc kubenswrapper[4732]: I0131 09:01:58.720462 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4mxsr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e23192f-14db-41ef-af89-4a76e325d9c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6af50b44afec0e5f0757efda1ff5567845c4482609a82bafcbf74947c657c56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsnx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4mxsr\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:58Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:58 crc kubenswrapper[4732]: I0131 09:01:58.721562 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:58 crc kubenswrapper[4732]: I0131 09:01:58.721622 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:58 crc kubenswrapper[4732]: I0131 09:01:58.721633 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:58 crc kubenswrapper[4732]: I0131 09:01:58.721649 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:58 crc kubenswrapper[4732]: I0131 09:01:58.721685 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:58Z","lastTransitionTime":"2026-01-31T09:01:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:58 crc kubenswrapper[4732]: I0131 09:01:58.735354 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t9kqf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2e6e0f4-2302-447f-a5e0-7db3d7b73cb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://038ff30d8a53b94c204f5dc0c72824fcb7f08a423eec2ce05554f23607d4dc7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerSta
tuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c71dc89758bff193a85ceeaa97b21ba6c93389ef9cf380bd0bac89bddddf014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c71dc89758bff193a85ceeaa97b21ba6c93389ef9cf380bd0bac89bddddf014\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4374e8014d4e67ab8df14785ed6d1bd78b33567550758297fcc04796ea5c688\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4374e8014d4e67ab8df14785ed6d1bd78b33567550758297fcc04796ea5c688\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85484d53f8180c3bc83391004e8a7c6b432af09f04f597567e8ffe0b9c2e3d58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85484d53f8180c3bc83391004e8a7c6b432af09f04f597567e8ffe0b9c2e3d58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:
01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://111b50c40cf72f105ae5c74d668673853a783050a62e89127fa666a6f8d431d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://111b50c40cf72f105ae5c74d668673853a783050a62e89127fa666a6f8d431d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2458247a5e615b929a15e280bb9d5a65c46a991ea2cd83502a8c42b1ca0e299b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2458247a5e615b929a15e280bb9d5a65c46a991ea2cd83502a8c42b1ca0e299b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a6ff3cd5564994b28608dad1cd0848c771db950208888ac8a2213fa5dc3a168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a6ff3cd5564994b28608dad1cd0848c771db950208888ac8a2213fa5dc3a168\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t9kqf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:58Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:58 crc kubenswrapper[4732]: I0131 09:01:58.747014 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7fgvm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bd29a31-1a47-40da-afc5-6c4423067083\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47jtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47jtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7fgvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:58Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:58 crc kubenswrapper[4732]: I0131 09:01:58.759358 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c063f211d6d06c3f8b94d316d68da346d077c6e85197754ec7c77ad736fd1217\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:58Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:58 crc kubenswrapper[4732]: I0131 09:01:58.774321 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d790207-d357-4b47-87bf-5b505e061820\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eaa2b856e1dedd7398d0c513f742fa5925c4748bf44eb15022adbb4b62354c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2bw2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad1f906b34fddd34efd9d099479cd1fe7404da4ab11f3f571f9e3120167505a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2bw2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jnbt8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:01:58Z is after 2025-08-24T17:21:41Z" Jan 31 09:01:58 crc kubenswrapper[4732]: I0131 09:01:58.824744 4732 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:58 crc kubenswrapper[4732]: I0131 09:01:58.824792 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:58 crc kubenswrapper[4732]: I0131 09:01:58.824804 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:58 crc kubenswrapper[4732]: I0131 09:01:58.824824 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:58 crc kubenswrapper[4732]: I0131 09:01:58.824835 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:58Z","lastTransitionTime":"2026-01-31T09:01:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:58 crc kubenswrapper[4732]: I0131 09:01:58.926555 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:58 crc kubenswrapper[4732]: I0131 09:01:58.926621 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:58 crc kubenswrapper[4732]: I0131 09:01:58.926639 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:58 crc kubenswrapper[4732]: I0131 09:01:58.926686 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:58 crc kubenswrapper[4732]: I0131 09:01:58.926709 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:58Z","lastTransitionTime":"2026-01-31T09:01:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:59 crc kubenswrapper[4732]: I0131 09:01:59.030143 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:59 crc kubenswrapper[4732]: I0131 09:01:59.030181 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:59 crc kubenswrapper[4732]: I0131 09:01:59.030214 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:59 crc kubenswrapper[4732]: I0131 09:01:59.030234 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:59 crc kubenswrapper[4732]: I0131 09:01:59.030248 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:59Z","lastTransitionTime":"2026-01-31T09:01:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:59 crc kubenswrapper[4732]: I0131 09:01:59.132963 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:59 crc kubenswrapper[4732]: I0131 09:01:59.133023 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:59 crc kubenswrapper[4732]: I0131 09:01:59.133040 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:59 crc kubenswrapper[4732]: I0131 09:01:59.133067 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:59 crc kubenswrapper[4732]: I0131 09:01:59.133088 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:59Z","lastTransitionTime":"2026-01-31T09:01:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:59 crc kubenswrapper[4732]: I0131 09:01:59.236259 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:59 crc kubenswrapper[4732]: I0131 09:01:59.236323 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:59 crc kubenswrapper[4732]: I0131 09:01:59.236333 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:59 crc kubenswrapper[4732]: I0131 09:01:59.236367 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:59 crc kubenswrapper[4732]: I0131 09:01:59.236380 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:59Z","lastTransitionTime":"2026-01-31T09:01:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:59 crc kubenswrapper[4732]: I0131 09:01:59.339407 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:59 crc kubenswrapper[4732]: I0131 09:01:59.339447 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:59 crc kubenswrapper[4732]: I0131 09:01:59.339461 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:59 crc kubenswrapper[4732]: I0131 09:01:59.339478 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:59 crc kubenswrapper[4732]: I0131 09:01:59.339489 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:59Z","lastTransitionTime":"2026-01-31T09:01:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:59 crc kubenswrapper[4732]: I0131 09:01:59.441553 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:59 crc kubenswrapper[4732]: I0131 09:01:59.441607 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:59 crc kubenswrapper[4732]: I0131 09:01:59.441617 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:59 crc kubenswrapper[4732]: I0131 09:01:59.441633 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:59 crc kubenswrapper[4732]: I0131 09:01:59.441643 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:59Z","lastTransitionTime":"2026-01-31T09:01:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:59 crc kubenswrapper[4732]: I0131 09:01:59.516414 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 15:58:02.548775864 +0000 UTC Jan 31 09:01:59 crc kubenswrapper[4732]: I0131 09:01:59.541725 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:01:59 crc kubenswrapper[4732]: I0131 09:01:59.541765 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7fgvm" Jan 31 09:01:59 crc kubenswrapper[4732]: I0131 09:01:59.541738 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:01:59 crc kubenswrapper[4732]: I0131 09:01:59.541839 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:01:59 crc kubenswrapper[4732]: E0131 09:01:59.541969 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:01:59 crc kubenswrapper[4732]: E0131 09:01:59.542028 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7fgvm" podUID="3bd29a31-1a47-40da-afc5-6c4423067083" Jan 31 09:01:59 crc kubenswrapper[4732]: E0131 09:01:59.542163 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:01:59 crc kubenswrapper[4732]: E0131 09:01:59.542224 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:01:59 crc kubenswrapper[4732]: I0131 09:01:59.543583 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:59 crc kubenswrapper[4732]: I0131 09:01:59.543614 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:59 crc kubenswrapper[4732]: I0131 09:01:59.543624 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:59 crc kubenswrapper[4732]: I0131 09:01:59.543637 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:59 crc kubenswrapper[4732]: I0131 09:01:59.543647 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:59Z","lastTransitionTime":"2026-01-31T09:01:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:59 crc kubenswrapper[4732]: I0131 09:01:59.646824 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:59 crc kubenswrapper[4732]: I0131 09:01:59.646885 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:59 crc kubenswrapper[4732]: I0131 09:01:59.646897 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:59 crc kubenswrapper[4732]: I0131 09:01:59.646918 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:59 crc kubenswrapper[4732]: I0131 09:01:59.646934 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:59Z","lastTransitionTime":"2026-01-31T09:01:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:01:59 crc kubenswrapper[4732]: I0131 09:01:59.749489 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:59 crc kubenswrapper[4732]: I0131 09:01:59.749551 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:59 crc kubenswrapper[4732]: I0131 09:01:59.749580 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:59 crc kubenswrapper[4732]: I0131 09:01:59.749596 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:59 crc kubenswrapper[4732]: I0131 09:01:59.749605 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:59Z","lastTransitionTime":"2026-01-31T09:01:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:59 crc kubenswrapper[4732]: I0131 09:01:59.853725 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:59 crc kubenswrapper[4732]: I0131 09:01:59.853786 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:59 crc kubenswrapper[4732]: I0131 09:01:59.853804 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:59 crc kubenswrapper[4732]: I0131 09:01:59.853829 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:59 crc kubenswrapper[4732]: I0131 09:01:59.853847 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:59Z","lastTransitionTime":"2026-01-31T09:01:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:01:59 crc kubenswrapper[4732]: I0131 09:01:59.956843 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:01:59 crc kubenswrapper[4732]: I0131 09:01:59.956912 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:01:59 crc kubenswrapper[4732]: I0131 09:01:59.956931 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:01:59 crc kubenswrapper[4732]: I0131 09:01:59.956959 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:01:59 crc kubenswrapper[4732]: I0131 09:01:59.956979 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:01:59Z","lastTransitionTime":"2026-01-31T09:01:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:00 crc kubenswrapper[4732]: I0131 09:02:00.059926 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:00 crc kubenswrapper[4732]: I0131 09:02:00.060054 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:00 crc kubenswrapper[4732]: I0131 09:02:00.060124 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:00 crc kubenswrapper[4732]: I0131 09:02:00.060154 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:00 crc kubenswrapper[4732]: I0131 09:02:00.060203 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:00Z","lastTransitionTime":"2026-01-31T09:02:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:00 crc kubenswrapper[4732]: I0131 09:02:00.163878 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:00 crc kubenswrapper[4732]: I0131 09:02:00.164801 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:00 crc kubenswrapper[4732]: I0131 09:02:00.164874 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:00 crc kubenswrapper[4732]: I0131 09:02:00.164899 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:00 crc kubenswrapper[4732]: I0131 09:02:00.164919 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:00Z","lastTransitionTime":"2026-01-31T09:02:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:00 crc kubenswrapper[4732]: I0131 09:02:00.268118 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:00 crc kubenswrapper[4732]: I0131 09:02:00.268574 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:00 crc kubenswrapper[4732]: I0131 09:02:00.268816 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:00 crc kubenswrapper[4732]: I0131 09:02:00.269068 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:00 crc kubenswrapper[4732]: I0131 09:02:00.269240 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:00Z","lastTransitionTime":"2026-01-31T09:02:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:00 crc kubenswrapper[4732]: I0131 09:02:00.372546 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:00 crc kubenswrapper[4732]: I0131 09:02:00.372600 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:00 crc kubenswrapper[4732]: I0131 09:02:00.372613 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:00 crc kubenswrapper[4732]: I0131 09:02:00.372631 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:00 crc kubenswrapper[4732]: I0131 09:02:00.372644 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:00Z","lastTransitionTime":"2026-01-31T09:02:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:00 crc kubenswrapper[4732]: I0131 09:02:00.475760 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:00 crc kubenswrapper[4732]: I0131 09:02:00.475807 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:00 crc kubenswrapper[4732]: I0131 09:02:00.475822 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:00 crc kubenswrapper[4732]: I0131 09:02:00.475848 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:00 crc kubenswrapper[4732]: I0131 09:02:00.475861 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:00Z","lastTransitionTime":"2026-01-31T09:02:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:00 crc kubenswrapper[4732]: I0131 09:02:00.517735 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 10:11:38.944197053 +0000 UTC Jan 31 09:02:00 crc kubenswrapper[4732]: I0131 09:02:00.578914 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:00 crc kubenswrapper[4732]: I0131 09:02:00.579307 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:00 crc kubenswrapper[4732]: I0131 09:02:00.579430 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:00 crc kubenswrapper[4732]: I0131 09:02:00.579574 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:00 crc kubenswrapper[4732]: I0131 09:02:00.579715 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:00Z","lastTransitionTime":"2026-01-31T09:02:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:00 crc kubenswrapper[4732]: I0131 09:02:00.683201 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:00 crc kubenswrapper[4732]: I0131 09:02:00.683259 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:00 crc kubenswrapper[4732]: I0131 09:02:00.683276 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:00 crc kubenswrapper[4732]: I0131 09:02:00.683299 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:00 crc kubenswrapper[4732]: I0131 09:02:00.683317 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:00Z","lastTransitionTime":"2026-01-31T09:02:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:00 crc kubenswrapper[4732]: I0131 09:02:00.786770 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:00 crc kubenswrapper[4732]: I0131 09:02:00.786826 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:00 crc kubenswrapper[4732]: I0131 09:02:00.786843 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:00 crc kubenswrapper[4732]: I0131 09:02:00.786871 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:00 crc kubenswrapper[4732]: I0131 09:02:00.786904 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:00Z","lastTransitionTime":"2026-01-31T09:02:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:00 crc kubenswrapper[4732]: I0131 09:02:00.826335 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:00 crc kubenswrapper[4732]: I0131 09:02:00.826403 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:00 crc kubenswrapper[4732]: I0131 09:02:00.826418 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:00 crc kubenswrapper[4732]: I0131 09:02:00.826443 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:00 crc kubenswrapper[4732]: I0131 09:02:00.826455 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:00Z","lastTransitionTime":"2026-01-31T09:02:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:00 crc kubenswrapper[4732]: E0131 09:02:00.846741 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:02:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:02:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:02:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:02:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:02:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:02:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:02:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:02:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2761c117-4c0c-4c53-891e-fc7b8fbd4017\\\",\\\"systemUUID\\\":\\\"5f9a5b0b-6336-4588-8df8-98fcbdc2a984\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:00Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:00 crc kubenswrapper[4732]: I0131 09:02:00.852024 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:00 crc kubenswrapper[4732]: I0131 09:02:00.852096 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 31 09:02:00 crc kubenswrapper[4732]: I0131 09:02:00.852117 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:00 crc kubenswrapper[4732]: I0131 09:02:00.852146 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:00 crc kubenswrapper[4732]: I0131 09:02:00.852167 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:00Z","lastTransitionTime":"2026-01-31T09:02:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:00 crc kubenswrapper[4732]: E0131 09:02:00.870160 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:02:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:02:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:02:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:02:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:02:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:02:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:02:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:02:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2761c117-4c0c-4c53-891e-fc7b8fbd4017\\\",\\\"systemUUID\\\":\\\"5f9a5b0b-6336-4588-8df8-98fcbdc2a984\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:00Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:00 crc kubenswrapper[4732]: I0131 09:02:00.876125 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:00 crc kubenswrapper[4732]: I0131 09:02:00.876193 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 31 09:02:00 crc kubenswrapper[4732]: I0131 09:02:00.876217 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:00 crc kubenswrapper[4732]: I0131 09:02:00.876248 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:00 crc kubenswrapper[4732]: I0131 09:02:00.876273 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:00Z","lastTransitionTime":"2026-01-31T09:02:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:00 crc kubenswrapper[4732]: E0131 09:02:00.894698 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:02:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:02:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:02:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:02:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:02:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:02:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:02:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:02:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2761c117-4c0c-4c53-891e-fc7b8fbd4017\\\",\\\"systemUUID\\\":\\\"5f9a5b0b-6336-4588-8df8-98fcbdc2a984\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:00Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:00 crc kubenswrapper[4732]: I0131 09:02:00.899758 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:00 crc kubenswrapper[4732]: I0131 09:02:00.899808 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 31 09:02:00 crc kubenswrapper[4732]: I0131 09:02:00.899824 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:00 crc kubenswrapper[4732]: I0131 09:02:00.899843 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:00 crc kubenswrapper[4732]: I0131 09:02:00.900185 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:00Z","lastTransitionTime":"2026-01-31T09:02:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:00 crc kubenswrapper[4732]: E0131 09:02:00.917520 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:02:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:02:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:02:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:02:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:02:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:02:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:02:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:02:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2761c117-4c0c-4c53-891e-fc7b8fbd4017\\\",\\\"systemUUID\\\":\\\"5f9a5b0b-6336-4588-8df8-98fcbdc2a984\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:00Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:00 crc kubenswrapper[4732]: I0131 09:02:00.922943 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:00 crc kubenswrapper[4732]: I0131 09:02:00.922997 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 31 09:02:00 crc kubenswrapper[4732]: I0131 09:02:00.923020 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:00 crc kubenswrapper[4732]: I0131 09:02:00.923049 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:00 crc kubenswrapper[4732]: I0131 09:02:00.923074 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:00Z","lastTransitionTime":"2026-01-31T09:02:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:00 crc kubenswrapper[4732]: E0131 09:02:00.950428 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:02:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:02:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:02:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:02:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:02:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:02:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:02:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:02:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2761c117-4c0c-4c53-891e-fc7b8fbd4017\\\",\\\"systemUUID\\\":\\\"5f9a5b0b-6336-4588-8df8-98fcbdc2a984\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:00Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:00 crc kubenswrapper[4732]: E0131 09:02:00.950719 4732 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 31 09:02:00 crc kubenswrapper[4732]: I0131 09:02:00.953012 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Jan 31 09:02:00 crc kubenswrapper[4732]: I0131 09:02:00.953085 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:00 crc kubenswrapper[4732]: I0131 09:02:00.953110 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:00 crc kubenswrapper[4732]: I0131 09:02:00.953144 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:00 crc kubenswrapper[4732]: I0131 09:02:00.953168 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:00Z","lastTransitionTime":"2026-01-31T09:02:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:01 crc kubenswrapper[4732]: I0131 09:02:01.056078 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:01 crc kubenswrapper[4732]: I0131 09:02:01.056113 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:01 crc kubenswrapper[4732]: I0131 09:02:01.056120 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:01 crc kubenswrapper[4732]: I0131 09:02:01.056136 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:01 crc kubenswrapper[4732]: I0131 09:02:01.056147 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:01Z","lastTransitionTime":"2026-01-31T09:02:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:01 crc kubenswrapper[4732]: I0131 09:02:01.158965 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:01 crc kubenswrapper[4732]: I0131 09:02:01.159024 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:01 crc kubenswrapper[4732]: I0131 09:02:01.159034 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:01 crc kubenswrapper[4732]: I0131 09:02:01.159051 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:01 crc kubenswrapper[4732]: I0131 09:02:01.159061 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:01Z","lastTransitionTime":"2026-01-31T09:02:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:01 crc kubenswrapper[4732]: I0131 09:02:01.261163 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:01 crc kubenswrapper[4732]: I0131 09:02:01.261227 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:01 crc kubenswrapper[4732]: I0131 09:02:01.261235 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:01 crc kubenswrapper[4732]: I0131 09:02:01.261249 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:01 crc kubenswrapper[4732]: I0131 09:02:01.261259 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:01Z","lastTransitionTime":"2026-01-31T09:02:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:01 crc kubenswrapper[4732]: I0131 09:02:01.363962 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:01 crc kubenswrapper[4732]: I0131 09:02:01.364026 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:01 crc kubenswrapper[4732]: I0131 09:02:01.364047 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:01 crc kubenswrapper[4732]: I0131 09:02:01.364076 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:01 crc kubenswrapper[4732]: I0131 09:02:01.364095 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:01Z","lastTransitionTime":"2026-01-31T09:02:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:01 crc kubenswrapper[4732]: I0131 09:02:01.466530 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:01 crc kubenswrapper[4732]: I0131 09:02:01.466573 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:01 crc kubenswrapper[4732]: I0131 09:02:01.466583 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:01 crc kubenswrapper[4732]: I0131 09:02:01.466602 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:01 crc kubenswrapper[4732]: I0131 09:02:01.466615 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:01Z","lastTransitionTime":"2026-01-31T09:02:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:01 crc kubenswrapper[4732]: I0131 09:02:01.518453 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 01:29:35.63470091 +0000 UTC Jan 31 09:02:01 crc kubenswrapper[4732]: I0131 09:02:01.541841 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:02:01 crc kubenswrapper[4732]: I0131 09:02:01.541841 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:02:01 crc kubenswrapper[4732]: E0131 09:02:01.541978 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:02:01 crc kubenswrapper[4732]: I0131 09:02:01.542089 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:02:01 crc kubenswrapper[4732]: E0131 09:02:01.542176 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:02:01 crc kubenswrapper[4732]: I0131 09:02:01.541861 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7fgvm" Jan 31 09:02:01 crc kubenswrapper[4732]: E0131 09:02:01.542271 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:02:01 crc kubenswrapper[4732]: E0131 09:02:01.542339 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7fgvm" podUID="3bd29a31-1a47-40da-afc5-6c4423067083" Jan 31 09:02:01 crc kubenswrapper[4732]: I0131 09:02:01.569354 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:01 crc kubenswrapper[4732]: I0131 09:02:01.569388 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:01 crc kubenswrapper[4732]: I0131 09:02:01.569397 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:01 crc kubenswrapper[4732]: I0131 09:02:01.569413 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:01 crc kubenswrapper[4732]: I0131 09:02:01.569423 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:01Z","lastTransitionTime":"2026-01-31T09:02:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:01 crc kubenswrapper[4732]: I0131 09:02:01.672211 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:01 crc kubenswrapper[4732]: I0131 09:02:01.672262 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:01 crc kubenswrapper[4732]: I0131 09:02:01.672275 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:01 crc kubenswrapper[4732]: I0131 09:02:01.672296 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:01 crc kubenswrapper[4732]: I0131 09:02:01.672336 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:01Z","lastTransitionTime":"2026-01-31T09:02:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:01 crc kubenswrapper[4732]: I0131 09:02:01.775078 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:01 crc kubenswrapper[4732]: I0131 09:02:01.775133 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:01 crc kubenswrapper[4732]: I0131 09:02:01.775146 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:01 crc kubenswrapper[4732]: I0131 09:02:01.775167 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:01 crc kubenswrapper[4732]: I0131 09:02:01.775181 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:01Z","lastTransitionTime":"2026-01-31T09:02:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:01 crc kubenswrapper[4732]: I0131 09:02:01.878019 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:01 crc kubenswrapper[4732]: I0131 09:02:01.878073 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:01 crc kubenswrapper[4732]: I0131 09:02:01.878087 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:01 crc kubenswrapper[4732]: I0131 09:02:01.878103 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:01 crc kubenswrapper[4732]: I0131 09:02:01.878113 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:01Z","lastTransitionTime":"2026-01-31T09:02:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:01 crc kubenswrapper[4732]: I0131 09:02:01.980278 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:01 crc kubenswrapper[4732]: I0131 09:02:01.980606 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:01 crc kubenswrapper[4732]: I0131 09:02:01.980615 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:01 crc kubenswrapper[4732]: I0131 09:02:01.980633 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:01 crc kubenswrapper[4732]: I0131 09:02:01.980644 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:01Z","lastTransitionTime":"2026-01-31T09:02:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:02 crc kubenswrapper[4732]: I0131 09:02:02.083936 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:02 crc kubenswrapper[4732]: I0131 09:02:02.083997 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:02 crc kubenswrapper[4732]: I0131 09:02:02.084016 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:02 crc kubenswrapper[4732]: I0131 09:02:02.084037 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:02 crc kubenswrapper[4732]: I0131 09:02:02.084048 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:02Z","lastTransitionTime":"2026-01-31T09:02:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:02 crc kubenswrapper[4732]: I0131 09:02:02.186963 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:02 crc kubenswrapper[4732]: I0131 09:02:02.187061 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:02 crc kubenswrapper[4732]: I0131 09:02:02.187080 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:02 crc kubenswrapper[4732]: I0131 09:02:02.187106 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:02 crc kubenswrapper[4732]: I0131 09:02:02.187123 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:02Z","lastTransitionTime":"2026-01-31T09:02:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:02 crc kubenswrapper[4732]: I0131 09:02:02.290370 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:02 crc kubenswrapper[4732]: I0131 09:02:02.291146 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:02 crc kubenswrapper[4732]: I0131 09:02:02.291268 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:02 crc kubenswrapper[4732]: I0131 09:02:02.291351 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:02 crc kubenswrapper[4732]: I0131 09:02:02.291432 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:02Z","lastTransitionTime":"2026-01-31T09:02:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:02 crc kubenswrapper[4732]: I0131 09:02:02.393554 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:02 crc kubenswrapper[4732]: I0131 09:02:02.393593 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:02 crc kubenswrapper[4732]: I0131 09:02:02.393604 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:02 crc kubenswrapper[4732]: I0131 09:02:02.393621 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:02 crc kubenswrapper[4732]: I0131 09:02:02.393633 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:02Z","lastTransitionTime":"2026-01-31T09:02:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:02 crc kubenswrapper[4732]: I0131 09:02:02.496826 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:02 crc kubenswrapper[4732]: I0131 09:02:02.496879 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:02 crc kubenswrapper[4732]: I0131 09:02:02.496892 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:02 crc kubenswrapper[4732]: I0131 09:02:02.496910 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:02 crc kubenswrapper[4732]: I0131 09:02:02.496923 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:02Z","lastTransitionTime":"2026-01-31T09:02:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:02 crc kubenswrapper[4732]: I0131 09:02:02.519653 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 05:21:44.064431942 +0000 UTC Jan 31 09:02:02 crc kubenswrapper[4732]: I0131 09:02:02.554916 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c063f211d6d06c3f8b94d316d68da346d077c6e85197754ec7c77ad736fd1217\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:02Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:02 crc kubenswrapper[4732]: I0131 09:02:02.565764 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d790207-d357-4b47-87bf-5b505e061820\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eaa2b856e1dedd7398d0c513f742fa5925c4748bf44eb15022adbb4b62354c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2bw2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad1f906b34fddd34efd9d099479cd1fe7404da4ab11f3f571f9e3120167505a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2bw2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jnbt8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:02Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:02 crc kubenswrapper[4732]: I0131 09:02:02.576598 4732 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bb621da-40b8-4a07-a7bd-06800007bc59\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dadc6987ef6ef2593fa8aa0ec3903dc3cfea907f73fb68dca2c141362577bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e897d26ac7b103c21a9cb176f1a90a11098a9a2f6a4cd28f697a90ee7c9f6f9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a31769db7f40442ff410b053055e413b11d7dba7d48dfce853ca38e7b9f7595e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":
[{\\\"containerID\\\":\\\"cri-o://407516efcd2436b964fddea3bdc778826cde289422139bc4577c9ba8c0c43675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://407516efcd2436b964fddea3bdc778826cde289422139bc4577c9ba8c0c43675\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:02Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:02 crc kubenswrapper[4732]: I0131 09:02:02.594196 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58b600ed-9e96-4296-bc41-dda5d2205921\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f14a98662959fe8132486d2f648fb443a407e90160ee8c6eac7e2a3d6835b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://771af18934041722ccdbde47137c83e9d93c5e7a07b1e9b26e18354bcd2fda4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0
7b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2bea5befd94ee4d3b6501273e8b8365102a35cc07b97740fc893f8d70a9d134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9fd5264b5a2dcec6f8b363208d5654f2d7199660f04be408ee7ab3913d542e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://194a43d5e1409abb031f1b950dfc753454b45008067d784642945ea0dfb087e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://382f1fa043705a0098e27ea34beeee0b88276aa355a7de07bc29b070949b81e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca
\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382f1fa043705a0098e27ea34beeee0b88276aa355a7de07bc29b070949b81e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b16d2732d6d6381045b88446a322622a7b1271f08438cb60e57c05c52ef7faf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b16d2732d6d6381045b88446a322622a7b1271f08438cb60e57c05c52ef7faf8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://273f5652d37e2be568bc388a7f9e96c66d0ce9201678b46736986815bd25d6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://273f5652d37e2be568bc388a7f9e96c66d0ce9201678b46736986815bd25d6bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:02Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:02 crc kubenswrapper[4732]: I0131 09:02:02.600152 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:02 crc kubenswrapper[4732]: I0131 09:02:02.600192 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:02 crc kubenswrapper[4732]: I0131 09:02:02.600200 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Jan 31 09:02:02 crc kubenswrapper[4732]: I0131 09:02:02.600219 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:02 crc kubenswrapper[4732]: I0131 09:02:02.600229 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:02Z","lastTransitionTime":"2026-01-31T09:02:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:02 crc kubenswrapper[4732]: I0131 09:02:02.607198 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62650e77e5971d617a77ae7461d51276fc76e3699dc4f86ebbb9b63ab0d86b60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:02Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:02 crc kubenswrapper[4732]: I0131 09:02:02.619493 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7423f7d2a82ae276cff0f5a2730b0de7abbc5b28a8bd8983d23fe44c2e9ca3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b075971e7f28f40b90caae8b3681dd283860371a6b3f82c92c421a87a27ccb0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:02Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:02 crc kubenswrapper[4732]: I0131 09:02:02.630889 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gchqk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0313609d-3507-4db5-a190-9dbf59d73e6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b303be38f64266f4e019d30e8b988945133fc76a47e305429ed048d68cdeac76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-phqth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8a2d9e1c8db61a2418e981340a7cb999983d4a10e79977507abdf3bdb471939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-phqth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gchqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:02Z is after 2025-08-24T17:21:41Z" Jan 31 
09:02:02 crc kubenswrapper[4732]: I0131 09:02:02.642573 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1add5e75-7e0e-4248-9431-256dc477beeb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7bd726ccae341dfff351d4170276b74d4b6b4f34806b7e1fb23ab517201928d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd0e75b86e540472dde92c0b2c29d3af05e052a4e6a01fd03cd16cce663bd5da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6678cf179a4e13991aa6a082456b3497a4665e0999912b8002fb84a0b6ca2ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\
",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55d09d44253b8c77894abc55d80843b79509bf7213c0bf8b4d69059d53e76e5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:02Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:02 crc kubenswrapper[4732]: I0131 09:02:02.654301 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nsgpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"533741c8-f72a-4834-ad02-d33fc939e529\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec4f6f1a4ba8fc3975dc3ef7e86d5b17b0488de7918fe672040f115f640b8569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjff8\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nsgpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:02Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:02 crc kubenswrapper[4732]: I0131 09:02:02.665742 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bllbs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"80d87332-eaea-4007-a03e-a9a0f744563a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a371d8cbedbc18b4953520c3b135e102db92fc57883d31a8f78c510498ca41b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcdpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bllbs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:02Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:02 crc kubenswrapper[4732]: I0131 09:02:02.688309 4732 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8521ee1e84f540faa7b093ecd0bb7761ae38487a6c7abf52028fe820ea9a4c1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fde05e46badc11b2174b88e2590402436bb669bcbd017ceb4d11d25066b421b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b26417bd7817c984a59723e2231e3a43c554022aef67eacf300adee0166c3db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36c
dd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c79f404e5bc3d6dc83681d68605b1f26565813ccec34c6d36ee9f21686ca4f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://075418297d972e23003b3ce2016f7709d138316c9d08837dc484dadf4d7f4a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://792e702a9fa66ea68366e8f88d976f7e280223815c4b70258566c32faaed57a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-con
troller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da2d4b479ec06e475664a1f5f1c4c052f87ad374f5616ea477c7a147ca896a16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da2d4b479ec06e475664a1f5f1c4c052f87ad374f5616ea477c7a147ca896a16\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:01:56Z\\\",\\\"message\\\":\\\"31T09:01:56Z is after 2025-08-24T17:21:41Z]\\\\nI0131 09:01:56.451406 6413 services_controller.go:451] Built service openshift-service-ca-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-service-ca-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-service-ca-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.40\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0131 09:01:56.451418 6413 services_controller.go:452] Built service openshift-operator-lifecycle-manager/package-server-manager-metrics per-node LB for network=default: []services.LB{}\\\\nI0131 09:01:56.451426 6413 services_controller.go:452] Built service openshift-service-ca-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI0131 09:01:56.451433 6413 services_controller.go:453] Built service openshift-service-ca-operator/metrics template LB for network=default: []services.LB{}\\\\nI0131 09:01:56.451434 6413 ser\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-8mtkt_openshift-ovn-kubernetes(82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cd519944f0ae0ee9920a6a54de3dd40c73a65d0c907f625c57d7648a6d0a715\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb08df78497d5ae0eb39af9a06378ea80914d9b4d7014dd85d930f54c733193\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfb08df78497d5ae0eb39af9a06378ea80914d9b4d7014dd85d930f54c733193\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8mtkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:02Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:02 crc kubenswrapper[4732]: I0131 09:02:02.701042 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:02Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:02 crc kubenswrapper[4732]: I0131 09:02:02.707231 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:02 crc kubenswrapper[4732]: I0131 09:02:02.707357 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:02 crc kubenswrapper[4732]: I0131 09:02:02.707382 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:02 crc kubenswrapper[4732]: I0131 09:02:02.707408 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:02 crc kubenswrapper[4732]: I0131 09:02:02.707421 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:02Z","lastTransitionTime":"2026-01-31T09:02:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:02 crc kubenswrapper[4732]: I0131 09:02:02.719547 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:02Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:02 crc kubenswrapper[4732]: I0131 09:02:02.734466 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4mxsr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e23192f-14db-41ef-af89-4a76e325d9c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6af50b44afec0e5f0757efda1ff5567845c4482609a82bafcbf74947c657c56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsnx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4mxsr\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:02Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:02 crc kubenswrapper[4732]: I0131 09:02:02.749598 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t9kqf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2e6e0f4-2302-447f-a5e0-7db3d7b73cb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://038ff30d8a53b94c204f5dc0c72824fcb7f08a423eec2ce05554f23607d4dc7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c71dc89758bff193a85ceeaa97b21ba6c93389ef9cf380bd0bac89bddddf014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c71dc89758bff193a85ceeaa97b21ba6c93389ef9cf380bd0bac89bddddf014\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4374e8014d4e67ab8df14785ed6d1bd78b33567550758297fcc04796ea5c688\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4374e8014d4e67ab8df14785ed6d1bd78b33567550758297fcc04796ea5c688\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85484d53f8180c3bc83391004e8a7c6b432af09f04f597567e8ffe0b9c2e3d58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85484d53f8180c3bc83391004e8a7c6b432af09f04f597567e8ffe0b9c2e3d58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://111b50c40cf72f105ae5c74d668673853a783050a62e89127fa666a6f8d431d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://111b50c40cf72f105ae5c74d668673853a783050a62e89127fa666a6f8d431d0\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2026-01-31T09:01:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2458247a5e615b929a15e280bb9d5a65c46a991ea2cd83502a8c42b1ca0e299b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2458247a5e615b929a15e280bb9d5a65c46a991ea2cd83502a8c42b1ca0e299b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a6ff3cd5564994b28608dad1cd0848c771db950208888ac8a2213fa5dc3a168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a6ff3cd5564994b28608dad1cd0848c771db950208888ac8a2213fa5dc3a168\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t9kqf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-31T09:02:02Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:02 crc kubenswrapper[4732]: I0131 09:02:02.763378 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7fgvm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bd29a31-1a47-40da-afc5-6c4423067083\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47jtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47jtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7fgvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:02Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:02 crc kubenswrapper[4732]: I0131 09:02:02.780160 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1dc686f-fa2d-4cd4-95d4-874df2dc3a1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b5cbd45e357205f8bfc648396c0f684d6788343f687504feb45618d9082e5cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b8ba07937855cb1fd0e1db567f90e08848fca583b7174153918e2d6356d722\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ce42a35726dd1fc6aea0eb2e5baa490de04566cd6d6298185260e896a8def7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c276b6fdf2c47bfa38c4cf312bf6b893388c4291d306738fd6c31738cbd7483b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserve
r-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c05af9bd69451273c25ddddf6b11cdd7b8f9b6e3e16d18f1c4a962321905696e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0131 09:01:18.226954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:01:18.227544 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1858903392/tls.crt::/tmp/serving-cert-1858903392/tls.key\\\\\\\"\\\\nI0131 09:01:23.404020 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:01:23.413560 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:01:23.413612 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:01:23.413655 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:01:23.413760 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:01:23.423398 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 09:01:23.423434 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:01:23.423441 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:01:23.423448 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:01:23.423452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0131 09:01:23.423448 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 09:01:23.423457 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:01:23.423476 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 09:01:23.426339 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83c2048db4772c1a5f60b21340f773ab5c651045d4dac9edc15e91a19d15dce9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:02Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:02 crc kubenswrapper[4732]: I0131 09:02:02.790399 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:02Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:02 crc kubenswrapper[4732]: I0131 09:02:02.808853 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:02 crc kubenswrapper[4732]: I0131 09:02:02.808894 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:02 crc kubenswrapper[4732]: I0131 09:02:02.808906 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:02 crc kubenswrapper[4732]: I0131 09:02:02.808921 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:02 crc kubenswrapper[4732]: I0131 09:02:02.808932 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:02Z","lastTransitionTime":"2026-01-31T09:02:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 31 09:02:02 crc kubenswrapper[4732]: I0131 09:02:02.912025 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 09:02:02 crc kubenswrapper[4732]: I0131 09:02:02.912078 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 09:02:02 crc kubenswrapper[4732]: I0131 09:02:02.912091 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 09:02:02 crc kubenswrapper[4732]: I0131 09:02:02.912111 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 09:02:02 crc kubenswrapper[4732]: I0131 09:02:02.912124 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:02Z","lastTransitionTime":"2026-01-31T09:02:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 31 09:02:03 crc kubenswrapper[4732]: I0131 09:02:03.016070 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 09:02:03 crc kubenswrapper[4732]: I0131 09:02:03.016126 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 09:02:03 crc kubenswrapper[4732]: I0131 09:02:03.016139 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 09:02:03 crc kubenswrapper[4732]: I0131 09:02:03.016160 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 09:02:03 crc kubenswrapper[4732]: I0131 09:02:03.016172 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:03Z","lastTransitionTime":"2026-01-31T09:02:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 31 09:02:03 crc kubenswrapper[4732]: I0131 09:02:03.119048 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 09:02:03 crc kubenswrapper[4732]: I0131 09:02:03.119333 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 09:02:03 crc kubenswrapper[4732]: I0131 09:02:03.119403 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 09:02:03 crc kubenswrapper[4732]: I0131 09:02:03.119845 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 09:02:03 crc kubenswrapper[4732]: I0131 09:02:03.119932 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:03Z","lastTransitionTime":"2026-01-31T09:02:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 31 09:02:03 crc kubenswrapper[4732]: I0131 09:02:03.222597 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 09:02:03 crc kubenswrapper[4732]: I0131 09:02:03.222635 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 09:02:03 crc kubenswrapper[4732]: I0131 09:02:03.222646 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 09:02:03 crc kubenswrapper[4732]: I0131 09:02:03.222661 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 09:02:03 crc kubenswrapper[4732]: I0131 09:02:03.222683 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:03Z","lastTransitionTime":"2026-01-31T09:02:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 31 09:02:03 crc kubenswrapper[4732]: I0131 09:02:03.326032 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 09:02:03 crc kubenswrapper[4732]: I0131 09:02:03.327109 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 09:02:03 crc kubenswrapper[4732]: I0131 09:02:03.327207 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 09:02:03 crc kubenswrapper[4732]: I0131 09:02:03.327306 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 09:02:03 crc kubenswrapper[4732]: I0131 09:02:03.327376 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:03Z","lastTransitionTime":"2026-01-31T09:02:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 31 09:02:03 crc kubenswrapper[4732]: I0131 09:02:03.429914 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 09:02:03 crc kubenswrapper[4732]: I0131 09:02:03.429961 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 09:02:03 crc kubenswrapper[4732]: I0131 09:02:03.429970 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 09:02:03 crc kubenswrapper[4732]: I0131 09:02:03.429989 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 09:02:03 crc kubenswrapper[4732]: I0131 09:02:03.430000 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:03Z","lastTransitionTime":"2026-01-31T09:02:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 31 09:02:03 crc kubenswrapper[4732]: I0131 09:02:03.520071 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-11 00:09:32.861452927 +0000 UTC
Jan 31 09:02:03 crc kubenswrapper[4732]: I0131 09:02:03.533547 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 09:02:03 crc kubenswrapper[4732]: I0131 09:02:03.533601 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 09:02:03 crc kubenswrapper[4732]: I0131 09:02:03.533613 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 09:02:03 crc kubenswrapper[4732]: I0131 09:02:03.533641 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 09:02:03 crc kubenswrapper[4732]: I0131 09:02:03.533692 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:03Z","lastTransitionTime":"2026-01-31T09:02:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 31 09:02:03 crc kubenswrapper[4732]: I0131 09:02:03.542351 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7fgvm"
Jan 31 09:02:03 crc kubenswrapper[4732]: I0131 09:02:03.542402 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 31 09:02:03 crc kubenswrapper[4732]: I0131 09:02:03.542385 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 31 09:02:03 crc kubenswrapper[4732]: I0131 09:02:03.542359 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 31 09:02:03 crc kubenswrapper[4732]: E0131 09:02:03.542553 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7fgvm" podUID="3bd29a31-1a47-40da-afc5-6c4423067083"
Jan 31 09:02:03 crc kubenswrapper[4732]: E0131 09:02:03.542701 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 31 09:02:03 crc kubenswrapper[4732]: E0131 09:02:03.542787 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 31 09:02:03 crc kubenswrapper[4732]: E0131 09:02:03.542840 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 31 09:02:03 crc kubenswrapper[4732]: I0131 09:02:03.636002 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 09:02:03 crc kubenswrapper[4732]: I0131 09:02:03.636034 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 09:02:03 crc kubenswrapper[4732]: I0131 09:02:03.636045 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 09:02:03 crc kubenswrapper[4732]: I0131 09:02:03.636063 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 09:02:03 crc kubenswrapper[4732]: I0131 09:02:03.636075 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:03Z","lastTransitionTime":"2026-01-31T09:02:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 31 09:02:03 crc kubenswrapper[4732]: I0131 09:02:03.738892 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 09:02:03 crc kubenswrapper[4732]: I0131 09:02:03.738951 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 09:02:03 crc kubenswrapper[4732]: I0131 09:02:03.738967 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 09:02:03 crc kubenswrapper[4732]: I0131 09:02:03.738993 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 09:02:03 crc kubenswrapper[4732]: I0131 09:02:03.739012 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:03Z","lastTransitionTime":"2026-01-31T09:02:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 31 09:02:03 crc kubenswrapper[4732]: I0131 09:02:03.841404 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 09:02:03 crc kubenswrapper[4732]: I0131 09:02:03.841480 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 09:02:03 crc kubenswrapper[4732]: I0131 09:02:03.841508 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 09:02:03 crc kubenswrapper[4732]: I0131 09:02:03.841537 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 09:02:03 crc kubenswrapper[4732]: I0131 09:02:03.841555 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:03Z","lastTransitionTime":"2026-01-31T09:02:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 31 09:02:03 crc kubenswrapper[4732]: I0131 09:02:03.943904 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 09:02:03 crc kubenswrapper[4732]: I0131 09:02:03.943968 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 09:02:03 crc kubenswrapper[4732]: I0131 09:02:03.943979 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 09:02:03 crc kubenswrapper[4732]: I0131 09:02:03.943999 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 09:02:03 crc kubenswrapper[4732]: I0131 09:02:03.944011 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:03Z","lastTransitionTime":"2026-01-31T09:02:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 31 09:02:04 crc kubenswrapper[4732]: I0131 09:02:04.047005 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 31 09:02:04 crc kubenswrapper[4732]: I0131 09:02:04.047421 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 31 09:02:04 crc kubenswrapper[4732]: I0131 09:02:04.047657 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 31 09:02:04 crc kubenswrapper[4732]: I0131 09:02:04.047944 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 31 09:02:04 crc kubenswrapper[4732]: I0131 09:02:04.048300 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:04Z","lastTransitionTime":"2026-01-31T09:02:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:04 crc kubenswrapper[4732]: I0131 09:02:04.152143 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:04 crc kubenswrapper[4732]: I0131 09:02:04.152196 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:04 crc kubenswrapper[4732]: I0131 09:02:04.152205 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:04 crc kubenswrapper[4732]: I0131 09:02:04.152222 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:04 crc kubenswrapper[4732]: I0131 09:02:04.152233 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:04Z","lastTransitionTime":"2026-01-31T09:02:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:04 crc kubenswrapper[4732]: I0131 09:02:04.255720 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:04 crc kubenswrapper[4732]: I0131 09:02:04.255790 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:04 crc kubenswrapper[4732]: I0131 09:02:04.255812 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:04 crc kubenswrapper[4732]: I0131 09:02:04.255839 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:04 crc kubenswrapper[4732]: I0131 09:02:04.255857 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:04Z","lastTransitionTime":"2026-01-31T09:02:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:04 crc kubenswrapper[4732]: I0131 09:02:04.358520 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:04 crc kubenswrapper[4732]: I0131 09:02:04.358608 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:04 crc kubenswrapper[4732]: I0131 09:02:04.358631 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:04 crc kubenswrapper[4732]: I0131 09:02:04.358708 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:04 crc kubenswrapper[4732]: I0131 09:02:04.358744 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:04Z","lastTransitionTime":"2026-01-31T09:02:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:04 crc kubenswrapper[4732]: I0131 09:02:04.461989 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:04 crc kubenswrapper[4732]: I0131 09:02:04.462056 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:04 crc kubenswrapper[4732]: I0131 09:02:04.462081 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:04 crc kubenswrapper[4732]: I0131 09:02:04.462109 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:04 crc kubenswrapper[4732]: I0131 09:02:04.462131 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:04Z","lastTransitionTime":"2026-01-31T09:02:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:04 crc kubenswrapper[4732]: I0131 09:02:04.521381 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 01:13:41.288329444 +0000 UTC Jan 31 09:02:04 crc kubenswrapper[4732]: I0131 09:02:04.565157 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:04 crc kubenswrapper[4732]: I0131 09:02:04.565229 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:04 crc kubenswrapper[4732]: I0131 09:02:04.565249 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:04 crc kubenswrapper[4732]: I0131 09:02:04.565278 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:04 crc kubenswrapper[4732]: I0131 09:02:04.565302 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:04Z","lastTransitionTime":"2026-01-31T09:02:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:04 crc kubenswrapper[4732]: I0131 09:02:04.668449 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:04 crc kubenswrapper[4732]: I0131 09:02:04.668497 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:04 crc kubenswrapper[4732]: I0131 09:02:04.668507 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:04 crc kubenswrapper[4732]: I0131 09:02:04.668522 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:04 crc kubenswrapper[4732]: I0131 09:02:04.668532 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:04Z","lastTransitionTime":"2026-01-31T09:02:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:04 crc kubenswrapper[4732]: I0131 09:02:04.772450 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:04 crc kubenswrapper[4732]: I0131 09:02:04.772505 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:04 crc kubenswrapper[4732]: I0131 09:02:04.772520 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:04 crc kubenswrapper[4732]: I0131 09:02:04.772542 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:04 crc kubenswrapper[4732]: I0131 09:02:04.772556 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:04Z","lastTransitionTime":"2026-01-31T09:02:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:04 crc kubenswrapper[4732]: I0131 09:02:04.875608 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:04 crc kubenswrapper[4732]: I0131 09:02:04.875946 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:04 crc kubenswrapper[4732]: I0131 09:02:04.876053 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:04 crc kubenswrapper[4732]: I0131 09:02:04.876155 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:04 crc kubenswrapper[4732]: I0131 09:02:04.876252 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:04Z","lastTransitionTime":"2026-01-31T09:02:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:04 crc kubenswrapper[4732]: I0131 09:02:04.979562 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:04 crc kubenswrapper[4732]: I0131 09:02:04.979611 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:04 crc kubenswrapper[4732]: I0131 09:02:04.979626 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:04 crc kubenswrapper[4732]: I0131 09:02:04.979647 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:04 crc kubenswrapper[4732]: I0131 09:02:04.979713 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:04Z","lastTransitionTime":"2026-01-31T09:02:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:05 crc kubenswrapper[4732]: I0131 09:02:05.082508 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:05 crc kubenswrapper[4732]: I0131 09:02:05.082556 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:05 crc kubenswrapper[4732]: I0131 09:02:05.082571 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:05 crc kubenswrapper[4732]: I0131 09:02:05.082589 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:05 crc kubenswrapper[4732]: I0131 09:02:05.082600 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:05Z","lastTransitionTime":"2026-01-31T09:02:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:05 crc kubenswrapper[4732]: I0131 09:02:05.185169 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:05 crc kubenswrapper[4732]: I0131 09:02:05.185212 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:05 crc kubenswrapper[4732]: I0131 09:02:05.185227 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:05 crc kubenswrapper[4732]: I0131 09:02:05.185244 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:05 crc kubenswrapper[4732]: I0131 09:02:05.185257 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:05Z","lastTransitionTime":"2026-01-31T09:02:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:05 crc kubenswrapper[4732]: I0131 09:02:05.288054 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:05 crc kubenswrapper[4732]: I0131 09:02:05.288120 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:05 crc kubenswrapper[4732]: I0131 09:02:05.288137 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:05 crc kubenswrapper[4732]: I0131 09:02:05.288162 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:05 crc kubenswrapper[4732]: I0131 09:02:05.288178 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:05Z","lastTransitionTime":"2026-01-31T09:02:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:05 crc kubenswrapper[4732]: I0131 09:02:05.391930 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:05 crc kubenswrapper[4732]: I0131 09:02:05.391965 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:05 crc kubenswrapper[4732]: I0131 09:02:05.391979 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:05 crc kubenswrapper[4732]: I0131 09:02:05.391997 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:05 crc kubenswrapper[4732]: I0131 09:02:05.392008 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:05Z","lastTransitionTime":"2026-01-31T09:02:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:05 crc kubenswrapper[4732]: I0131 09:02:05.495214 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:05 crc kubenswrapper[4732]: I0131 09:02:05.495254 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:05 crc kubenswrapper[4732]: I0131 09:02:05.495267 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:05 crc kubenswrapper[4732]: I0131 09:02:05.495284 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:05 crc kubenswrapper[4732]: I0131 09:02:05.495293 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:05Z","lastTransitionTime":"2026-01-31T09:02:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:05 crc kubenswrapper[4732]: I0131 09:02:05.521607 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 08:19:56.918615212 +0000 UTC Jan 31 09:02:05 crc kubenswrapper[4732]: I0131 09:02:05.542013 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7fgvm" Jan 31 09:02:05 crc kubenswrapper[4732]: E0131 09:02:05.542159 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7fgvm" podUID="3bd29a31-1a47-40da-afc5-6c4423067083" Jan 31 09:02:05 crc kubenswrapper[4732]: I0131 09:02:05.542528 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:02:05 crc kubenswrapper[4732]: E0131 09:02:05.542576 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:02:05 crc kubenswrapper[4732]: I0131 09:02:05.542609 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:02:05 crc kubenswrapper[4732]: E0131 09:02:05.542647 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:02:05 crc kubenswrapper[4732]: I0131 09:02:05.542707 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:02:05 crc kubenswrapper[4732]: E0131 09:02:05.542761 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:02:05 crc kubenswrapper[4732]: I0131 09:02:05.598259 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:05 crc kubenswrapper[4732]: I0131 09:02:05.598288 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:05 crc kubenswrapper[4732]: I0131 09:02:05.598297 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:05 crc kubenswrapper[4732]: I0131 09:02:05.598312 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:05 crc kubenswrapper[4732]: I0131 09:02:05.598322 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:05Z","lastTransitionTime":"2026-01-31T09:02:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:05 crc kubenswrapper[4732]: I0131 09:02:05.700544 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:05 crc kubenswrapper[4732]: I0131 09:02:05.700572 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:05 crc kubenswrapper[4732]: I0131 09:02:05.700581 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:05 crc kubenswrapper[4732]: I0131 09:02:05.700595 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:05 crc kubenswrapper[4732]: I0131 09:02:05.700606 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:05Z","lastTransitionTime":"2026-01-31T09:02:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:05 crc kubenswrapper[4732]: I0131 09:02:05.803388 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:05 crc kubenswrapper[4732]: I0131 09:02:05.803424 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:05 crc kubenswrapper[4732]: I0131 09:02:05.803434 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:05 crc kubenswrapper[4732]: I0131 09:02:05.803451 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:05 crc kubenswrapper[4732]: I0131 09:02:05.803461 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:05Z","lastTransitionTime":"2026-01-31T09:02:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:05 crc kubenswrapper[4732]: I0131 09:02:05.906183 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:05 crc kubenswrapper[4732]: I0131 09:02:05.906219 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:05 crc kubenswrapper[4732]: I0131 09:02:05.906228 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:05 crc kubenswrapper[4732]: I0131 09:02:05.906245 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:05 crc kubenswrapper[4732]: I0131 09:02:05.906256 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:05Z","lastTransitionTime":"2026-01-31T09:02:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:06 crc kubenswrapper[4732]: I0131 09:02:06.009166 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:06 crc kubenswrapper[4732]: I0131 09:02:06.009202 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:06 crc kubenswrapper[4732]: I0131 09:02:06.009211 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:06 crc kubenswrapper[4732]: I0131 09:02:06.009228 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:06 crc kubenswrapper[4732]: I0131 09:02:06.009237 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:06Z","lastTransitionTime":"2026-01-31T09:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:06 crc kubenswrapper[4732]: I0131 09:02:06.112699 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:06 crc kubenswrapper[4732]: I0131 09:02:06.112776 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:06 crc kubenswrapper[4732]: I0131 09:02:06.112791 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:06 crc kubenswrapper[4732]: I0131 09:02:06.112811 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:06 crc kubenswrapper[4732]: I0131 09:02:06.112826 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:06Z","lastTransitionTime":"2026-01-31T09:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:06 crc kubenswrapper[4732]: I0131 09:02:06.216175 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:06 crc kubenswrapper[4732]: I0131 09:02:06.216250 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:06 crc kubenswrapper[4732]: I0131 09:02:06.216266 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:06 crc kubenswrapper[4732]: I0131 09:02:06.216292 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:06 crc kubenswrapper[4732]: I0131 09:02:06.216310 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:06Z","lastTransitionTime":"2026-01-31T09:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:06 crc kubenswrapper[4732]: I0131 09:02:06.319574 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:06 crc kubenswrapper[4732]: I0131 09:02:06.319654 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:06 crc kubenswrapper[4732]: I0131 09:02:06.319717 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:06 crc kubenswrapper[4732]: I0131 09:02:06.319757 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:06 crc kubenswrapper[4732]: I0131 09:02:06.319779 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:06Z","lastTransitionTime":"2026-01-31T09:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:06 crc kubenswrapper[4732]: I0131 09:02:06.423405 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:06 crc kubenswrapper[4732]: I0131 09:02:06.423468 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:06 crc kubenswrapper[4732]: I0131 09:02:06.423486 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:06 crc kubenswrapper[4732]: I0131 09:02:06.423512 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:06 crc kubenswrapper[4732]: I0131 09:02:06.423529 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:06Z","lastTransitionTime":"2026-01-31T09:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:06 crc kubenswrapper[4732]: I0131 09:02:06.522806 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 06:07:28.247750609 +0000 UTC Jan 31 09:02:06 crc kubenswrapper[4732]: I0131 09:02:06.526090 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:06 crc kubenswrapper[4732]: I0131 09:02:06.526170 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:06 crc kubenswrapper[4732]: I0131 09:02:06.526220 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:06 crc kubenswrapper[4732]: I0131 09:02:06.526247 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:06 crc kubenswrapper[4732]: I0131 09:02:06.526259 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:06Z","lastTransitionTime":"2026-01-31T09:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:06 crc kubenswrapper[4732]: I0131 09:02:06.629273 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:06 crc kubenswrapper[4732]: I0131 09:02:06.629350 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:06 crc kubenswrapper[4732]: I0131 09:02:06.629363 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:06 crc kubenswrapper[4732]: I0131 09:02:06.629382 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:06 crc kubenswrapper[4732]: I0131 09:02:06.629399 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:06Z","lastTransitionTime":"2026-01-31T09:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:06 crc kubenswrapper[4732]: I0131 09:02:06.732312 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:06 crc kubenswrapper[4732]: I0131 09:02:06.732362 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:06 crc kubenswrapper[4732]: I0131 09:02:06.732373 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:06 crc kubenswrapper[4732]: I0131 09:02:06.732390 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:06 crc kubenswrapper[4732]: I0131 09:02:06.732407 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:06Z","lastTransitionTime":"2026-01-31T09:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:06 crc kubenswrapper[4732]: I0131 09:02:06.835969 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:06 crc kubenswrapper[4732]: I0131 09:02:06.836026 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:06 crc kubenswrapper[4732]: I0131 09:02:06.836046 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:06 crc kubenswrapper[4732]: I0131 09:02:06.836069 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:06 crc kubenswrapper[4732]: I0131 09:02:06.836081 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:06Z","lastTransitionTime":"2026-01-31T09:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:06 crc kubenswrapper[4732]: I0131 09:02:06.938976 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:06 crc kubenswrapper[4732]: I0131 09:02:06.939020 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:06 crc kubenswrapper[4732]: I0131 09:02:06.939037 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:06 crc kubenswrapper[4732]: I0131 09:02:06.939054 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:06 crc kubenswrapper[4732]: I0131 09:02:06.939069 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:06Z","lastTransitionTime":"2026-01-31T09:02:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:07 crc kubenswrapper[4732]: I0131 09:02:07.042354 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:07 crc kubenswrapper[4732]: I0131 09:02:07.042397 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:07 crc kubenswrapper[4732]: I0131 09:02:07.042407 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:07 crc kubenswrapper[4732]: I0131 09:02:07.042426 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:07 crc kubenswrapper[4732]: I0131 09:02:07.042440 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:07Z","lastTransitionTime":"2026-01-31T09:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:07 crc kubenswrapper[4732]: I0131 09:02:07.145874 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:07 crc kubenswrapper[4732]: I0131 09:02:07.145954 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:07 crc kubenswrapper[4732]: I0131 09:02:07.145973 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:07 crc kubenswrapper[4732]: I0131 09:02:07.146003 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:07 crc kubenswrapper[4732]: I0131 09:02:07.146027 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:07Z","lastTransitionTime":"2026-01-31T09:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:07 crc kubenswrapper[4732]: I0131 09:02:07.249026 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:07 crc kubenswrapper[4732]: I0131 09:02:07.249073 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:07 crc kubenswrapper[4732]: I0131 09:02:07.249084 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:07 crc kubenswrapper[4732]: I0131 09:02:07.249102 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:07 crc kubenswrapper[4732]: I0131 09:02:07.249113 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:07Z","lastTransitionTime":"2026-01-31T09:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:07 crc kubenswrapper[4732]: I0131 09:02:07.352299 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:07 crc kubenswrapper[4732]: I0131 09:02:07.352350 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:07 crc kubenswrapper[4732]: I0131 09:02:07.352358 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:07 crc kubenswrapper[4732]: I0131 09:02:07.352376 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:07 crc kubenswrapper[4732]: I0131 09:02:07.352387 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:07Z","lastTransitionTime":"2026-01-31T09:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:07 crc kubenswrapper[4732]: I0131 09:02:07.456102 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:07 crc kubenswrapper[4732]: I0131 09:02:07.456158 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:07 crc kubenswrapper[4732]: I0131 09:02:07.456174 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:07 crc kubenswrapper[4732]: I0131 09:02:07.456193 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:07 crc kubenswrapper[4732]: I0131 09:02:07.456204 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:07Z","lastTransitionTime":"2026-01-31T09:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:07 crc kubenswrapper[4732]: I0131 09:02:07.523983 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 09:59:07.340286867 +0000 UTC Jan 31 09:02:07 crc kubenswrapper[4732]: I0131 09:02:07.541695 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:02:07 crc kubenswrapper[4732]: I0131 09:02:07.541703 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:02:07 crc kubenswrapper[4732]: I0131 09:02:07.541812 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7fgvm" Jan 31 09:02:07 crc kubenswrapper[4732]: E0131 09:02:07.541843 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:02:07 crc kubenswrapper[4732]: E0131 09:02:07.541895 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:02:07 crc kubenswrapper[4732]: I0131 09:02:07.541725 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:02:07 crc kubenswrapper[4732]: E0131 09:02:07.542089 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7fgvm" podUID="3bd29a31-1a47-40da-afc5-6c4423067083" Jan 31 09:02:07 crc kubenswrapper[4732]: E0131 09:02:07.542264 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:02:07 crc kubenswrapper[4732]: I0131 09:02:07.563448 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:07 crc kubenswrapper[4732]: I0131 09:02:07.563563 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:07 crc kubenswrapper[4732]: I0131 09:02:07.563576 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:07 crc kubenswrapper[4732]: I0131 09:02:07.563596 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:07 crc kubenswrapper[4732]: I0131 09:02:07.563612 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:07Z","lastTransitionTime":"2026-01-31T09:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:07 crc kubenswrapper[4732]: I0131 09:02:07.666072 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:07 crc kubenswrapper[4732]: I0131 09:02:07.666112 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:07 crc kubenswrapper[4732]: I0131 09:02:07.666121 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:07 crc kubenswrapper[4732]: I0131 09:02:07.666136 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:07 crc kubenswrapper[4732]: I0131 09:02:07.666146 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:07Z","lastTransitionTime":"2026-01-31T09:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:07 crc kubenswrapper[4732]: I0131 09:02:07.768981 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:07 crc kubenswrapper[4732]: I0131 09:02:07.769027 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:07 crc kubenswrapper[4732]: I0131 09:02:07.769040 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:07 crc kubenswrapper[4732]: I0131 09:02:07.769060 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:07 crc kubenswrapper[4732]: I0131 09:02:07.769074 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:07Z","lastTransitionTime":"2026-01-31T09:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:07 crc kubenswrapper[4732]: I0131 09:02:07.871352 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:07 crc kubenswrapper[4732]: I0131 09:02:07.871395 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:07 crc kubenswrapper[4732]: I0131 09:02:07.871405 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:07 crc kubenswrapper[4732]: I0131 09:02:07.871422 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:07 crc kubenswrapper[4732]: I0131 09:02:07.871431 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:07Z","lastTransitionTime":"2026-01-31T09:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:07 crc kubenswrapper[4732]: I0131 09:02:07.974317 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:07 crc kubenswrapper[4732]: I0131 09:02:07.974361 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:07 crc kubenswrapper[4732]: I0131 09:02:07.974369 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:07 crc kubenswrapper[4732]: I0131 09:02:07.974385 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:07 crc kubenswrapper[4732]: I0131 09:02:07.974394 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:07Z","lastTransitionTime":"2026-01-31T09:02:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:08 crc kubenswrapper[4732]: I0131 09:02:08.076763 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:08 crc kubenswrapper[4732]: I0131 09:02:08.076802 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:08 crc kubenswrapper[4732]: I0131 09:02:08.076811 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:08 crc kubenswrapper[4732]: I0131 09:02:08.076826 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:08 crc kubenswrapper[4732]: I0131 09:02:08.076836 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:08Z","lastTransitionTime":"2026-01-31T09:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:08 crc kubenswrapper[4732]: I0131 09:02:08.178823 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:08 crc kubenswrapper[4732]: I0131 09:02:08.178855 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:08 crc kubenswrapper[4732]: I0131 09:02:08.178865 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:08 crc kubenswrapper[4732]: I0131 09:02:08.178880 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:08 crc kubenswrapper[4732]: I0131 09:02:08.178890 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:08Z","lastTransitionTime":"2026-01-31T09:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:08 crc kubenswrapper[4732]: I0131 09:02:08.282023 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:08 crc kubenswrapper[4732]: I0131 09:02:08.282050 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:08 crc kubenswrapper[4732]: I0131 09:02:08.282060 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:08 crc kubenswrapper[4732]: I0131 09:02:08.282076 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:08 crc kubenswrapper[4732]: I0131 09:02:08.282084 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:08Z","lastTransitionTime":"2026-01-31T09:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:08 crc kubenswrapper[4732]: I0131 09:02:08.384802 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:08 crc kubenswrapper[4732]: I0131 09:02:08.384838 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:08 crc kubenswrapper[4732]: I0131 09:02:08.384852 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:08 crc kubenswrapper[4732]: I0131 09:02:08.384869 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:08 crc kubenswrapper[4732]: I0131 09:02:08.384880 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:08Z","lastTransitionTime":"2026-01-31T09:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:08 crc kubenswrapper[4732]: I0131 09:02:08.486890 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:08 crc kubenswrapper[4732]: I0131 09:02:08.486926 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:08 crc kubenswrapper[4732]: I0131 09:02:08.486937 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:08 crc kubenswrapper[4732]: I0131 09:02:08.486956 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:08 crc kubenswrapper[4732]: I0131 09:02:08.486968 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:08Z","lastTransitionTime":"2026-01-31T09:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:08 crc kubenswrapper[4732]: I0131 09:02:08.524915 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 04:55:02.645374706 +0000 UTC Jan 31 09:02:08 crc kubenswrapper[4732]: I0131 09:02:08.590081 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:08 crc kubenswrapper[4732]: I0131 09:02:08.590118 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:08 crc kubenswrapper[4732]: I0131 09:02:08.590127 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:08 crc kubenswrapper[4732]: I0131 09:02:08.590144 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:08 crc kubenswrapper[4732]: I0131 09:02:08.590156 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:08Z","lastTransitionTime":"2026-01-31T09:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:08 crc kubenswrapper[4732]: I0131 09:02:08.693155 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:08 crc kubenswrapper[4732]: I0131 09:02:08.693202 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:08 crc kubenswrapper[4732]: I0131 09:02:08.693212 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:08 crc kubenswrapper[4732]: I0131 09:02:08.693229 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:08 crc kubenswrapper[4732]: I0131 09:02:08.693239 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:08Z","lastTransitionTime":"2026-01-31T09:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:08 crc kubenswrapper[4732]: I0131 09:02:08.796054 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:08 crc kubenswrapper[4732]: I0131 09:02:08.796110 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:08 crc kubenswrapper[4732]: I0131 09:02:08.796121 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:08 crc kubenswrapper[4732]: I0131 09:02:08.796141 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:08 crc kubenswrapper[4732]: I0131 09:02:08.796152 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:08Z","lastTransitionTime":"2026-01-31T09:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:08 crc kubenswrapper[4732]: I0131 09:02:08.898038 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:08 crc kubenswrapper[4732]: I0131 09:02:08.898087 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:08 crc kubenswrapper[4732]: I0131 09:02:08.898097 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:08 crc kubenswrapper[4732]: I0131 09:02:08.898114 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:08 crc kubenswrapper[4732]: I0131 09:02:08.898124 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:08Z","lastTransitionTime":"2026-01-31T09:02:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:09 crc kubenswrapper[4732]: I0131 09:02:09.000728 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:09 crc kubenswrapper[4732]: I0131 09:02:09.000777 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:09 crc kubenswrapper[4732]: I0131 09:02:09.000790 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:09 crc kubenswrapper[4732]: I0131 09:02:09.000809 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:09 crc kubenswrapper[4732]: I0131 09:02:09.000821 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:09Z","lastTransitionTime":"2026-01-31T09:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:09 crc kubenswrapper[4732]: I0131 09:02:09.103417 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:09 crc kubenswrapper[4732]: I0131 09:02:09.103478 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:09 crc kubenswrapper[4732]: I0131 09:02:09.103491 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:09 crc kubenswrapper[4732]: I0131 09:02:09.103513 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:09 crc kubenswrapper[4732]: I0131 09:02:09.103529 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:09Z","lastTransitionTime":"2026-01-31T09:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:09 crc kubenswrapper[4732]: I0131 09:02:09.205985 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:09 crc kubenswrapper[4732]: I0131 09:02:09.206037 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:09 crc kubenswrapper[4732]: I0131 09:02:09.206055 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:09 crc kubenswrapper[4732]: I0131 09:02:09.206076 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:09 crc kubenswrapper[4732]: I0131 09:02:09.206087 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:09Z","lastTransitionTime":"2026-01-31T09:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:09 crc kubenswrapper[4732]: I0131 09:02:09.307813 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:09 crc kubenswrapper[4732]: I0131 09:02:09.307875 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:09 crc kubenswrapper[4732]: I0131 09:02:09.307895 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:09 crc kubenswrapper[4732]: I0131 09:02:09.307921 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:09 crc kubenswrapper[4732]: I0131 09:02:09.307938 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:09Z","lastTransitionTime":"2026-01-31T09:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:09 crc kubenswrapper[4732]: I0131 09:02:09.410734 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:09 crc kubenswrapper[4732]: I0131 09:02:09.410778 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:09 crc kubenswrapper[4732]: I0131 09:02:09.410789 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:09 crc kubenswrapper[4732]: I0131 09:02:09.410806 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:09 crc kubenswrapper[4732]: I0131 09:02:09.410821 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:09Z","lastTransitionTime":"2026-01-31T09:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:09 crc kubenswrapper[4732]: I0131 09:02:09.513578 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:09 crc kubenswrapper[4732]: I0131 09:02:09.513619 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:09 crc kubenswrapper[4732]: I0131 09:02:09.513629 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:09 crc kubenswrapper[4732]: I0131 09:02:09.513647 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:09 crc kubenswrapper[4732]: I0131 09:02:09.513656 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:09Z","lastTransitionTime":"2026-01-31T09:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:09 crc kubenswrapper[4732]: I0131 09:02:09.525022 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-08 22:25:12.216525701 +0000 UTC Jan 31 09:02:09 crc kubenswrapper[4732]: I0131 09:02:09.542680 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7fgvm" Jan 31 09:02:09 crc kubenswrapper[4732]: I0131 09:02:09.542716 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:02:09 crc kubenswrapper[4732]: I0131 09:02:09.542683 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:02:09 crc kubenswrapper[4732]: I0131 09:02:09.542683 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:02:09 crc kubenswrapper[4732]: E0131 09:02:09.542829 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7fgvm" podUID="3bd29a31-1a47-40da-afc5-6c4423067083" Jan 31 09:02:09 crc kubenswrapper[4732]: E0131 09:02:09.542931 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:02:09 crc kubenswrapper[4732]: E0131 09:02:09.543027 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:02:09 crc kubenswrapper[4732]: E0131 09:02:09.543105 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:02:09 crc kubenswrapper[4732]: I0131 09:02:09.616068 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:09 crc kubenswrapper[4732]: I0131 09:02:09.616110 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:09 crc kubenswrapper[4732]: I0131 09:02:09.616119 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:09 crc kubenswrapper[4732]: I0131 09:02:09.616135 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:09 crc kubenswrapper[4732]: I0131 09:02:09.616152 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:09Z","lastTransitionTime":"2026-01-31T09:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:09 crc kubenswrapper[4732]: I0131 09:02:09.718966 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:09 crc kubenswrapper[4732]: I0131 09:02:09.719008 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:09 crc kubenswrapper[4732]: I0131 09:02:09.719019 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:09 crc kubenswrapper[4732]: I0131 09:02:09.719035 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:09 crc kubenswrapper[4732]: I0131 09:02:09.719044 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:09Z","lastTransitionTime":"2026-01-31T09:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:09 crc kubenswrapper[4732]: I0131 09:02:09.821615 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:09 crc kubenswrapper[4732]: I0131 09:02:09.821676 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:09 crc kubenswrapper[4732]: I0131 09:02:09.821687 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:09 crc kubenswrapper[4732]: I0131 09:02:09.821706 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:09 crc kubenswrapper[4732]: I0131 09:02:09.821721 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:09Z","lastTransitionTime":"2026-01-31T09:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:09 crc kubenswrapper[4732]: I0131 09:02:09.924180 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:09 crc kubenswrapper[4732]: I0131 09:02:09.924238 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:09 crc kubenswrapper[4732]: I0131 09:02:09.924250 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:09 crc kubenswrapper[4732]: I0131 09:02:09.924268 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:09 crc kubenswrapper[4732]: I0131 09:02:09.924277 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:09Z","lastTransitionTime":"2026-01-31T09:02:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:10 crc kubenswrapper[4732]: I0131 09:02:10.029968 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:10 crc kubenswrapper[4732]: I0131 09:02:10.030019 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:10 crc kubenswrapper[4732]: I0131 09:02:10.030030 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:10 crc kubenswrapper[4732]: I0131 09:02:10.030048 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:10 crc kubenswrapper[4732]: I0131 09:02:10.030061 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:10Z","lastTransitionTime":"2026-01-31T09:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:10 crc kubenswrapper[4732]: I0131 09:02:10.134055 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:10 crc kubenswrapper[4732]: I0131 09:02:10.134134 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:10 crc kubenswrapper[4732]: I0131 09:02:10.134151 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:10 crc kubenswrapper[4732]: I0131 09:02:10.134172 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:10 crc kubenswrapper[4732]: I0131 09:02:10.134187 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:10Z","lastTransitionTime":"2026-01-31T09:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:10 crc kubenswrapper[4732]: I0131 09:02:10.236836 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:10 crc kubenswrapper[4732]: I0131 09:02:10.236898 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:10 crc kubenswrapper[4732]: I0131 09:02:10.236920 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:10 crc kubenswrapper[4732]: I0131 09:02:10.236950 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:10 crc kubenswrapper[4732]: I0131 09:02:10.236972 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:10Z","lastTransitionTime":"2026-01-31T09:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:10 crc kubenswrapper[4732]: I0131 09:02:10.340446 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:10 crc kubenswrapper[4732]: I0131 09:02:10.340506 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:10 crc kubenswrapper[4732]: I0131 09:02:10.340520 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:10 crc kubenswrapper[4732]: I0131 09:02:10.340540 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:10 crc kubenswrapper[4732]: I0131 09:02:10.340555 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:10Z","lastTransitionTime":"2026-01-31T09:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:10 crc kubenswrapper[4732]: I0131 09:02:10.443904 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:10 crc kubenswrapper[4732]: I0131 09:02:10.443965 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:10 crc kubenswrapper[4732]: I0131 09:02:10.443979 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:10 crc kubenswrapper[4732]: I0131 09:02:10.444001 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:10 crc kubenswrapper[4732]: I0131 09:02:10.444016 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:10Z","lastTransitionTime":"2026-01-31T09:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:10 crc kubenswrapper[4732]: I0131 09:02:10.525288 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 22:02:50.110647002 +0000 UTC Jan 31 09:02:10 crc kubenswrapper[4732]: I0131 09:02:10.546676 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:10 crc kubenswrapper[4732]: I0131 09:02:10.546750 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:10 crc kubenswrapper[4732]: I0131 09:02:10.546764 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:10 crc kubenswrapper[4732]: I0131 09:02:10.546783 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:10 crc kubenswrapper[4732]: I0131 09:02:10.546799 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:10Z","lastTransitionTime":"2026-01-31T09:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:10 crc kubenswrapper[4732]: I0131 09:02:10.554376 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Jan 31 09:02:10 crc kubenswrapper[4732]: I0131 09:02:10.649097 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:10 crc kubenswrapper[4732]: I0131 09:02:10.649143 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:10 crc kubenswrapper[4732]: I0131 09:02:10.649155 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:10 crc kubenswrapper[4732]: I0131 09:02:10.649180 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:10 crc kubenswrapper[4732]: I0131 09:02:10.649205 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:10Z","lastTransitionTime":"2026-01-31T09:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:10 crc kubenswrapper[4732]: I0131 09:02:10.751409 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:10 crc kubenswrapper[4732]: I0131 09:02:10.751446 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:10 crc kubenswrapper[4732]: I0131 09:02:10.751455 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:10 crc kubenswrapper[4732]: I0131 09:02:10.751471 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:10 crc kubenswrapper[4732]: I0131 09:02:10.751482 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:10Z","lastTransitionTime":"2026-01-31T09:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:10 crc kubenswrapper[4732]: I0131 09:02:10.854372 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:10 crc kubenswrapper[4732]: I0131 09:02:10.854435 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:10 crc kubenswrapper[4732]: I0131 09:02:10.854454 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:10 crc kubenswrapper[4732]: I0131 09:02:10.854472 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:10 crc kubenswrapper[4732]: I0131 09:02:10.854483 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:10Z","lastTransitionTime":"2026-01-31T09:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:10 crc kubenswrapper[4732]: I0131 09:02:10.957279 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:10 crc kubenswrapper[4732]: I0131 09:02:10.957318 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:10 crc kubenswrapper[4732]: I0131 09:02:10.957329 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:10 crc kubenswrapper[4732]: I0131 09:02:10.957346 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:10 crc kubenswrapper[4732]: I0131 09:02:10.957358 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:10Z","lastTransitionTime":"2026-01-31T09:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:11 crc kubenswrapper[4732]: I0131 09:02:11.060128 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:11 crc kubenswrapper[4732]: I0131 09:02:11.060155 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:11 crc kubenswrapper[4732]: I0131 09:02:11.060164 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:11 crc kubenswrapper[4732]: I0131 09:02:11.060178 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:11 crc kubenswrapper[4732]: I0131 09:02:11.060187 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:11Z","lastTransitionTime":"2026-01-31T09:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:11 crc kubenswrapper[4732]: I0131 09:02:11.162364 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:11 crc kubenswrapper[4732]: I0131 09:02:11.162407 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:11 crc kubenswrapper[4732]: I0131 09:02:11.162421 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:11 crc kubenswrapper[4732]: I0131 09:02:11.162438 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:11 crc kubenswrapper[4732]: I0131 09:02:11.162450 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:11Z","lastTransitionTime":"2026-01-31T09:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:11 crc kubenswrapper[4732]: I0131 09:02:11.264541 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:11 crc kubenswrapper[4732]: I0131 09:02:11.264596 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:11 crc kubenswrapper[4732]: I0131 09:02:11.264609 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:11 crc kubenswrapper[4732]: I0131 09:02:11.264628 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:11 crc kubenswrapper[4732]: I0131 09:02:11.264640 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:11Z","lastTransitionTime":"2026-01-31T09:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:11 crc kubenswrapper[4732]: I0131 09:02:11.292258 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:11 crc kubenswrapper[4732]: I0131 09:02:11.292529 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:11 crc kubenswrapper[4732]: I0131 09:02:11.292545 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:11 crc kubenswrapper[4732]: I0131 09:02:11.292565 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:11 crc kubenswrapper[4732]: I0131 09:02:11.292578 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:11Z","lastTransitionTime":"2026-01-31T09:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:11 crc kubenswrapper[4732]: E0131 09:02:11.304850 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:02:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:02:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:02:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:02:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:02:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:02:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:02:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:02:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2761c117-4c0c-4c53-891e-fc7b8fbd4017\\\",\\\"systemUUID\\\":\\\"5f9a5b0b-6336-4588-8df8-98fcbdc2a984\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:11Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:11 crc kubenswrapper[4732]: I0131 09:02:11.309419 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:11 crc kubenswrapper[4732]: I0131 09:02:11.309584 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 31 09:02:11 crc kubenswrapper[4732]: I0131 09:02:11.309672 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:11 crc kubenswrapper[4732]: I0131 09:02:11.309761 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:11 crc kubenswrapper[4732]: I0131 09:02:11.309860 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:11Z","lastTransitionTime":"2026-01-31T09:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:11 crc kubenswrapper[4732]: E0131 09:02:11.323099 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:02:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:02:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:02:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:02:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:02:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:02:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:02:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:02:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2761c117-4c0c-4c53-891e-fc7b8fbd4017\\\",\\\"systemUUID\\\":\\\"5f9a5b0b-6336-4588-8df8-98fcbdc2a984\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:11Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:11 crc kubenswrapper[4732]: I0131 09:02:11.327178 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:11 crc kubenswrapper[4732]: I0131 09:02:11.327220 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 31 09:02:11 crc kubenswrapper[4732]: I0131 09:02:11.327230 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:11 crc kubenswrapper[4732]: I0131 09:02:11.327248 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:11 crc kubenswrapper[4732]: I0131 09:02:11.327261 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:11Z","lastTransitionTime":"2026-01-31T09:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:11 crc kubenswrapper[4732]: E0131 09:02:11.342120 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:02:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:02:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:02:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:02:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:02:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:02:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:02:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:02:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2761c117-4c0c-4c53-891e-fc7b8fbd4017\\\",\\\"systemUUID\\\":\\\"5f9a5b0b-6336-4588-8df8-98fcbdc2a984\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:11Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:11 crc kubenswrapper[4732]: I0131 09:02:11.346523 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:11 crc kubenswrapper[4732]: I0131 09:02:11.346553 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 31 09:02:11 crc kubenswrapper[4732]: I0131 09:02:11.346561 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:11 crc kubenswrapper[4732]: I0131 09:02:11.346578 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:11 crc kubenswrapper[4732]: I0131 09:02:11.346587 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:11Z","lastTransitionTime":"2026-01-31T09:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:11 crc kubenswrapper[4732]: E0131 09:02:11.358410 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:02:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:02:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:02:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:02:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:02:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:02:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:02:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:02:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2761c117-4c0c-4c53-891e-fc7b8fbd4017\\\",\\\"systemUUID\\\":\\\"5f9a5b0b-6336-4588-8df8-98fcbdc2a984\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:11Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:11 crc kubenswrapper[4732]: I0131 09:02:11.361590 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:11 crc kubenswrapper[4732]: I0131 09:02:11.361630 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 31 09:02:11 crc kubenswrapper[4732]: I0131 09:02:11.361641 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:11 crc kubenswrapper[4732]: I0131 09:02:11.361675 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:11 crc kubenswrapper[4732]: I0131 09:02:11.361688 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:11Z","lastTransitionTime":"2026-01-31T09:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:11 crc kubenswrapper[4732]: E0131 09:02:11.371844 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:02:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:02:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:02:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:02:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:02:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:02:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:02:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:02:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2761c117-4c0c-4c53-891e-fc7b8fbd4017\\\",\\\"systemUUID\\\":\\\"5f9a5b0b-6336-4588-8df8-98fcbdc2a984\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:11Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:11 crc kubenswrapper[4732]: E0131 09:02:11.371953 4732 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 31 09:02:11 crc kubenswrapper[4732]: I0131 09:02:11.373840 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Jan 31 09:02:11 crc kubenswrapper[4732]: I0131 09:02:11.373860 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:11 crc kubenswrapper[4732]: I0131 09:02:11.373871 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:11 crc kubenswrapper[4732]: I0131 09:02:11.373884 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:11 crc kubenswrapper[4732]: I0131 09:02:11.373894 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:11Z","lastTransitionTime":"2026-01-31T09:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:11 crc kubenswrapper[4732]: I0131 09:02:11.476910 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:11 crc kubenswrapper[4732]: I0131 09:02:11.476955 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:11 crc kubenswrapper[4732]: I0131 09:02:11.476968 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:11 crc kubenswrapper[4732]: I0131 09:02:11.476985 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:11 crc kubenswrapper[4732]: I0131 09:02:11.477001 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:11Z","lastTransitionTime":"2026-01-31T09:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:11 crc kubenswrapper[4732]: I0131 09:02:11.525694 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 15:50:43.966548895 +0000 UTC Jan 31 09:02:11 crc kubenswrapper[4732]: I0131 09:02:11.541985 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:02:11 crc kubenswrapper[4732]: I0131 09:02:11.542087 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:02:11 crc kubenswrapper[4732]: I0131 09:02:11.542185 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7fgvm" Jan 31 09:02:11 crc kubenswrapper[4732]: E0131 09:02:11.542208 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:02:11 crc kubenswrapper[4732]: E0131 09:02:11.542263 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7fgvm" podUID="3bd29a31-1a47-40da-afc5-6c4423067083" Jan 31 09:02:11 crc kubenswrapper[4732]: E0131 09:02:11.542320 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:02:11 crc kubenswrapper[4732]: I0131 09:02:11.541985 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:02:11 crc kubenswrapper[4732]: E0131 09:02:11.542810 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:02:11 crc kubenswrapper[4732]: I0131 09:02:11.579480 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:11 crc kubenswrapper[4732]: I0131 09:02:11.579555 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:11 crc kubenswrapper[4732]: I0131 09:02:11.579569 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:11 crc kubenswrapper[4732]: I0131 09:02:11.579609 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:11 crc kubenswrapper[4732]: I0131 09:02:11.579622 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:11Z","lastTransitionTime":"2026-01-31T09:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:11 crc kubenswrapper[4732]: I0131 09:02:11.683641 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:11 crc kubenswrapper[4732]: I0131 09:02:11.684245 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:11 crc kubenswrapper[4732]: I0131 09:02:11.684319 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:11 crc kubenswrapper[4732]: I0131 09:02:11.684511 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:11 crc kubenswrapper[4732]: I0131 09:02:11.684581 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:11Z","lastTransitionTime":"2026-01-31T09:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:11 crc kubenswrapper[4732]: I0131 09:02:11.787234 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:11 crc kubenswrapper[4732]: I0131 09:02:11.787269 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:11 crc kubenswrapper[4732]: I0131 09:02:11.787279 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:11 crc kubenswrapper[4732]: I0131 09:02:11.787296 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:11 crc kubenswrapper[4732]: I0131 09:02:11.787305 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:11Z","lastTransitionTime":"2026-01-31T09:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:11 crc kubenswrapper[4732]: I0131 09:02:11.890392 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:11 crc kubenswrapper[4732]: I0131 09:02:11.890466 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:11 crc kubenswrapper[4732]: I0131 09:02:11.890484 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:11 crc kubenswrapper[4732]: I0131 09:02:11.890510 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:11 crc kubenswrapper[4732]: I0131 09:02:11.890531 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:11Z","lastTransitionTime":"2026-01-31T09:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:11 crc kubenswrapper[4732]: I0131 09:02:11.993399 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:11 crc kubenswrapper[4732]: I0131 09:02:11.993455 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:11 crc kubenswrapper[4732]: I0131 09:02:11.993469 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:11 crc kubenswrapper[4732]: I0131 09:02:11.993490 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:11 crc kubenswrapper[4732]: I0131 09:02:11.993504 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:11Z","lastTransitionTime":"2026-01-31T09:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:12 crc kubenswrapper[4732]: I0131 09:02:12.096374 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:12 crc kubenswrapper[4732]: I0131 09:02:12.096434 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:12 crc kubenswrapper[4732]: I0131 09:02:12.096446 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:12 crc kubenswrapper[4732]: I0131 09:02:12.096467 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:12 crc kubenswrapper[4732]: I0131 09:02:12.096480 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:12Z","lastTransitionTime":"2026-01-31T09:02:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:12 crc kubenswrapper[4732]: I0131 09:02:12.198973 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:12 crc kubenswrapper[4732]: I0131 09:02:12.199025 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:12 crc kubenswrapper[4732]: I0131 09:02:12.199040 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:12 crc kubenswrapper[4732]: I0131 09:02:12.199059 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:12 crc kubenswrapper[4732]: I0131 09:02:12.199072 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:12Z","lastTransitionTime":"2026-01-31T09:02:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:12 crc kubenswrapper[4732]: I0131 09:02:12.301443 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:12 crc kubenswrapper[4732]: I0131 09:02:12.301484 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:12 crc kubenswrapper[4732]: I0131 09:02:12.301493 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:12 crc kubenswrapper[4732]: I0131 09:02:12.301510 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:12 crc kubenswrapper[4732]: I0131 09:02:12.301521 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:12Z","lastTransitionTime":"2026-01-31T09:02:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:12 crc kubenswrapper[4732]: I0131 09:02:12.403796 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:12 crc kubenswrapper[4732]: I0131 09:02:12.403850 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:12 crc kubenswrapper[4732]: I0131 09:02:12.403859 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:12 crc kubenswrapper[4732]: I0131 09:02:12.403877 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:12 crc kubenswrapper[4732]: I0131 09:02:12.403888 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:12Z","lastTransitionTime":"2026-01-31T09:02:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:12 crc kubenswrapper[4732]: I0131 09:02:12.506079 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:12 crc kubenswrapper[4732]: I0131 09:02:12.506115 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:12 crc kubenswrapper[4732]: I0131 09:02:12.506124 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:12 crc kubenswrapper[4732]: I0131 09:02:12.506139 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:12 crc kubenswrapper[4732]: I0131 09:02:12.506148 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:12Z","lastTransitionTime":"2026-01-31T09:02:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:12 crc kubenswrapper[4732]: I0131 09:02:12.526784 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-08 09:24:30.225627936 +0000 UTC Jan 31 09:02:12 crc kubenswrapper[4732]: I0131 09:02:12.543454 4732 scope.go:117] "RemoveContainer" containerID="da2d4b479ec06e475664a1f5f1c4c052f87ad374f5616ea477c7a147ca896a16" Jan 31 09:02:12 crc kubenswrapper[4732]: E0131 09:02:12.543842 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-8mtkt_openshift-ovn-kubernetes(82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" podUID="82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8" Jan 31 09:02:12 crc kubenswrapper[4732]: I0131 09:02:12.553972 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7fgvm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bd29a31-1a47-40da-afc5-6c4423067083\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47jtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47jtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7fgvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:12Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:12 crc kubenswrapper[4732]: I0131 09:02:12.568825 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1dc686f-fa2d-4cd4-95d4-874df2dc3a1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b5cbd45e357205f8bfc648396c0f684d6788343f687504feb45618d9082e5cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b8ba07937855cb1fd0e1db567f90e08848fca583b7174153918e2d6356d722\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ce42a35726dd1fc6aea0eb2e5baa490de04566cd6d6298185260e896a8def7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c276b6fdf2c47bfa38c4cf312bf6b893388c4291d306738fd6c31738cbd7483b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c05af9bd69451273c25ddddf6b11cdd7b8f9b6e3e16d18f1c4a962321905696e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0131 09:01:18.226954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:01:18.227544 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1858903392/tls.crt::/tmp/serving-cert-1858903392/tls.key\\\\\\\"\\\\nI0131 09:01:23.404020 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:01:23.413560 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:01:23.413612 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:01:23.413655 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:01:23.413760 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:01:23.423398 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 09:01:23.423434 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:01:23.423441 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:01:23.423448 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:01:23.423452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0131 09:01:23.423448 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 09:01:23.423457 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:01:23.423476 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 09:01:23.426339 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83c2048db4772c1a5f60b21340f773ab5c651045d4dac9edc15e91a19d15dce9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:12Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:12 crc kubenswrapper[4732]: I0131 09:02:12.582799 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:12Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:12 crc kubenswrapper[4732]: I0131 09:02:12.595439 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4mxsr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e23192f-14db-41ef-af89-4a76e325d9c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6af50b44afec0e5f0757efda1ff5567845c4482609a82bafcbf74947c657c56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsnx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4mxsr\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:12Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:12 crc kubenswrapper[4732]: I0131 09:02:12.609195 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:12 crc kubenswrapper[4732]: I0131 09:02:12.609238 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:12 crc kubenswrapper[4732]: I0131 09:02:12.609252 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:12 crc kubenswrapper[4732]: I0131 09:02:12.609270 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:12 crc kubenswrapper[4732]: I0131 09:02:12.609284 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:12Z","lastTransitionTime":"2026-01-31T09:02:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:12 crc kubenswrapper[4732]: I0131 09:02:12.614930 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t9kqf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2e6e0f4-2302-447f-a5e0-7db3d7b73cb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://038ff30d8a53b94c204f5dc0c72824fcb7f08a423eec2ce05554f23607d4dc7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerSta
tuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c71dc89758bff193a85ceeaa97b21ba6c93389ef9cf380bd0bac89bddddf014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c71dc89758bff193a85ceeaa97b21ba6c93389ef9cf380bd0bac89bddddf014\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4374e8014d4e67ab8df14785ed6d1bd78b33567550758297fcc04796ea5c688\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4374e8014d4e67ab8df14785ed6d1bd78b33567550758297fcc04796ea5c688\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85484d53f8180c3bc83391004e8a7c6b432af09f04f597567e8ffe0b9c2e3d58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85484d53f8180c3bc83391004e8a7c6b432af09f04f597567e8ffe0b9c2e3d58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:
01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://111b50c40cf72f105ae5c74d668673853a783050a62e89127fa666a6f8d431d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://111b50c40cf72f105ae5c74d668673853a783050a62e89127fa666a6f8d431d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2458247a5e615b929a15e280bb9d5a65c46a991ea2cd83502a8c42b1ca0e299b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2458247a5e615b929a15e280bb9d5a65c46a991ea2cd83502a8c42b1ca0e299b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a6ff3cd5564994b28608dad1cd0848c771db950208888ac8a2213fa5dc3a168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a6ff3cd5564994b28608dad1cd0848c771db950208888ac8a2213fa5dc3a168\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t9kqf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:12Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:12 crc kubenswrapper[4732]: I0131 09:02:12.628500 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c063f211d6d06c3f8b94d316d68da346d077c6e85197754ec7c77ad736fd1217\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:12Z is after 
2025-08-24T17:21:41Z" Jan 31 09:02:12 crc kubenswrapper[4732]: I0131 09:02:12.640547 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d790207-d357-4b47-87bf-5b505e061820\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eaa2b856e1dedd7398d0c513f742fa5925c4748bf44eb15022adbb4b62354c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2bw2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad1f906b34fddd34efd9d099479cd1fe7404da4ab11f3f571f9e3120167505a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2bw2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jnbt8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:12Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:12 crc kubenswrapper[4732]: I0131 09:02:12.652781 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62650e77e5971d617a77ae7461d51276fc76e3699dc4f86ebbb9b63ab0d86b60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:12Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:12 crc kubenswrapper[4732]: I0131 09:02:12.669279 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7423f7d2a82ae276cff0f5a2730b0de7abbc5b28a8bd8983d23fe44c2e9ca3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b075971e7f28f40b90caae8b3681dd283860371a6b3f82c92c421a87a27ccb0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:12Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:12 crc kubenswrapper[4732]: I0131 09:02:12.684972 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gchqk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0313609d-3507-4db5-a190-9dbf59d73e6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b303be38f64266f4e019d30e8b988945133fc76a47e305429ed048d68cdeac76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-phqth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8a2d9e1c8db61a2418e981340a7cb999983d4a10e79977507abdf3bdb471939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-phqth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gchqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:12Z is after 2025-08-24T17:21:41Z" Jan 31 
09:02:12 crc kubenswrapper[4732]: I0131 09:02:12.696787 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1add5e75-7e0e-4248-9431-256dc477beeb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7bd726ccae341dfff351d4170276b74d4b6b4f34806b7e1fb23ab517201928d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd0e75b86e540472dde92c0b2c29d3af05e052a4e6a01fd03cd16cce663bd5da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6678cf179a4e13991aa6a082456b3497a4665e0999912b8002fb84a0b6ca2ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\
",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55d09d44253b8c77894abc55d80843b79509bf7213c0bf8b4d69059d53e76e5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:12Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:12 crc kubenswrapper[4732]: I0131 09:02:12.708846 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ec8fd54-25d6-41f1-9a8b-0e2823c951b7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87eb9563b2ca7420cbb70ccd32aa77f2a82c769d57eceacd99a4d94cb1c3a0d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://328c8ed55e178646a8bd3d914985f8171b6413b7b007f3bca609cfb432a227f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://328c8ed55e178646a8bd3d914985f8171b6413b7b007f3bca609cfb432a227f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:12Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:12 crc kubenswrapper[4732]: I0131 09:02:12.712253 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:12 crc kubenswrapper[4732]: I0131 09:02:12.712306 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 31 09:02:12 crc kubenswrapper[4732]: I0131 09:02:12.712317 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:12 crc kubenswrapper[4732]: I0131 09:02:12.712338 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:12 crc kubenswrapper[4732]: I0131 09:02:12.712350 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:12Z","lastTransitionTime":"2026-01-31T09:02:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:12 crc kubenswrapper[4732]: I0131 09:02:12.722398 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bb621da-40b8-4a07-a7bd-06800007bc59\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dadc6987ef6ef2593fa8aa0ec3903dc3cfea907f73fb68dca2c141362577bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e897d26ac7b103c21a9cb176f1a90a11098a9a2f6a4cd28f697a90ee7c9f6f9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"nam
e\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a31769db7f40442ff410b053055e413b11d7dba7d48dfce853ca38e7b9f7595e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://407516efcd2436b964fddea3bdc778826cde289422139bc4577c9ba8c0c43675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://407516efcd2436b964fddea3bdc778826cde289422139bc4577c9ba8c0c43675\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:12Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:12 crc kubenswrapper[4732]: I0131 09:02:12.742133 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58b600ed-9e96-4296-bc41-dda5d2205921\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f14a98662959fe8132486d2f648fb443a407e90160ee8c6eac7e2a3d6835b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://771af18934041722ccdbde47137c83e9d93c5e7a07b1e9b26e18354bcd2fda4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2bea5befd94ee4d3b6501273e8b8365102a35cc07b97740fc893f8d70a9d134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9fd5264b5a2dcec6f8b363208d5654f2d71996
60f04be408ee7ab3913d542e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://194a43d5e1409abb031f1b950dfc753454b45008067d784642945ea0dfb087e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://382f1fa043705a0098e27ea34beeee0b88276aa355a7de07bc29b070949b81e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382f1fa043705a0098e27ea34beeee0b88276aa355a7de07bc29b070949b81e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b16d2732d6d6381045b88446a322622a7b1271f08438cb60e57c05c52ef7faf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b16d2732d6d6381045b88446a322622a7b1271f08438cb60e57c05c52ef7faf8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://273f5652d37e2be568bc388a7f9e96c66d0ce9201678b46736986815bd25d6bf\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://273f5652d37e2be568bc388a7f9e96c66d0ce9201678b46736986815bd25d6bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:12Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:12 crc kubenswrapper[4732]: I0131 09:02:12.759835 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8521ee1e84f540faa7b093ecd0bb7761ae38487a6c7abf52028fe820ea9a4c1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fde05e46badc11b2174b88e2590402436bb669bcbd017ceb4d11d25066b421b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b26417bd7817c984a59723e2231e3a43c554022aef67eacf300adee0166c3db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c79f404e5bc3d6dc83681d68605b1f26565813ccec34c6d36ee9f21686ca4f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://075418297d972e23003b3ce2016f7709d138316c9d08837dc484dadf4d7f4a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://792e702a9fa66ea68366e8f88d976f7e280223815c4b70258566c32faaed57a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da2d4b479ec06e475664a1f5f1c4c052f87ad374
f5616ea477c7a147ca896a16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da2d4b479ec06e475664a1f5f1c4c052f87ad374f5616ea477c7a147ca896a16\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:01:56Z\\\",\\\"message\\\":\\\"31T09:01:56Z is after 2025-08-24T17:21:41Z]\\\\nI0131 09:01:56.451406 6413 services_controller.go:451] Built service openshift-service-ca-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-service-ca-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-service-ca-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.40\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0131 09:01:56.451418 6413 services_controller.go:452] Built service openshift-operator-lifecycle-manager/package-server-manager-metrics per-node LB for network=default: []services.LB{}\\\\nI0131 09:01:56.451426 6413 services_controller.go:452] Built service openshift-service-ca-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI0131 09:01:56.451433 6413 services_controller.go:453] Built service openshift-service-ca-operator/metrics template LB for network=default: []services.LB{}\\\\nI0131 09:01:56.451434 6413 ser\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-8mtkt_openshift-ovn-kubernetes(82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cd519944f0ae0ee9920a6a54de3dd40c73a65d0c907f625c57d7648a6d0a715\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb08df78497d5ae0eb39af9a06378ea80914d9b4d7014dd85d930f54c733193\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfb08df78497d5ae0eb39af9a06378ea80914d9b4d7014dd85d930f54c733193\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8mtkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:12Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:12 crc kubenswrapper[4732]: I0131 09:02:12.773136 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:12Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:12 crc kubenswrapper[4732]: I0131 09:02:12.790125 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:12Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:12 crc kubenswrapper[4732]: I0131 09:02:12.800140 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nsgpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"533741c8-f72a-4834-ad02-d33fc939e529\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec4f6f1a4ba8fc3975dc3ef7e86d5b17b0488de7918fe672040f115f640b8569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjff8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nsgpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-01-31T09:02:12Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:12 crc kubenswrapper[4732]: I0131 09:02:12.810814 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bllbs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"80d87332-eaea-4007-a03e-a9a0f744563a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a371d8cbedbc18b4953520c3b135e102db92fc57883d31a8f78c510498ca41b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcdpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bllbs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:12Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:12 crc kubenswrapper[4732]: I0131 09:02:12.814454 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:12 crc kubenswrapper[4732]: I0131 09:02:12.814490 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:12 crc kubenswrapper[4732]: I0131 09:02:12.814499 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:12 crc kubenswrapper[4732]: I0131 09:02:12.814515 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:12 crc kubenswrapper[4732]: I0131 09:02:12.814526 4732 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:12Z","lastTransitionTime":"2026-01-31T09:02:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:12 crc kubenswrapper[4732]: I0131 09:02:12.916576 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:12 crc kubenswrapper[4732]: I0131 09:02:12.916617 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:12 crc kubenswrapper[4732]: I0131 09:02:12.916626 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:12 crc kubenswrapper[4732]: I0131 09:02:12.916645 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:12 crc kubenswrapper[4732]: I0131 09:02:12.916656 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:12Z","lastTransitionTime":"2026-01-31T09:02:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:13 crc kubenswrapper[4732]: I0131 09:02:13.018905 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:13 crc kubenswrapper[4732]: I0131 09:02:13.018949 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:13 crc kubenswrapper[4732]: I0131 09:02:13.018990 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:13 crc kubenswrapper[4732]: I0131 09:02:13.019014 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:13 crc kubenswrapper[4732]: I0131 09:02:13.019026 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:13Z","lastTransitionTime":"2026-01-31T09:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:13 crc kubenswrapper[4732]: I0131 09:02:13.122640 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:13 crc kubenswrapper[4732]: I0131 09:02:13.122723 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:13 crc kubenswrapper[4732]: I0131 09:02:13.122737 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:13 crc kubenswrapper[4732]: I0131 09:02:13.122788 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:13 crc kubenswrapper[4732]: I0131 09:02:13.122802 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:13Z","lastTransitionTime":"2026-01-31T09:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:13 crc kubenswrapper[4732]: I0131 09:02:13.225816 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:13 crc kubenswrapper[4732]: I0131 09:02:13.225856 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:13 crc kubenswrapper[4732]: I0131 09:02:13.225866 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:13 crc kubenswrapper[4732]: I0131 09:02:13.225880 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:13 crc kubenswrapper[4732]: I0131 09:02:13.225891 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:13Z","lastTransitionTime":"2026-01-31T09:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:13 crc kubenswrapper[4732]: I0131 09:02:13.328321 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:13 crc kubenswrapper[4732]: I0131 09:02:13.328387 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:13 crc kubenswrapper[4732]: I0131 09:02:13.328400 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:13 crc kubenswrapper[4732]: I0131 09:02:13.328421 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:13 crc kubenswrapper[4732]: I0131 09:02:13.328433 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:13Z","lastTransitionTime":"2026-01-31T09:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:13 crc kubenswrapper[4732]: I0131 09:02:13.430786 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:13 crc kubenswrapper[4732]: I0131 09:02:13.430839 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:13 crc kubenswrapper[4732]: I0131 09:02:13.430851 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:13 crc kubenswrapper[4732]: I0131 09:02:13.430869 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:13 crc kubenswrapper[4732]: I0131 09:02:13.430884 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:13Z","lastTransitionTime":"2026-01-31T09:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:13 crc kubenswrapper[4732]: I0131 09:02:13.527232 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 06:09:45.392045596 +0000 UTC Jan 31 09:02:13 crc kubenswrapper[4732]: I0131 09:02:13.533098 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:13 crc kubenswrapper[4732]: I0131 09:02:13.533172 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:13 crc kubenswrapper[4732]: I0131 09:02:13.533208 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:13 crc kubenswrapper[4732]: I0131 09:02:13.533225 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:13 crc kubenswrapper[4732]: I0131 09:02:13.533237 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:13Z","lastTransitionTime":"2026-01-31T09:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:13 crc kubenswrapper[4732]: I0131 09:02:13.542650 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:02:13 crc kubenswrapper[4732]: I0131 09:02:13.542706 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7fgvm" Jan 31 09:02:13 crc kubenswrapper[4732]: I0131 09:02:13.542763 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:02:13 crc kubenswrapper[4732]: E0131 09:02:13.542794 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:02:13 crc kubenswrapper[4732]: I0131 09:02:13.542706 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:02:13 crc kubenswrapper[4732]: E0131 09:02:13.542876 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7fgvm" podUID="3bd29a31-1a47-40da-afc5-6c4423067083" Jan 31 09:02:13 crc kubenswrapper[4732]: E0131 09:02:13.542932 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:02:13 crc kubenswrapper[4732]: E0131 09:02:13.543014 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:02:13 crc kubenswrapper[4732]: I0131 09:02:13.635423 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:13 crc kubenswrapper[4732]: I0131 09:02:13.635475 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:13 crc kubenswrapper[4732]: I0131 09:02:13.635486 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:13 crc kubenswrapper[4732]: I0131 09:02:13.635504 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:13 crc kubenswrapper[4732]: I0131 09:02:13.635516 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:13Z","lastTransitionTime":"2026-01-31T09:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:13 crc kubenswrapper[4732]: I0131 09:02:13.680379 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3bd29a31-1a47-40da-afc5-6c4423067083-metrics-certs\") pod \"network-metrics-daemon-7fgvm\" (UID: \"3bd29a31-1a47-40da-afc5-6c4423067083\") " pod="openshift-multus/network-metrics-daemon-7fgvm" Jan 31 09:02:13 crc kubenswrapper[4732]: E0131 09:02:13.680635 4732 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 31 09:02:13 crc kubenswrapper[4732]: E0131 09:02:13.680771 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3bd29a31-1a47-40da-afc5-6c4423067083-metrics-certs podName:3bd29a31-1a47-40da-afc5-6c4423067083 nodeName:}" failed. No retries permitted until 2026-01-31 09:02:45.680741408 +0000 UTC m=+103.986617612 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3bd29a31-1a47-40da-afc5-6c4423067083-metrics-certs") pod "network-metrics-daemon-7fgvm" (UID: "3bd29a31-1a47-40da-afc5-6c4423067083") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 31 09:02:13 crc kubenswrapper[4732]: I0131 09:02:13.738158 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:13 crc kubenswrapper[4732]: I0131 09:02:13.738207 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:13 crc kubenswrapper[4732]: I0131 09:02:13.738222 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:13 crc kubenswrapper[4732]: I0131 09:02:13.738242 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:13 crc kubenswrapper[4732]: I0131 09:02:13.738254 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:13Z","lastTransitionTime":"2026-01-31T09:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:13 crc kubenswrapper[4732]: I0131 09:02:13.840269 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:13 crc kubenswrapper[4732]: I0131 09:02:13.840312 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:13 crc kubenswrapper[4732]: I0131 09:02:13.840323 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:13 crc kubenswrapper[4732]: I0131 09:02:13.840340 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:13 crc kubenswrapper[4732]: I0131 09:02:13.840350 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:13Z","lastTransitionTime":"2026-01-31T09:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:13 crc kubenswrapper[4732]: I0131 09:02:13.943238 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:13 crc kubenswrapper[4732]: I0131 09:02:13.943289 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:13 crc kubenswrapper[4732]: I0131 09:02:13.943298 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:13 crc kubenswrapper[4732]: I0131 09:02:13.943313 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:13 crc kubenswrapper[4732]: I0131 09:02:13.943323 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:13Z","lastTransitionTime":"2026-01-31T09:02:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:14 crc kubenswrapper[4732]: I0131 09:02:14.045519 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:14 crc kubenswrapper[4732]: I0131 09:02:14.045560 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:14 crc kubenswrapper[4732]: I0131 09:02:14.045569 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:14 crc kubenswrapper[4732]: I0131 09:02:14.045584 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:14 crc kubenswrapper[4732]: I0131 09:02:14.045593 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:14Z","lastTransitionTime":"2026-01-31T09:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:14 crc kubenswrapper[4732]: I0131 09:02:14.148357 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:14 crc kubenswrapper[4732]: I0131 09:02:14.148412 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:14 crc kubenswrapper[4732]: I0131 09:02:14.148422 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:14 crc kubenswrapper[4732]: I0131 09:02:14.148439 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:14 crc kubenswrapper[4732]: I0131 09:02:14.148456 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:14Z","lastTransitionTime":"2026-01-31T09:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:14 crc kubenswrapper[4732]: I0131 09:02:14.251159 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:14 crc kubenswrapper[4732]: I0131 09:02:14.251194 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:14 crc kubenswrapper[4732]: I0131 09:02:14.251203 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:14 crc kubenswrapper[4732]: I0131 09:02:14.251220 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:14 crc kubenswrapper[4732]: I0131 09:02:14.251230 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:14Z","lastTransitionTime":"2026-01-31T09:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:14 crc kubenswrapper[4732]: I0131 09:02:14.355422 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:14 crc kubenswrapper[4732]: I0131 09:02:14.355803 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:14 crc kubenswrapper[4732]: I0131 09:02:14.355900 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:14 crc kubenswrapper[4732]: I0131 09:02:14.356377 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:14 crc kubenswrapper[4732]: I0131 09:02:14.356480 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:14Z","lastTransitionTime":"2026-01-31T09:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:14 crc kubenswrapper[4732]: I0131 09:02:14.459352 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:14 crc kubenswrapper[4732]: I0131 09:02:14.459393 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:14 crc kubenswrapper[4732]: I0131 09:02:14.459404 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:14 crc kubenswrapper[4732]: I0131 09:02:14.459421 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:14 crc kubenswrapper[4732]: I0131 09:02:14.459433 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:14Z","lastTransitionTime":"2026-01-31T09:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:14 crc kubenswrapper[4732]: I0131 09:02:14.528328 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 19:34:34.583657141 +0000 UTC Jan 31 09:02:14 crc kubenswrapper[4732]: I0131 09:02:14.561644 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:14 crc kubenswrapper[4732]: I0131 09:02:14.561741 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:14 crc kubenswrapper[4732]: I0131 09:02:14.561753 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:14 crc kubenswrapper[4732]: I0131 09:02:14.561768 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:14 crc kubenswrapper[4732]: I0131 09:02:14.561780 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:14Z","lastTransitionTime":"2026-01-31T09:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:14 crc kubenswrapper[4732]: I0131 09:02:14.664135 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:14 crc kubenswrapper[4732]: I0131 09:02:14.664173 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:14 crc kubenswrapper[4732]: I0131 09:02:14.664184 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:14 crc kubenswrapper[4732]: I0131 09:02:14.664200 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:14 crc kubenswrapper[4732]: I0131 09:02:14.664213 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:14Z","lastTransitionTime":"2026-01-31T09:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:14 crc kubenswrapper[4732]: I0131 09:02:14.766254 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:14 crc kubenswrapper[4732]: I0131 09:02:14.766443 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:14 crc kubenswrapper[4732]: I0131 09:02:14.766550 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:14 crc kubenswrapper[4732]: I0131 09:02:14.766695 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:14 crc kubenswrapper[4732]: I0131 09:02:14.766796 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:14Z","lastTransitionTime":"2026-01-31T09:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:14 crc kubenswrapper[4732]: I0131 09:02:14.868899 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:14 crc kubenswrapper[4732]: I0131 09:02:14.868939 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:14 crc kubenswrapper[4732]: I0131 09:02:14.868952 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:14 crc kubenswrapper[4732]: I0131 09:02:14.868970 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:14 crc kubenswrapper[4732]: I0131 09:02:14.868982 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:14Z","lastTransitionTime":"2026-01-31T09:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:14 crc kubenswrapper[4732]: I0131 09:02:14.974961 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:14 crc kubenswrapper[4732]: I0131 09:02:14.974996 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:14 crc kubenswrapper[4732]: I0131 09:02:14.975004 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:14 crc kubenswrapper[4732]: I0131 09:02:14.975017 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:14 crc kubenswrapper[4732]: I0131 09:02:14.975027 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:14Z","lastTransitionTime":"2026-01-31T09:02:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:15 crc kubenswrapper[4732]: I0131 09:02:15.077210 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:15 crc kubenswrapper[4732]: I0131 09:02:15.077235 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:15 crc kubenswrapper[4732]: I0131 09:02:15.077242 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:15 crc kubenswrapper[4732]: I0131 09:02:15.077273 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:15 crc kubenswrapper[4732]: I0131 09:02:15.077282 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:15Z","lastTransitionTime":"2026-01-31T09:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:15 crc kubenswrapper[4732]: I0131 09:02:15.179507 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:15 crc kubenswrapper[4732]: I0131 09:02:15.179769 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:15 crc kubenswrapper[4732]: I0131 09:02:15.179884 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:15 crc kubenswrapper[4732]: I0131 09:02:15.179995 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:15 crc kubenswrapper[4732]: I0131 09:02:15.180118 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:15Z","lastTransitionTime":"2026-01-31T09:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:15 crc kubenswrapper[4732]: I0131 09:02:15.282917 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:15 crc kubenswrapper[4732]: I0131 09:02:15.283221 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:15 crc kubenswrapper[4732]: I0131 09:02:15.283311 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:15 crc kubenswrapper[4732]: I0131 09:02:15.283379 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:15 crc kubenswrapper[4732]: I0131 09:02:15.283438 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:15Z","lastTransitionTime":"2026-01-31T09:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:15 crc kubenswrapper[4732]: I0131 09:02:15.386047 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:15 crc kubenswrapper[4732]: I0131 09:02:15.386414 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:15 crc kubenswrapper[4732]: I0131 09:02:15.386476 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:15 crc kubenswrapper[4732]: I0131 09:02:15.386551 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:15 crc kubenswrapper[4732]: I0131 09:02:15.386609 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:15Z","lastTransitionTime":"2026-01-31T09:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:15 crc kubenswrapper[4732]: I0131 09:02:15.489830 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:15 crc kubenswrapper[4732]: I0131 09:02:15.490311 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:15 crc kubenswrapper[4732]: I0131 09:02:15.490438 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:15 crc kubenswrapper[4732]: I0131 09:02:15.490557 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:15 crc kubenswrapper[4732]: I0131 09:02:15.490646 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:15Z","lastTransitionTime":"2026-01-31T09:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:15 crc kubenswrapper[4732]: I0131 09:02:15.529505 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 00:59:17.435104169 +0000 UTC Jan 31 09:02:15 crc kubenswrapper[4732]: I0131 09:02:15.542021 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7fgvm" Jan 31 09:02:15 crc kubenswrapper[4732]: E0131 09:02:15.542202 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7fgvm" podUID="3bd29a31-1a47-40da-afc5-6c4423067083" Jan 31 09:02:15 crc kubenswrapper[4732]: I0131 09:02:15.542441 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:02:15 crc kubenswrapper[4732]: I0131 09:02:15.542600 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:02:15 crc kubenswrapper[4732]: I0131 09:02:15.542676 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:02:15 crc kubenswrapper[4732]: E0131 09:02:15.542627 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:02:15 crc kubenswrapper[4732]: E0131 09:02:15.542858 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:02:15 crc kubenswrapper[4732]: E0131 09:02:15.543016 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:02:15 crc kubenswrapper[4732]: I0131 09:02:15.593379 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:15 crc kubenswrapper[4732]: I0131 09:02:15.593715 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:15 crc kubenswrapper[4732]: I0131 09:02:15.593816 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:15 crc kubenswrapper[4732]: I0131 09:02:15.593897 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:15 crc kubenswrapper[4732]: I0131 09:02:15.593976 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:15Z","lastTransitionTime":"2026-01-31T09:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:15 crc kubenswrapper[4732]: I0131 09:02:15.697242 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:15 crc kubenswrapper[4732]: I0131 09:02:15.697292 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:15 crc kubenswrapper[4732]: I0131 09:02:15.697304 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:15 crc kubenswrapper[4732]: I0131 09:02:15.697332 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:15 crc kubenswrapper[4732]: I0131 09:02:15.697344 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:15Z","lastTransitionTime":"2026-01-31T09:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:15 crc kubenswrapper[4732]: I0131 09:02:15.800629 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:15 crc kubenswrapper[4732]: I0131 09:02:15.801080 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:15 crc kubenswrapper[4732]: I0131 09:02:15.801219 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:15 crc kubenswrapper[4732]: I0131 09:02:15.801386 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:15 crc kubenswrapper[4732]: I0131 09:02:15.801541 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:15Z","lastTransitionTime":"2026-01-31T09:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:15 crc kubenswrapper[4732]: I0131 09:02:15.904561 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:15 crc kubenswrapper[4732]: I0131 09:02:15.904608 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:15 crc kubenswrapper[4732]: I0131 09:02:15.904624 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:15 crc kubenswrapper[4732]: I0131 09:02:15.904651 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:15 crc kubenswrapper[4732]: I0131 09:02:15.904700 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:15Z","lastTransitionTime":"2026-01-31T09:02:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:15 crc kubenswrapper[4732]: I0131 09:02:15.981737 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-4mxsr_8e23192f-14db-41ef-af89-4a76e325d9c1/kube-multus/0.log" Jan 31 09:02:15 crc kubenswrapper[4732]: I0131 09:02:15.981829 4732 generic.go:334] "Generic (PLEG): container finished" podID="8e23192f-14db-41ef-af89-4a76e325d9c1" containerID="e6af50b44afec0e5f0757efda1ff5567845c4482609a82bafcbf74947c657c56" exitCode=1 Jan 31 09:02:15 crc kubenswrapper[4732]: I0131 09:02:15.981889 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4mxsr" event={"ID":"8e23192f-14db-41ef-af89-4a76e325d9c1","Type":"ContainerDied","Data":"e6af50b44afec0e5f0757efda1ff5567845c4482609a82bafcbf74947c657c56"} Jan 31 09:02:15 crc kubenswrapper[4732]: I0131 09:02:15.982645 4732 scope.go:117] "RemoveContainer" containerID="e6af50b44afec0e5f0757efda1ff5567845c4482609a82bafcbf74947c657c56" Jan 31 09:02:16 crc kubenswrapper[4732]: I0131 09:02:16.000400 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:15Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:16 crc kubenswrapper[4732]: I0131 09:02:16.006938 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:16 crc kubenswrapper[4732]: I0131 09:02:16.006965 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:16 crc kubenswrapper[4732]: I0131 09:02:16.006972 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:16 crc kubenswrapper[4732]: I0131 09:02:16.006987 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:16 crc kubenswrapper[4732]: I0131 09:02:16.006998 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:16Z","lastTransitionTime":"2026-01-31T09:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:16 crc kubenswrapper[4732]: I0131 09:02:16.017954 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:16Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:16 crc kubenswrapper[4732]: I0131 09:02:16.030188 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nsgpk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"533741c8-f72a-4834-ad02-d33fc939e529\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec4f6f1a4ba8fc3975dc3ef7e86d5b17b0488de7918fe672040f115f640b8569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjff8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nsgpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:16Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:16 crc kubenswrapper[4732]: I0131 09:02:16.042393 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bllbs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80d87332-eaea-4007-a03e-a9a0f744563a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a371d8cbedbc18b4953520c3b135e102db92fc57883d31a8f78c510498ca41b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcdpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bllbs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:16Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:16 crc kubenswrapper[4732]: I0131 09:02:16.077915 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8521ee1e84f540faa7b093ecd0bb7761ae38487a6c7abf52028fe820ea9a4c1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fde05e46badc11b2174b88e2590402436bb669bcbd017ceb4d11d25066b421b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b26417bd7817c984a59723e2231e3a43c554022aef67eacf300adee0166c3db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c79f404e5bc3d6dc83681d68605b1f26565813ccec34c6d36ee9f21686ca4f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://075418297d972e23003b3ce2016f7709d138316c9d08837dc484dadf4d7f4a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://792e702a9fa66ea68366e8f88d976f7e280223815c4b70258566c32faaed57a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da2d4b479ec06e475664a1f5f1c4c052f87ad374f5616ea477c7a147ca896a16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da2d4b479ec06e475664a1f5f1c4c052f87ad374f5616ea477c7a147ca896a16\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:01:56Z\\\",\\\"message\\\":\\\"31T09:01:56Z is after 2025-08-24T17:21:41Z]\\\\nI0131 09:01:56.451406 6413 services_controller.go:451] Built service openshift-service-ca-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-service-ca-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-service-ca-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.40\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0131 09:01:56.451418 6413 services_controller.go:452] Built service openshift-operator-lifecycle-manager/package-server-manager-metrics per-node LB for network=default: []services.LB{}\\\\nI0131 09:01:56.451426 6413 services_controller.go:452] Built service openshift-service-ca-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI0131 09:01:56.451433 6413 services_controller.go:453] Built service openshift-service-ca-operator/metrics template LB for network=default: []services.LB{}\\\\nI0131 09:01:56.451434 6413 ser\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-8mtkt_openshift-ovn-kubernetes(82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cd519944f0ae0ee9920a6a54de3dd40c73a65d0c907f625c57d7648a6d0a715\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb08df78497d5ae0eb39af9a06378ea80914d9b4d7014dd85d930f54c733193\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfb08df78497d5ae0eb39af9a06378ea80914d9b4d7014dd85d930f54c733193\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8mtkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:16Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:16 crc kubenswrapper[4732]: I0131 09:02:16.093004 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1dc686f-fa2d-4cd4-95d4-874df2dc3a1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b5cbd45e357205f8bfc648396c0f684d6788343f687504feb45618d9082e5cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b8ba07937855cb1fd0e1db567f90e08848fca583b7174153918e2d6356d722\\\",\\\"i
mage\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ce42a35726dd1fc6aea0eb2e5baa490de04566cd6d6298185260e896a8def7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c276b6fdf2c47bfa38c4cf312bf6b893388c4291d306738fd6c31738cbd7483b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c05af9bd69451273c25ddddf6b11cdd7b8f9b6e3e16d18f1c4a962321905696e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0131 09:01:18.226954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:01:18.227544 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1858903392/tls.crt::/tmp/serving-cert-1858903392/tls.key\\\\\\\"\\\\nI0131 09:01:23.404020 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:01:23.413560 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:01:23.413612 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:01:23.413655 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:01:23.413760 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:01:23.423398 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 09:01:23.423434 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:01:23.423441 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:01:23.423448 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:01:23.423452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0131 09:01:23.423448 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 09:01:23.423457 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:01:23.423476 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 09:01:23.426339 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83c2048db4772c1a5f60b21340f773ab5c651045d4dac9edc15e91a19d15dce9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:16Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:16 crc kubenswrapper[4732]: I0131 09:02:16.105883 4732 status_manager.go:875] "Failed to update status 
for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:16Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:16 crc kubenswrapper[4732]: I0131 09:02:16.110427 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:16 crc kubenswrapper[4732]: I0131 09:02:16.110503 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:16 crc kubenswrapper[4732]: I0131 09:02:16.110516 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:16 crc kubenswrapper[4732]: I0131 09:02:16.110562 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:16 crc kubenswrapper[4732]: I0131 09:02:16.110576 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:16Z","lastTransitionTime":"2026-01-31T09:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:16 crc kubenswrapper[4732]: I0131 09:02:16.121002 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4mxsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e23192f-14db-41ef-af89-4a76e325d9c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:02:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:02:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6af50b44afec0e5f0757efda1ff5567845c4482609a82bafcbf74947c657c56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6af50b44afec0e5f0757efda1ff5567845c4482609a82bafcbf74947c657c56\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:02:15Z\\\",\\\"message\\\":\\\"2026-01-31T09:01:29+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_648046fe-98ce-4b13-a3af-42227ab719e4\\\\n2026-01-31T09:01:29+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_648046fe-98ce-4b13-a3af-42227ab719e4 to /host/opt/cni/bin/\\\\n2026-01-31T09:01:30Z [verbose] multus-daemon started\\\\n2026-01-31T09:01:30Z [verbose] Readiness Indicator file check\\\\n2026-01-31T09:02:15Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsnx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4mxsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:16Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:16 crc kubenswrapper[4732]: I0131 09:02:16.135341 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t9kqf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2e6e0f4-2302-447f-a5e0-7db3d7b73cb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://038ff30d8a53b94c204f5dc0c72824fcb7f08a423eec2ce05554f23607d4dc7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c71dc89758bff193a85ceeaa97b21ba6c93389ef9cf380bd0bac89bddddf014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c71dc89758bff193a85ceeaa97b21ba6c93389ef9cf380bd0bac89bddddf014\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4374e8014d4e67ab8df14785ed6d1bd78b33567550758297fcc04796ea5c688\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4374e8014d4e67ab8df14785ed6d1bd78b33567550758297fcc04796ea5c688\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85484d53f8180c3bc83391004e8a7c6b432af09f04f597567e8ffe0b9c2e3d58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85484d53f8180c3bc83391004e8a7c6b432af09f04f597567e8ffe0b9c2e3d58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://111b50c40cf72f105ae5c74d668673853a783050a62e89127fa666a6f8d431d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://111b50c40cf72f105ae5c74d668673853a783050a62e89127fa666a6f8d431d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2458247a5e615b929a15e280bb9d5a65c46a991ea2cd83502a8c42b1ca0e299b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2458247a5e615b929a15e280bb9d5a65c46a991ea2cd83502a8c42b1ca0e299b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a6ff3cd5564994b28608dad1cd0848c771db950208888ac8a2213fa5dc3a168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a6ff3cd5564994b28608dad1cd0848c771db950208888ac8a2213fa5dc3a168\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t9kqf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:16Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:16 crc kubenswrapper[4732]: I0131 09:02:16.150534 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7fgvm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bd29a31-1a47-40da-afc5-6c4423067083\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47jtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47jtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7fgvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:16Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:16 crc kubenswrapper[4732]: I0131 09:02:16.165360 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c063f211d6d06c3f8b94d316d68da346d077c6e85197754ec7c77ad736fd1217\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:16Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:16 crc kubenswrapper[4732]: I0131 09:02:16.177292 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d790207-d357-4b47-87bf-5b505e061820\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eaa2b856e1dedd7398d0c513f742fa5925c4748bf44eb15022adbb4b62354c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2bw2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad1f906b34fddd34efd9d099479cd1fe7404da4ab11f3f571f9e3120167505a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2bw2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jnbt8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:16Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:16 crc kubenswrapper[4732]: I0131 09:02:16.190260 4732 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7423f7d2a82ae276cff0f5a2730b0de7abbc5b28a8bd8983d23fe44c2e9ca3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b075971e7f28f40b90caae8b3681dd283860371a6b3f82c92c421a87a27ccb0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:16Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:16 crc kubenswrapper[4732]: I0131 09:02:16.203719 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gchqk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0313609d-3507-4db5-a190-9dbf59d73e6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b303be38f64266f4e019d30e8b988945133fc76a47e305429ed048d68cdeac76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-phqth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8a2d9e1c8db61a2418e981340a7cb999983d4a10e79977507abdf3bdb471939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-phqth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gchqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:16Z is after 2025-08-24T17:21:41Z" Jan 31 
09:02:16 crc kubenswrapper[4732]: I0131 09:02:16.212989 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:16 crc kubenswrapper[4732]: I0131 09:02:16.213044 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:16 crc kubenswrapper[4732]: I0131 09:02:16.213058 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:16 crc kubenswrapper[4732]: I0131 09:02:16.213077 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:16 crc kubenswrapper[4732]: I0131 09:02:16.213094 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:16Z","lastTransitionTime":"2026-01-31T09:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:16 crc kubenswrapper[4732]: I0131 09:02:16.215948 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1add5e75-7e0e-4248-9431-256dc477beeb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7bd726ccae341dfff351d4170276b74d4b6b4f34806b7e1fb23ab517201928d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd0e75b86e540472dde92c0b2c29d3af05e052a4e6a01fd03cd16cce663bd5da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6678cf179a4e13991aa6a082456b3497a4665e0999912b8002fb84a0b6ca2ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55d09d44253b8c77894abc55d80843b79509bf7213c0bf8b4d69059d53e76e5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:16Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:16 crc kubenswrapper[4732]: I0131 09:02:16.226225 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ec8fd54-25d6-41f1-9a8b-0e2823c951b7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87eb9563b2ca7420cbb70ccd32aa77f2a82c769d57eceacd99a4d94cb1c3a0d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://328c8ed55e178646a8bd3d914985f8171b6413b7b007f3bca609cfb432a227f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://328c8ed55e178646a8bd3d914985f8171b6413b7b007f3bca609cfb432a227f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:16Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:16 crc kubenswrapper[4732]: I0131 09:02:16.237088 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bb621da-40b8-4a07-a7bd-06800007bc59\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dadc6987ef6ef2593fa8aa0ec3903dc3cfea907f73fb68dca2c141362577bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e897d26ac7b103c21a9cb176f1a90a11098a9a2f6a4cd28f697a90ee7c9f6f9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a31769db7f40442ff410b053055e413b11d7dba7d48dfce853ca38e7b9f7595e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://407516efcd2436b964fddea3bdc778826cde289422139bc4577c9ba8c0c43675\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://407516efcd2436b964fddea3bdc778826cde289422139bc4577c9ba8c0c43675\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:16Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:16 crc kubenswrapper[4732]: I0131 09:02:16.259056 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58b600ed-9e96-4296-bc41-dda5d2205921\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f14a98662959fe8132486d2f648fb443a407e90160ee8c6eac7e2a3d6835b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://771af18934041722ccdbde47137c83e9d93c5e7a07b1e9b26e18354bcd2fda4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2bea5befd94ee4d3b6501273e8b8365102a35cc07b97740fc893f8d70a9d134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9fd5264b5a2dcec6f8b363208d5654f2d7199660f04be408ee7ab3913d542e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://194a43d5e1409abb031f1b950dfc753454b45008067d784642945ea0dfb087e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://382f1fa043705a0098e27ea34beeee0b88276aa355a7de07bc29b070949b81e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be
8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382f1fa043705a0098e27ea34beeee0b88276aa355a7de07bc29b070949b81e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b16d2732d6d6381045b88446a322622a7b1271f08438cb60e57c05c52ef7faf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b16d2732d6d6381045b88446a322622a7b1271f08438cb60e57c05c52ef7faf8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://273f5652d37e2be568bc388a7f9e96c66d0ce9201678b46736986815bd25d6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://273f5652d37e2be568bc388a7f9e96c66d0ce9201678b46736986815bd25d6bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:16Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:16 crc kubenswrapper[4732]: I0131 09:02:16.275004 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62650e77e5971d617a77ae7461d51276fc76e3699dc4f86ebbb9b63ab0d86b60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:16Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:16 crc kubenswrapper[4732]: I0131 09:02:16.315594 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:16 crc kubenswrapper[4732]: I0131 09:02:16.315958 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:16 crc kubenswrapper[4732]: I0131 09:02:16.316052 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:16 crc kubenswrapper[4732]: I0131 09:02:16.316137 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:16 crc kubenswrapper[4732]: I0131 09:02:16.316216 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:16Z","lastTransitionTime":"2026-01-31T09:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:16 crc kubenswrapper[4732]: I0131 09:02:16.418161 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:16 crc kubenswrapper[4732]: I0131 09:02:16.418397 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:16 crc kubenswrapper[4732]: I0131 09:02:16.418492 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:16 crc kubenswrapper[4732]: I0131 09:02:16.418565 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:16 crc kubenswrapper[4732]: I0131 09:02:16.418635 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:16Z","lastTransitionTime":"2026-01-31T09:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:16 crc kubenswrapper[4732]: I0131 09:02:16.521885 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:16 crc kubenswrapper[4732]: I0131 09:02:16.522203 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:16 crc kubenswrapper[4732]: I0131 09:02:16.522342 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:16 crc kubenswrapper[4732]: I0131 09:02:16.522521 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:16 crc kubenswrapper[4732]: I0131 09:02:16.522754 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:16Z","lastTransitionTime":"2026-01-31T09:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:16 crc kubenswrapper[4732]: I0131 09:02:16.530034 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 23:01:35.77226994 +0000 UTC Jan 31 09:02:16 crc kubenswrapper[4732]: I0131 09:02:16.626086 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:16 crc kubenswrapper[4732]: I0131 09:02:16.626753 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:16 crc kubenswrapper[4732]: I0131 09:02:16.626961 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:16 crc kubenswrapper[4732]: I0131 09:02:16.627198 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:16 crc kubenswrapper[4732]: I0131 09:02:16.627411 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:16Z","lastTransitionTime":"2026-01-31T09:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:16 crc kubenswrapper[4732]: I0131 09:02:16.730327 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:16 crc kubenswrapper[4732]: I0131 09:02:16.730369 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:16 crc kubenswrapper[4732]: I0131 09:02:16.730379 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:16 crc kubenswrapper[4732]: I0131 09:02:16.730398 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:16 crc kubenswrapper[4732]: I0131 09:02:16.730407 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:16Z","lastTransitionTime":"2026-01-31T09:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:16 crc kubenswrapper[4732]: I0131 09:02:16.833743 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:16 crc kubenswrapper[4732]: I0131 09:02:16.833789 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:16 crc kubenswrapper[4732]: I0131 09:02:16.833800 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:16 crc kubenswrapper[4732]: I0131 09:02:16.833822 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:16 crc kubenswrapper[4732]: I0131 09:02:16.833837 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:16Z","lastTransitionTime":"2026-01-31T09:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:16 crc kubenswrapper[4732]: I0131 09:02:16.936785 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:16 crc kubenswrapper[4732]: I0131 09:02:16.937095 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:16 crc kubenswrapper[4732]: I0131 09:02:16.937186 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:16 crc kubenswrapper[4732]: I0131 09:02:16.937283 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:16 crc kubenswrapper[4732]: I0131 09:02:16.937363 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:16Z","lastTransitionTime":"2026-01-31T09:02:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:16 crc kubenswrapper[4732]: I0131 09:02:16.988765 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-4mxsr_8e23192f-14db-41ef-af89-4a76e325d9c1/kube-multus/0.log" Jan 31 09:02:16 crc kubenswrapper[4732]: I0131 09:02:16.988876 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4mxsr" event={"ID":"8e23192f-14db-41ef-af89-4a76e325d9c1","Type":"ContainerStarted","Data":"456969ec8d447fd7f9acd1803b222ebfb16b02c8dee1959dd936eec29aa1d617"} Jan 31 09:02:17 crc kubenswrapper[4732]: I0131 09:02:17.005888 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:17Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:17 crc kubenswrapper[4732]: I0131 09:02:17.018510 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nsgpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"533741c8-f72a-4834-ad02-d33fc939e529\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec4f6f1a4ba8fc3975dc3ef7e86d5b17b0488de7918fe672040f115f640b8569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjff8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nsgpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-01-31T09:02:17Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:17 crc kubenswrapper[4732]: I0131 09:02:17.030984 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bllbs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"80d87332-eaea-4007-a03e-a9a0f744563a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a371d8cbedbc18b4953520c3b135e102db92fc57883d31a8f78c510498ca41b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcdpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bllbs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:17Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:17 crc kubenswrapper[4732]: I0131 09:02:17.041551 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:17 crc kubenswrapper[4732]: I0131 09:02:17.041761 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:17 crc kubenswrapper[4732]: I0131 09:02:17.041827 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:17 crc kubenswrapper[4732]: I0131 09:02:17.041909 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:17 crc kubenswrapper[4732]: I0131 09:02:17.041975 4732 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:17Z","lastTransitionTime":"2026-01-31T09:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:17 crc kubenswrapper[4732]: I0131 09:02:17.051288 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8521ee1e84f540faa7b093ecd0bb7761ae38487a6c7abf52028fe820ea9a4c1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fde05e46badc11b2174b88e2590402436bb669bcbd017ceb4d11d25066b421b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-scri
pt-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b26417bd7817c984a59723e2231e3a43c554022aef67eacf300adee0166c3db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c79f404e5bc3d6dc83681d68605b1f26565813ccec34c6d36ee9f21686ca4f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://075418297d972e23003b3ce2016f7709d138316c9d08837dc484dadf4d7f4a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\
"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://792e702a9fa66ea68366e8f88d976f7e280223815c4b70258566c32faaed57a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da2d4b479ec06e475664a1f5f1c4c052f87ad374f5616ea477c7a147ca896a16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da2d4b479ec06e475664a1f5f1c4c052f87ad374f5616ea477c7a147ca896a16\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:01:56Z\\\",\\\"message\\\":\\\"31T09:01:56Z is after 2025-08-24T17:21:41Z]\\\\nI0131 09:01:56.451406 6413 services_controller.go:451] Built service openshift-service-ca-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-service-ca-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-service-ca-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.40\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0131 09:01:56.451418 6413 services_controller.go:452] Built service openshift-operator-lifecycle-manager/package-server-manager-metrics per-node LB for network=default: []services.LB{}\\\\nI0131 09:01:56.451426 6413 services_controller.go:452] Built service openshift-service-ca-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI0131 09:01:56.451433 6413 
services_controller.go:453] Built service openshift-service-ca-operator/metrics template LB for network=default: []services.LB{}\\\\nI0131 09:01:56.451434 6413 ser\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-8mtkt_openshift-ovn-kubernetes(82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cd519944f0ae0ee9920a6a54de3dd40c73a65d0c907f625c57d7648a6d0a715\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overri
des\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb08df78497d5ae0eb39af9a06378ea80914d9b4d7014dd85d930f54c733193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfb08df78497d5ae0eb39af9a06378ea80914d9b4d7014dd85d930f54c733193\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8mtkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:17Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:17 crc kubenswrapper[4732]: I0131 09:02:17.064098 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:17Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:17 crc kubenswrapper[4732]: I0131 09:02:17.075839 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:17Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:17 crc kubenswrapper[4732]: I0131 09:02:17.090836 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4mxsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e23192f-14db-41ef-af89-4a76e325d9c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://456969ec8d447fd7f9acd1803b222ebfb16b02c8dee1959dd936eec29aa1d617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6af50b44afec0e5f0757efda1ff5567845c4482609a82bafcbf74947c657c56\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:02:15Z\\\",\\\"message\\\":\\\"2026-01-31T09:01:29+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_648046fe-98ce-4b13-a3af-42227ab719e4\\\\n2026-01-31T09:01:29+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_648046fe-98ce-4b13-a3af-42227ab719e4 to /host/opt/cni/bin/\\\\n2026-01-31T09:01:30Z [verbose] multus-daemon started\\\\n2026-01-31T09:01:30Z [verbose] Readiness Indicator file check\\\\n2026-01-31T09:02:15Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsnx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4mxsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:17Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:17 crc kubenswrapper[4732]: I0131 09:02:17.105563 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t9kqf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2e6e0f4-2302-447f-a5e0-7db3d7b73cb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://038ff30d8a53b94c204f5dc0c72824fcb7f08a423eec2ce05554f23607d4dc7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c71dc89758bff193a85ceeaa97b21ba6c93389ef9cf380bd0bac89bddddf014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c71dc89758bff193a85ceeaa97b21ba6c93389ef9cf380bd0bac89bddddf014\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4374e8014d4e67ab8df14785ed6d1bd78b33567550758297fcc04796ea5c688\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4374e8014d4e67ab8df14785ed6d1bd78b33567550758297fcc04796ea5c688\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85484d53f8180c3bc83391004e8a7c6b432af09f04f597567e8ffe0b9c2e3d58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85484d53f8180c3bc83391004e8a7c6b432af09f04f597567e8ffe0b9c2e3d58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://111b50c40cf72f105ae5c74d668673853a783050a62e89127fa666a6f8d431d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://111b50c40cf72f105ae5c74d668673853a783050a62e89127fa666a6f8d431d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2458247a5e615b929a15e280bb9d5a65c46a991ea2cd83502a8c42b1ca0e299b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2458247a5e615b929a15e280bb9d5a65c46a991ea2cd83502a8c42b1ca0e299b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a6ff3cd5564994b28608dad1cd0848c771db950208888ac8a2213fa5dc3a168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a6ff3cd5564994b28608dad1cd0848c771db950208888ac8a2213fa5dc3a168\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t9kqf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:17Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:17 crc kubenswrapper[4732]: I0131 09:02:17.118851 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7fgvm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bd29a31-1a47-40da-afc5-6c4423067083\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47jtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47jtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7fgvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:17Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:17 crc kubenswrapper[4732]: I0131 09:02:17.140292 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1dc686f-fa2d-4cd4-95d4-874df2dc3a1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b5cbd45e357205f8bfc648396c0f684d6788343f687504feb45618d9082e5cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b8ba07937855cb1fd0e1db567f90e08848fca583b7174153918e2d6356d722\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ce42a35726dd1fc6aea0eb2e5baa490de04566cd6d6298185260e896a8def7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c276b6fdf2c47bfa38c4cf312bf6b893388c4291d306738fd6c31738cbd7483b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c05af9bd69451273c25ddddf6b11cdd7b8f9b6e3e16d18f1c4a962321905696e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0131 09:01:18.226954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:01:18.227544 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1858903392/tls.crt::/tmp/serving-cert-1858903392/tls.key\\\\\\\"\\\\nI0131 09:01:23.404020 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:01:23.413560 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:01:23.413612 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:01:23.413655 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:01:23.413760 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:01:23.423398 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 09:01:23.423434 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:01:23.423441 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:01:23.423448 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:01:23.423452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0131 09:01:23.423448 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 09:01:23.423457 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:01:23.423476 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 09:01:23.426339 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83c2048db4772c1a5f60b21340f773ab5c651045d4dac9edc15e91a19d15dce9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:17Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:17 crc kubenswrapper[4732]: I0131 09:02:17.144334 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:17 crc kubenswrapper[4732]: I0131 09:02:17.144371 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:17 crc kubenswrapper[4732]: I0131 09:02:17.144383 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:17 crc kubenswrapper[4732]: I0131 09:02:17.144402 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:17 crc kubenswrapper[4732]: I0131 09:02:17.144414 4732 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:17Z","lastTransitionTime":"2026-01-31T09:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:17 crc kubenswrapper[4732]: I0131 09:02:17.153796 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d790207-d357-4b47-87bf-5b505e061820\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eaa2b856e1dedd7398d0c513f742fa5925c4748bf44eb15022adbb4b62354c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2bw2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad1f906b34fddd34efd9d099479cd1fe7404da4ab11f3f571f9e3120167505a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2bw2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\
",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jnbt8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:17Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:17 crc kubenswrapper[4732]: I0131 09:02:17.166335 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c063f211d6d06c3f8b94d316d68da346d077c6e85197754ec7c77ad736fd1217\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:17Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:17 crc kubenswrapper[4732]: I0131 09:02:17.175492 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ec8fd54-25d6-41f1-9a8b-0e2823c951b7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87eb9563b2ca7420cbb70ccd32aa77f2a82c769d57eceacd99a4d94cb1c3a0d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://328c8ed55e178646a8bd3d914985f8171b6413b7b007f3bca609cfb432a227f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://328c8ed55e178646a8bd3d914985f8171b6413b7b007f3bca609cfb432a227f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:17Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:17 crc kubenswrapper[4732]: I0131 09:02:17.185188 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bb621da-40b8-4a07-a7bd-06800007bc59\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dadc6987ef6ef2593fa8aa0ec3903dc3cfea907f73fb68dca2c141362577bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e897d26ac7b103c21a9cb176f1a90a11098a9a2f6a4cd28f697a90ee7c9f6f9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a31769db7f40442ff410b053055e413b11d7dba7d48dfce853ca38e7b9f7595e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://407516efcd2436b964fddea3bdc778826cde289422139bc4577c9ba8c0c43675\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://407516efcd2436b964fddea3bdc778826cde289422139bc4577c9ba8c0c43675\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:17Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:17 crc kubenswrapper[4732]: I0131 09:02:17.203412 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58b600ed-9e96-4296-bc41-dda5d2205921\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f14a98662959fe8132486d2f648fb443a407e90160ee8c6eac7e2a3d6835b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://771af18934041722ccdbde47137c83e9d93c5e7a07b1e9b26e18354bcd2fda4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2bea5befd94ee4d3b6501273e8b8365102a35cc07b97740fc893f8d70a9d134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9fd5264b5a2dcec6f8b363208d5654f2d7199660f04be408ee7ab3913d542e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://194a43d5e1409abb031f1b950dfc753454b45008067d784642945ea0dfb087e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://382f1fa043705a0098e27ea34beeee0b88276aa355a7de07bc29b070949b81e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be
8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382f1fa043705a0098e27ea34beeee0b88276aa355a7de07bc29b070949b81e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b16d2732d6d6381045b88446a322622a7b1271f08438cb60e57c05c52ef7faf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b16d2732d6d6381045b88446a322622a7b1271f08438cb60e57c05c52ef7faf8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://273f5652d37e2be568bc388a7f9e96c66d0ce9201678b46736986815bd25d6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://273f5652d37e2be568bc388a7f9e96c66d0ce9201678b46736986815bd25d6bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:17Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:17 crc kubenswrapper[4732]: I0131 09:02:17.215981 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62650e77e5971d617a77ae7461d51276fc76e3699dc4f86ebbb9b63ab0d86b60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:17Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:17 crc kubenswrapper[4732]: I0131 09:02:17.226359 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7423f7d2a82ae276cff0f5a2730b0de7abbc5b28a8bd8983d23fe44c2e9ca3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b075971e7f28f40b90caae8b3681dd283860371a6b3f82c92c421a87a27ccb0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:17Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:17 crc kubenswrapper[4732]: I0131 09:02:17.236973 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gchqk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0313609d-3507-4db5-a190-9dbf59d73e6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b303be38f64266f4e019d30e8b988945133fc76a47e305429ed048d68cdeac76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-phqth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8a2d9e1c8db61a2418e981340a7cb999983d4a10e79977507abdf3bdb471939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-phqth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gchqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:17Z is after 2025-08-24T17:21:41Z" Jan 31 
09:02:17 crc kubenswrapper[4732]: I0131 09:02:17.251225 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1add5e75-7e0e-4248-9431-256dc477beeb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7bd726ccae341dfff351d4170276b74d4b6b4f34806b7e1fb23ab517201928d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd0e75b86e540472dde92c0b2c29d3af05e052a4e6a01fd03cd16cce663bd5da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6678cf179a4e13991aa6a082456b3497a4665e0999912b8002fb84a0b6ca2ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\
",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55d09d44253b8c77894abc55d80843b79509bf7213c0bf8b4d69059d53e76e5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:17Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:17 crc kubenswrapper[4732]: I0131 09:02:17.251796 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:17 crc kubenswrapper[4732]: I0131 09:02:17.251829 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:17 crc kubenswrapper[4732]: I0131 09:02:17.251840 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:17 crc kubenswrapper[4732]: I0131 09:02:17.251863 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:17 crc kubenswrapper[4732]: I0131 09:02:17.251877 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:17Z","lastTransitionTime":"2026-01-31T09:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:17 crc kubenswrapper[4732]: I0131 09:02:17.354498 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:17 crc kubenswrapper[4732]: I0131 09:02:17.354543 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:17 crc kubenswrapper[4732]: I0131 09:02:17.354553 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:17 crc kubenswrapper[4732]: I0131 09:02:17.354573 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:17 crc kubenswrapper[4732]: I0131 09:02:17.354585 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:17Z","lastTransitionTime":"2026-01-31T09:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:17 crc kubenswrapper[4732]: I0131 09:02:17.467593 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:17 crc kubenswrapper[4732]: I0131 09:02:17.467643 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:17 crc kubenswrapper[4732]: I0131 09:02:17.467654 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:17 crc kubenswrapper[4732]: I0131 09:02:17.467694 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:17 crc kubenswrapper[4732]: I0131 09:02:17.467705 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:17Z","lastTransitionTime":"2026-01-31T09:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:17 crc kubenswrapper[4732]: I0131 09:02:17.530704 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 18:30:23.865734598 +0000 UTC Jan 31 09:02:17 crc kubenswrapper[4732]: I0131 09:02:17.542068 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7fgvm" Jan 31 09:02:17 crc kubenswrapper[4732]: I0131 09:02:17.542125 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:02:17 crc kubenswrapper[4732]: I0131 09:02:17.542179 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:02:17 crc kubenswrapper[4732]: I0131 09:02:17.542178 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:02:17 crc kubenswrapper[4732]: E0131 09:02:17.542975 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:02:17 crc kubenswrapper[4732]: E0131 09:02:17.542694 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:02:17 crc kubenswrapper[4732]: E0131 09:02:17.542776 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:02:17 crc kubenswrapper[4732]: E0131 09:02:17.542471 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7fgvm" podUID="3bd29a31-1a47-40da-afc5-6c4423067083" Jan 31 09:02:17 crc kubenswrapper[4732]: I0131 09:02:17.570927 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:17 crc kubenswrapper[4732]: I0131 09:02:17.570976 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:17 crc kubenswrapper[4732]: I0131 09:02:17.570992 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:17 crc kubenswrapper[4732]: I0131 09:02:17.571012 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:17 crc kubenswrapper[4732]: I0131 09:02:17.571024 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:17Z","lastTransitionTime":"2026-01-31T09:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:17 crc kubenswrapper[4732]: I0131 09:02:17.681549 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:17 crc kubenswrapper[4732]: I0131 09:02:17.681585 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:17 crc kubenswrapper[4732]: I0131 09:02:17.681597 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:17 crc kubenswrapper[4732]: I0131 09:02:17.681616 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:17 crc kubenswrapper[4732]: I0131 09:02:17.681627 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:17Z","lastTransitionTime":"2026-01-31T09:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:17 crc kubenswrapper[4732]: I0131 09:02:17.784960 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:17 crc kubenswrapper[4732]: I0131 09:02:17.785058 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:17 crc kubenswrapper[4732]: I0131 09:02:17.785096 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:17 crc kubenswrapper[4732]: I0131 09:02:17.785131 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:17 crc kubenswrapper[4732]: I0131 09:02:17.785153 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:17Z","lastTransitionTime":"2026-01-31T09:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:17 crc kubenswrapper[4732]: I0131 09:02:17.888405 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:17 crc kubenswrapper[4732]: I0131 09:02:17.888461 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:17 crc kubenswrapper[4732]: I0131 09:02:17.888472 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:17 crc kubenswrapper[4732]: I0131 09:02:17.888492 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:17 crc kubenswrapper[4732]: I0131 09:02:17.888504 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:17Z","lastTransitionTime":"2026-01-31T09:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:17 crc kubenswrapper[4732]: I0131 09:02:17.992046 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:17 crc kubenswrapper[4732]: I0131 09:02:17.992129 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:17 crc kubenswrapper[4732]: I0131 09:02:17.992145 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:17 crc kubenswrapper[4732]: I0131 09:02:17.992163 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:17 crc kubenswrapper[4732]: I0131 09:02:17.992174 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:17Z","lastTransitionTime":"2026-01-31T09:02:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:18 crc kubenswrapper[4732]: I0131 09:02:18.097083 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:18 crc kubenswrapper[4732]: I0131 09:02:18.097137 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:18 crc kubenswrapper[4732]: I0131 09:02:18.097155 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:18 crc kubenswrapper[4732]: I0131 09:02:18.097180 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:18 crc kubenswrapper[4732]: I0131 09:02:18.097197 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:18Z","lastTransitionTime":"2026-01-31T09:02:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:18 crc kubenswrapper[4732]: I0131 09:02:18.200378 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:18 crc kubenswrapper[4732]: I0131 09:02:18.200909 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:18 crc kubenswrapper[4732]: I0131 09:02:18.200948 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:18 crc kubenswrapper[4732]: I0131 09:02:18.201048 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:18 crc kubenswrapper[4732]: I0131 09:02:18.201081 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:18Z","lastTransitionTime":"2026-01-31T09:02:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:18 crc kubenswrapper[4732]: I0131 09:02:18.305101 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:18 crc kubenswrapper[4732]: I0131 09:02:18.305158 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:18 crc kubenswrapper[4732]: I0131 09:02:18.305172 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:18 crc kubenswrapper[4732]: I0131 09:02:18.305199 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:18 crc kubenswrapper[4732]: I0131 09:02:18.305216 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:18Z","lastTransitionTime":"2026-01-31T09:02:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:18 crc kubenswrapper[4732]: I0131 09:02:18.408768 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:18 crc kubenswrapper[4732]: I0131 09:02:18.408836 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:18 crc kubenswrapper[4732]: I0131 09:02:18.408850 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:18 crc kubenswrapper[4732]: I0131 09:02:18.408869 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:18 crc kubenswrapper[4732]: I0131 09:02:18.408882 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:18Z","lastTransitionTime":"2026-01-31T09:02:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:18 crc kubenswrapper[4732]: I0131 09:02:18.512460 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:18 crc kubenswrapper[4732]: I0131 09:02:18.512534 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:18 crc kubenswrapper[4732]: I0131 09:02:18.512552 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:18 crc kubenswrapper[4732]: I0131 09:02:18.512579 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:18 crc kubenswrapper[4732]: I0131 09:02:18.512601 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:18Z","lastTransitionTime":"2026-01-31T09:02:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:18 crc kubenswrapper[4732]: I0131 09:02:18.530821 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 22:45:19.520000611 +0000 UTC Jan 31 09:02:18 crc kubenswrapper[4732]: I0131 09:02:18.616110 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:18 crc kubenswrapper[4732]: I0131 09:02:18.616203 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:18 crc kubenswrapper[4732]: I0131 09:02:18.616225 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:18 crc kubenswrapper[4732]: I0131 09:02:18.616273 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:18 crc kubenswrapper[4732]: I0131 09:02:18.616304 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:18Z","lastTransitionTime":"2026-01-31T09:02:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:18 crc kubenswrapper[4732]: I0131 09:02:18.718850 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:18 crc kubenswrapper[4732]: I0131 09:02:18.718889 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:18 crc kubenswrapper[4732]: I0131 09:02:18.718898 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:18 crc kubenswrapper[4732]: I0131 09:02:18.718913 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:18 crc kubenswrapper[4732]: I0131 09:02:18.718922 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:18Z","lastTransitionTime":"2026-01-31T09:02:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:18 crc kubenswrapper[4732]: I0131 09:02:18.822491 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:18 crc kubenswrapper[4732]: I0131 09:02:18.822540 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:18 crc kubenswrapper[4732]: I0131 09:02:18.822551 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:18 crc kubenswrapper[4732]: I0131 09:02:18.822570 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:18 crc kubenswrapper[4732]: I0131 09:02:18.822585 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:18Z","lastTransitionTime":"2026-01-31T09:02:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:18 crc kubenswrapper[4732]: I0131 09:02:18.925306 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:18 crc kubenswrapper[4732]: I0131 09:02:18.925352 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:18 crc kubenswrapper[4732]: I0131 09:02:18.925364 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:18 crc kubenswrapper[4732]: I0131 09:02:18.925384 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:18 crc kubenswrapper[4732]: I0131 09:02:18.925400 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:18Z","lastTransitionTime":"2026-01-31T09:02:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:19 crc kubenswrapper[4732]: I0131 09:02:19.028018 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:19 crc kubenswrapper[4732]: I0131 09:02:19.028079 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:19 crc kubenswrapper[4732]: I0131 09:02:19.028095 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:19 crc kubenswrapper[4732]: I0131 09:02:19.028122 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:19 crc kubenswrapper[4732]: I0131 09:02:19.028141 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:19Z","lastTransitionTime":"2026-01-31T09:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:19 crc kubenswrapper[4732]: I0131 09:02:19.131453 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:19 crc kubenswrapper[4732]: I0131 09:02:19.131524 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:19 crc kubenswrapper[4732]: I0131 09:02:19.131543 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:19 crc kubenswrapper[4732]: I0131 09:02:19.131570 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:19 crc kubenswrapper[4732]: I0131 09:02:19.131589 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:19Z","lastTransitionTime":"2026-01-31T09:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:19 crc kubenswrapper[4732]: I0131 09:02:19.234435 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:19 crc kubenswrapper[4732]: I0131 09:02:19.234492 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:19 crc kubenswrapper[4732]: I0131 09:02:19.234504 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:19 crc kubenswrapper[4732]: I0131 09:02:19.234522 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:19 crc kubenswrapper[4732]: I0131 09:02:19.234561 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:19Z","lastTransitionTime":"2026-01-31T09:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:19 crc kubenswrapper[4732]: I0131 09:02:19.337737 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:19 crc kubenswrapper[4732]: I0131 09:02:19.337796 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:19 crc kubenswrapper[4732]: I0131 09:02:19.337811 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:19 crc kubenswrapper[4732]: I0131 09:02:19.337831 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:19 crc kubenswrapper[4732]: I0131 09:02:19.337845 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:19Z","lastTransitionTime":"2026-01-31T09:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:19 crc kubenswrapper[4732]: I0131 09:02:19.440495 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:19 crc kubenswrapper[4732]: I0131 09:02:19.440541 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:19 crc kubenswrapper[4732]: I0131 09:02:19.440551 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:19 crc kubenswrapper[4732]: I0131 09:02:19.440589 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:19 crc kubenswrapper[4732]: I0131 09:02:19.440599 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:19Z","lastTransitionTime":"2026-01-31T09:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:19 crc kubenswrapper[4732]: I0131 09:02:19.531716 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 04:26:17.31606912 +0000 UTC Jan 31 09:02:19 crc kubenswrapper[4732]: I0131 09:02:19.541917 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:02:19 crc kubenswrapper[4732]: I0131 09:02:19.541977 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:02:19 crc kubenswrapper[4732]: I0131 09:02:19.541977 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:02:19 crc kubenswrapper[4732]: I0131 09:02:19.542032 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7fgvm" Jan 31 09:02:19 crc kubenswrapper[4732]: E0131 09:02:19.542188 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:02:19 crc kubenswrapper[4732]: E0131 09:02:19.542326 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:02:19 crc kubenswrapper[4732]: E0131 09:02:19.542464 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:02:19 crc kubenswrapper[4732]: E0131 09:02:19.542604 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7fgvm" podUID="3bd29a31-1a47-40da-afc5-6c4423067083" Jan 31 09:02:19 crc kubenswrapper[4732]: I0131 09:02:19.543158 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:19 crc kubenswrapper[4732]: I0131 09:02:19.543198 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:19 crc kubenswrapper[4732]: I0131 09:02:19.543212 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:19 crc kubenswrapper[4732]: I0131 09:02:19.543233 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:19 crc kubenswrapper[4732]: I0131 09:02:19.543250 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:19Z","lastTransitionTime":"2026-01-31T09:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:19 crc kubenswrapper[4732]: I0131 09:02:19.645628 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:19 crc kubenswrapper[4732]: I0131 09:02:19.645701 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:19 crc kubenswrapper[4732]: I0131 09:02:19.645717 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:19 crc kubenswrapper[4732]: I0131 09:02:19.645737 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:19 crc kubenswrapper[4732]: I0131 09:02:19.645750 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:19Z","lastTransitionTime":"2026-01-31T09:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:19 crc kubenswrapper[4732]: I0131 09:02:19.748721 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:19 crc kubenswrapper[4732]: I0131 09:02:19.748766 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:19 crc kubenswrapper[4732]: I0131 09:02:19.748778 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:19 crc kubenswrapper[4732]: I0131 09:02:19.748796 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:19 crc kubenswrapper[4732]: I0131 09:02:19.748808 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:19Z","lastTransitionTime":"2026-01-31T09:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:19 crc kubenswrapper[4732]: I0131 09:02:19.852350 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:19 crc kubenswrapper[4732]: I0131 09:02:19.852403 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:19 crc kubenswrapper[4732]: I0131 09:02:19.852415 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:19 crc kubenswrapper[4732]: I0131 09:02:19.852435 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:19 crc kubenswrapper[4732]: I0131 09:02:19.852451 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:19Z","lastTransitionTime":"2026-01-31T09:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:19 crc kubenswrapper[4732]: I0131 09:02:19.955404 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:19 crc kubenswrapper[4732]: I0131 09:02:19.955454 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:19 crc kubenswrapper[4732]: I0131 09:02:19.955464 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:19 crc kubenswrapper[4732]: I0131 09:02:19.955483 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:19 crc kubenswrapper[4732]: I0131 09:02:19.955493 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:19Z","lastTransitionTime":"2026-01-31T09:02:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:20 crc kubenswrapper[4732]: I0131 09:02:20.058390 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:20 crc kubenswrapper[4732]: I0131 09:02:20.058457 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:20 crc kubenswrapper[4732]: I0131 09:02:20.058477 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:20 crc kubenswrapper[4732]: I0131 09:02:20.058498 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:20 crc kubenswrapper[4732]: I0131 09:02:20.058515 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:20Z","lastTransitionTime":"2026-01-31T09:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:20 crc kubenswrapper[4732]: I0131 09:02:20.160710 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:20 crc kubenswrapper[4732]: I0131 09:02:20.160761 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:20 crc kubenswrapper[4732]: I0131 09:02:20.160772 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:20 crc kubenswrapper[4732]: I0131 09:02:20.160790 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:20 crc kubenswrapper[4732]: I0131 09:02:20.160802 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:20Z","lastTransitionTime":"2026-01-31T09:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:20 crc kubenswrapper[4732]: I0131 09:02:20.263548 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:20 crc kubenswrapper[4732]: I0131 09:02:20.263599 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:20 crc kubenswrapper[4732]: I0131 09:02:20.263611 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:20 crc kubenswrapper[4732]: I0131 09:02:20.263629 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:20 crc kubenswrapper[4732]: I0131 09:02:20.263643 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:20Z","lastTransitionTime":"2026-01-31T09:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:20 crc kubenswrapper[4732]: I0131 09:02:20.366444 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:20 crc kubenswrapper[4732]: I0131 09:02:20.366505 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:20 crc kubenswrapper[4732]: I0131 09:02:20.366516 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:20 crc kubenswrapper[4732]: I0131 09:02:20.366536 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:20 crc kubenswrapper[4732]: I0131 09:02:20.366546 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:20Z","lastTransitionTime":"2026-01-31T09:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:20 crc kubenswrapper[4732]: I0131 09:02:20.469163 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:20 crc kubenswrapper[4732]: I0131 09:02:20.469211 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:20 crc kubenswrapper[4732]: I0131 09:02:20.469221 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:20 crc kubenswrapper[4732]: I0131 09:02:20.469239 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:20 crc kubenswrapper[4732]: I0131 09:02:20.469252 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:20Z","lastTransitionTime":"2026-01-31T09:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:20 crc kubenswrapper[4732]: I0131 09:02:20.532780 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-11 00:41:12.233807632 +0000 UTC Jan 31 09:02:20 crc kubenswrapper[4732]: I0131 09:02:20.572508 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:20 crc kubenswrapper[4732]: I0131 09:02:20.572765 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:20 crc kubenswrapper[4732]: I0131 09:02:20.572793 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:20 crc kubenswrapper[4732]: I0131 09:02:20.572827 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:20 crc kubenswrapper[4732]: I0131 09:02:20.572867 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:20Z","lastTransitionTime":"2026-01-31T09:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:20 crc kubenswrapper[4732]: I0131 09:02:20.676740 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:20 crc kubenswrapper[4732]: I0131 09:02:20.676785 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:20 crc kubenswrapper[4732]: I0131 09:02:20.676794 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:20 crc kubenswrapper[4732]: I0131 09:02:20.676809 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:20 crc kubenswrapper[4732]: I0131 09:02:20.676819 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:20Z","lastTransitionTime":"2026-01-31T09:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:20 crc kubenswrapper[4732]: I0131 09:02:20.780224 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:20 crc kubenswrapper[4732]: I0131 09:02:20.780340 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:20 crc kubenswrapper[4732]: I0131 09:02:20.780368 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:20 crc kubenswrapper[4732]: I0131 09:02:20.780404 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:20 crc kubenswrapper[4732]: I0131 09:02:20.780428 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:20Z","lastTransitionTime":"2026-01-31T09:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:20 crc kubenswrapper[4732]: I0131 09:02:20.884190 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:20 crc kubenswrapper[4732]: I0131 09:02:20.884266 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:20 crc kubenswrapper[4732]: I0131 09:02:20.884284 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:20 crc kubenswrapper[4732]: I0131 09:02:20.884311 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:20 crc kubenswrapper[4732]: I0131 09:02:20.884330 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:20Z","lastTransitionTime":"2026-01-31T09:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:20 crc kubenswrapper[4732]: I0131 09:02:20.987512 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:20 crc kubenswrapper[4732]: I0131 09:02:20.987553 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:20 crc kubenswrapper[4732]: I0131 09:02:20.987565 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:20 crc kubenswrapper[4732]: I0131 09:02:20.987584 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:20 crc kubenswrapper[4732]: I0131 09:02:20.987593 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:20Z","lastTransitionTime":"2026-01-31T09:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:21 crc kubenswrapper[4732]: I0131 09:02:21.090485 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:21 crc kubenswrapper[4732]: I0131 09:02:21.090532 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:21 crc kubenswrapper[4732]: I0131 09:02:21.090549 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:21 crc kubenswrapper[4732]: I0131 09:02:21.090570 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:21 crc kubenswrapper[4732]: I0131 09:02:21.090594 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:21Z","lastTransitionTime":"2026-01-31T09:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:21 crc kubenswrapper[4732]: I0131 09:02:21.194074 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:21 crc kubenswrapper[4732]: I0131 09:02:21.194121 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:21 crc kubenswrapper[4732]: I0131 09:02:21.194132 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:21 crc kubenswrapper[4732]: I0131 09:02:21.194155 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:21 crc kubenswrapper[4732]: I0131 09:02:21.194172 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:21Z","lastTransitionTime":"2026-01-31T09:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:21 crc kubenswrapper[4732]: I0131 09:02:21.296806 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:21 crc kubenswrapper[4732]: I0131 09:02:21.297216 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:21 crc kubenswrapper[4732]: I0131 09:02:21.297326 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:21 crc kubenswrapper[4732]: I0131 09:02:21.297406 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:21 crc kubenswrapper[4732]: I0131 09:02:21.297490 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:21Z","lastTransitionTime":"2026-01-31T09:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:21 crc kubenswrapper[4732]: I0131 09:02:21.390070 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:21 crc kubenswrapper[4732]: I0131 09:02:21.390357 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:21 crc kubenswrapper[4732]: I0131 09:02:21.390436 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:21 crc kubenswrapper[4732]: I0131 09:02:21.390569 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:21 crc kubenswrapper[4732]: I0131 09:02:21.390684 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:21Z","lastTransitionTime":"2026-01-31T09:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:21 crc kubenswrapper[4732]: E0131 09:02:21.408046 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:02:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:02:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:02:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:02:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:02:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:02:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:02:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:02:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2761c117-4c0c-4c53-891e-fc7b8fbd4017\\\",\\\"systemUUID\\\":\\\"5f9a5b0b-6336-4588-8df8-98fcbdc2a984\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:21Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:21 crc kubenswrapper[4732]: I0131 09:02:21.412740 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:21 crc kubenswrapper[4732]: I0131 09:02:21.412976 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 31 09:02:21 crc kubenswrapper[4732]: I0131 09:02:21.413048 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:21 crc kubenswrapper[4732]: I0131 09:02:21.413123 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:21 crc kubenswrapper[4732]: I0131 09:02:21.413195 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:21Z","lastTransitionTime":"2026-01-31T09:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:21 crc kubenswrapper[4732]: E0131 09:02:21.426538 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:02:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:02:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:02:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:02:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:02:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:02:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:02:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:02:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2761c117-4c0c-4c53-891e-fc7b8fbd4017\\\",\\\"systemUUID\\\":\\\"5f9a5b0b-6336-4588-8df8-98fcbdc2a984\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:21Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:21 crc kubenswrapper[4732]: I0131 09:02:21.431511 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:21 crc kubenswrapper[4732]: I0131 09:02:21.431569 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 31 09:02:21 crc kubenswrapper[4732]: I0131 09:02:21.431582 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:21 crc kubenswrapper[4732]: I0131 09:02:21.431605 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:21 crc kubenswrapper[4732]: I0131 09:02:21.431618 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:21Z","lastTransitionTime":"2026-01-31T09:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:21 crc kubenswrapper[4732]: E0131 09:02:21.445113 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:02:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:02:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:02:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:02:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:02:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:02:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:02:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:02:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2761c117-4c0c-4c53-891e-fc7b8fbd4017\\\",\\\"systemUUID\\\":\\\"5f9a5b0b-6336-4588-8df8-98fcbdc2a984\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:21Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:21 crc kubenswrapper[4732]: I0131 09:02:21.449181 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:21 crc kubenswrapper[4732]: I0131 09:02:21.449222 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 31 09:02:21 crc kubenswrapper[4732]: I0131 09:02:21.449234 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:21 crc kubenswrapper[4732]: I0131 09:02:21.449251 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:21 crc kubenswrapper[4732]: I0131 09:02:21.449262 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:21Z","lastTransitionTime":"2026-01-31T09:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:21 crc kubenswrapper[4732]: E0131 09:02:21.464062 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:02:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:02:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:02:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:02:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:02:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:02:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:02:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:02:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2761c117-4c0c-4c53-891e-fc7b8fbd4017\\\",\\\"systemUUID\\\":\\\"5f9a5b0b-6336-4588-8df8-98fcbdc2a984\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:21Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:21 crc kubenswrapper[4732]: I0131 09:02:21.468319 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:21 crc kubenswrapper[4732]: I0131 09:02:21.468369 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 31 09:02:21 crc kubenswrapper[4732]: I0131 09:02:21.468380 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:21 crc kubenswrapper[4732]: I0131 09:02:21.468398 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:21 crc kubenswrapper[4732]: I0131 09:02:21.468409 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:21Z","lastTransitionTime":"2026-01-31T09:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:21 crc kubenswrapper[4732]: E0131 09:02:21.481986 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:02:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:02:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:02:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:02:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:02:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:02:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:02:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T09:02:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2761c117-4c0c-4c53-891e-fc7b8fbd4017\\\",\\\"systemUUID\\\":\\\"5f9a5b0b-6336-4588-8df8-98fcbdc2a984\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:21Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:21 crc kubenswrapper[4732]: E0131 09:02:21.482414 4732 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 31 09:02:21 crc kubenswrapper[4732]: I0131 09:02:21.484487 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Jan 31 09:02:21 crc kubenswrapper[4732]: I0131 09:02:21.484647 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:21 crc kubenswrapper[4732]: I0131 09:02:21.484813 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:21 crc kubenswrapper[4732]: I0131 09:02:21.484944 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:21 crc kubenswrapper[4732]: I0131 09:02:21.485047 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:21Z","lastTransitionTime":"2026-01-31T09:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:21 crc kubenswrapper[4732]: I0131 09:02:21.532969 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 02:17:26.111811119 +0000 UTC Jan 31 09:02:21 crc kubenswrapper[4732]: I0131 09:02:21.542286 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:02:21 crc kubenswrapper[4732]: I0131 09:02:21.542305 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7fgvm" Jan 31 09:02:21 crc kubenswrapper[4732]: E0131 09:02:21.542446 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:02:21 crc kubenswrapper[4732]: E0131 09:02:21.542466 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7fgvm" podUID="3bd29a31-1a47-40da-afc5-6c4423067083" Jan 31 09:02:21 crc kubenswrapper[4732]: I0131 09:02:21.542994 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:02:21 crc kubenswrapper[4732]: I0131 09:02:21.543054 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:02:21 crc kubenswrapper[4732]: E0131 09:02:21.543415 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:02:21 crc kubenswrapper[4732]: E0131 09:02:21.543524 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:02:21 crc kubenswrapper[4732]: I0131 09:02:21.588056 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:21 crc kubenswrapper[4732]: I0131 09:02:21.588101 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:21 crc kubenswrapper[4732]: I0131 09:02:21.588111 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:21 crc kubenswrapper[4732]: I0131 09:02:21.588127 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:21 crc kubenswrapper[4732]: I0131 09:02:21.588138 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:21Z","lastTransitionTime":"2026-01-31T09:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:21 crc kubenswrapper[4732]: I0131 09:02:21.690777 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:21 crc kubenswrapper[4732]: I0131 09:02:21.690829 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:21 crc kubenswrapper[4732]: I0131 09:02:21.690844 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:21 crc kubenswrapper[4732]: I0131 09:02:21.690863 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:21 crc kubenswrapper[4732]: I0131 09:02:21.690886 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:21Z","lastTransitionTime":"2026-01-31T09:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:21 crc kubenswrapper[4732]: I0131 09:02:21.793635 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:21 crc kubenswrapper[4732]: I0131 09:02:21.793699 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:21 crc kubenswrapper[4732]: I0131 09:02:21.793713 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:21 crc kubenswrapper[4732]: I0131 09:02:21.793731 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:21 crc kubenswrapper[4732]: I0131 09:02:21.793743 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:21Z","lastTransitionTime":"2026-01-31T09:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:21 crc kubenswrapper[4732]: I0131 09:02:21.896898 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:21 crc kubenswrapper[4732]: I0131 09:02:21.896943 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:21 crc kubenswrapper[4732]: I0131 09:02:21.896954 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:21 crc kubenswrapper[4732]: I0131 09:02:21.896970 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:21 crc kubenswrapper[4732]: I0131 09:02:21.896985 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:21Z","lastTransitionTime":"2026-01-31T09:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:21 crc kubenswrapper[4732]: I0131 09:02:21.999378 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:21 crc kubenswrapper[4732]: I0131 09:02:21.999435 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:21 crc kubenswrapper[4732]: I0131 09:02:21.999451 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:21 crc kubenswrapper[4732]: I0131 09:02:21.999473 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:21 crc kubenswrapper[4732]: I0131 09:02:21.999489 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:21Z","lastTransitionTime":"2026-01-31T09:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:22 crc kubenswrapper[4732]: I0131 09:02:22.104045 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:22 crc kubenswrapper[4732]: I0131 09:02:22.104409 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:22 crc kubenswrapper[4732]: I0131 09:02:22.104492 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:22 crc kubenswrapper[4732]: I0131 09:02:22.104568 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:22 crc kubenswrapper[4732]: I0131 09:02:22.104648 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:22Z","lastTransitionTime":"2026-01-31T09:02:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:22 crc kubenswrapper[4732]: I0131 09:02:22.207338 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:22 crc kubenswrapper[4732]: I0131 09:02:22.207383 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:22 crc kubenswrapper[4732]: I0131 09:02:22.207395 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:22 crc kubenswrapper[4732]: I0131 09:02:22.207412 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:22 crc kubenswrapper[4732]: I0131 09:02:22.207423 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:22Z","lastTransitionTime":"2026-01-31T09:02:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:22 crc kubenswrapper[4732]: I0131 09:02:22.309399 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:22 crc kubenswrapper[4732]: I0131 09:02:22.309442 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:22 crc kubenswrapper[4732]: I0131 09:02:22.309457 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:22 crc kubenswrapper[4732]: I0131 09:02:22.309476 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:22 crc kubenswrapper[4732]: I0131 09:02:22.309490 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:22Z","lastTransitionTime":"2026-01-31T09:02:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:22 crc kubenswrapper[4732]: I0131 09:02:22.412080 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:22 crc kubenswrapper[4732]: I0131 09:02:22.412127 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:22 crc kubenswrapper[4732]: I0131 09:02:22.412137 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:22 crc kubenswrapper[4732]: I0131 09:02:22.412154 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:22 crc kubenswrapper[4732]: I0131 09:02:22.412166 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:22Z","lastTransitionTime":"2026-01-31T09:02:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:22 crc kubenswrapper[4732]: I0131 09:02:22.514820 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:22 crc kubenswrapper[4732]: I0131 09:02:22.514848 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:22 crc kubenswrapper[4732]: I0131 09:02:22.514856 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:22 crc kubenswrapper[4732]: I0131 09:02:22.514871 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:22 crc kubenswrapper[4732]: I0131 09:02:22.514880 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:22Z","lastTransitionTime":"2026-01-31T09:02:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:22 crc kubenswrapper[4732]: I0131 09:02:22.533403 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 09:59:09.923233766 +0000 UTC Jan 31 09:02:22 crc kubenswrapper[4732]: I0131 09:02:22.554863 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c063f211d6d06c3f8b94d316d68da346d077c6e85197754ec7c77ad736fd1217\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:22Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:22 crc kubenswrapper[4732]: I0131 09:02:22.566014 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d790207-d357-4b47-87bf-5b505e061820\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eaa2b856e1dedd7398d0c513f742fa5925c4748bf44eb15022adbb4b62354c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2bw2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad1f906b34fddd34efd9d099479cd1fe7404da4ab11f3f571f9e3120167505a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2bw2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jnbt8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:22Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:22 crc kubenswrapper[4732]: I0131 09:02:22.577987 4732 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gchqk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0313609d-3507-4db5-a190-9dbf59d73e6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b303be38f64266f4e019d30e8b988945133fc76a47e305429ed048d68cdeac76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-phqth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8a2d9e1c8db61a2418e981340a7cb999983d4a10e79977507abdf3bdb471939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-phqth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gchqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:22Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:22 crc kubenswrapper[4732]: I0131 09:02:22.596860 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1add5e75-7e0e-4248-9431-256dc477beeb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7bd726ccae341dfff351d4170276b74d4b6b4f34806b7e1fb23ab517201928d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd0e75b86e540472dde92c0b2c29d3af05e052a4e6a01fd03cd16cce663bd5da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6678cf179a4e13991aa6a082456b3497a4665e0999912b8002fb84a0b6ca2ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55d09d44253b8c77894abc55d80843b79509bf7213c0bf8b4d69059d53e76e5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:22Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:22 crc kubenswrapper[4732]: I0131 09:02:22.607265 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ec8fd54-25d6-41f1-9a8b-0e2823c951b7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87eb9563b2ca7420cbb70ccd32aa77f2a82c769d57eceacd99a4d94cb1c3a0d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://328c8ed55e178646a8bd3d914985f8171b6413b7b007f3bca609cfb432a227f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://328c8ed55e178646a8bd3d914985f8171b6413b7b007f3bca609cfb432a227f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:22Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:22 crc kubenswrapper[4732]: I0131 09:02:22.617976 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:22 crc kubenswrapper[4732]: I0131 09:02:22.618014 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 31 09:02:22 crc kubenswrapper[4732]: I0131 09:02:22.618026 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:22 crc kubenswrapper[4732]: I0131 09:02:22.618046 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:22 crc kubenswrapper[4732]: I0131 09:02:22.618059 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:22Z","lastTransitionTime":"2026-01-31T09:02:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:22 crc kubenswrapper[4732]: I0131 09:02:22.621147 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bb621da-40b8-4a07-a7bd-06800007bc59\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dadc6987ef6ef2593fa8aa0ec3903dc3cfea907f73fb68dca2c141362577bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e897d26ac7b103c21a9cb176f1a90a11098a9a2f6a4cd28f697a90ee7c9f6f9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"nam
e\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a31769db7f40442ff410b053055e413b11d7dba7d48dfce853ca38e7b9f7595e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://407516efcd2436b964fddea3bdc778826cde289422139bc4577c9ba8c0c43675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://407516efcd2436b964fddea3bdc778826cde289422139bc4577c9ba8c0c43675\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:22Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:22 crc kubenswrapper[4732]: I0131 09:02:22.644208 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58b600ed-9e96-4296-bc41-dda5d2205921\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f14a98662959fe8132486d2f648fb443a407e90160ee8c6eac7e2a3d6835b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://771af18934041722ccdbde47137c83e9d93c5e7a07b1e9b26e18354bcd2fda4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2bea5befd94ee4d3b6501273e8b8365102a35cc07b97740fc893f8d70a9d134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9fd5264b5a2dcec6f8b363208d5654f2d71996
60f04be408ee7ab3913d542e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://194a43d5e1409abb031f1b950dfc753454b45008067d784642945ea0dfb087e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://382f1fa043705a0098e27ea34beeee0b88276aa355a7de07bc29b070949b81e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382f1fa043705a0098e27ea34beeee0b88276aa355a7de07bc29b070949b81e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b16d2732d6d6381045b88446a322622a7b1271f08438cb60e57c05c52ef7faf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b16d2732d6d6381045b88446a322622a7b1271f08438cb60e57c05c52ef7faf8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://273f5652d37e2be568bc388a7f9e96c66d0ce9201678b46736986815bd25d6bf\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://273f5652d37e2be568bc388a7f9e96c66d0ce9201678b46736986815bd25d6bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:22Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:22 crc kubenswrapper[4732]: I0131 09:02:22.660764 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62650e77e5971d617a77ae7461d51276fc76e3699dc4f86ebbb9b63ab0d86b60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:22Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:22 crc kubenswrapper[4732]: I0131 09:02:22.677613 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7423f7d2a82ae276cff0f5a2730b0de7abbc5b28a8bd8983d23fe44c2e9ca3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b075971e7f28f40b90caae8b3681dd283860371a6b3f82c92c421a87a27ccb0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-01-31T09:02:22Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:22 crc kubenswrapper[4732]: I0131 09:02:22.690274 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:22Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:22 crc kubenswrapper[4732]: I0131 09:02:22.701584 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:22Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:22 crc kubenswrapper[4732]: I0131 09:02:22.713898 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nsgpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"533741c8-f72a-4834-ad02-d33fc939e529\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec4f6f1a4ba8fc3975dc3ef7e86d5b17b0488de7918fe672040f115f640b8569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjff8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.
11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nsgpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:22Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:22 crc kubenswrapper[4732]: I0131 09:02:22.720451 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:22 crc kubenswrapper[4732]: I0131 09:02:22.720487 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:22 crc kubenswrapper[4732]: I0131 09:02:22.720498 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:22 crc kubenswrapper[4732]: I0131 09:02:22.720515 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:22 crc kubenswrapper[4732]: I0131 09:02:22.720527 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:22Z","lastTransitionTime":"2026-01-31T09:02:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:22 crc kubenswrapper[4732]: I0131 09:02:22.727612 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bllbs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80d87332-eaea-4007-a03e-a9a0f744563a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a371d8cbedbc18b4953520c3b135e102db92fc57883d31a8f78c510498ca41b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcdpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bllbs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:22Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:22 crc kubenswrapper[4732]: I0131 09:02:22.754819 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8521ee1e84f540faa7b093ecd0bb7761ae38487a6c7abf52028fe820ea9a4c1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fde05e46badc11b2174b88e2590402436bb669bcbd017ceb4d11d25066b421b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b26417bd7817c984a59723e2231e3a43c554022aef67eacf300adee0166c3db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c79f404e5bc3d6dc83681d68605b1f26565813ccec34c6d36ee9f21686ca4f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://075418297d972e23003b3ce2016f7709d138316c9d08837dc484dadf4d7f4a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://792e702a9fa66ea68366e8f88d976f7e280223815c4b70258566c32faaed57a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da2d4b479ec06e475664a1f5f1c4c052f87ad374f5616ea477c7a147ca896a16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da2d4b479ec06e475664a1f5f1c4c052f87ad374f5616ea477c7a147ca896a16\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:01:56Z\\\",\\\"message\\\":\\\"31T09:01:56Z is after 2025-08-24T17:21:41Z]\\\\nI0131 09:01:56.451406 6413 services_controller.go:451] Built service openshift-service-ca-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-service-ca-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-service-ca-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.40\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0131 09:01:56.451418 6413 services_controller.go:452] Built service openshift-operator-lifecycle-manager/package-server-manager-metrics per-node LB for network=default: []services.LB{}\\\\nI0131 09:01:56.451426 6413 services_controller.go:452] Built service openshift-service-ca-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI0131 09:01:56.451433 6413 services_controller.go:453] Built service openshift-service-ca-operator/metrics template LB for network=default: []services.LB{}\\\\nI0131 09:01:56.451434 6413 ser\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-8mtkt_openshift-ovn-kubernetes(82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cd519944f0ae0ee9920a6a54de3dd40c73a65d0c907f625c57d7648a6d0a715\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb08df78497d5ae0eb39af9a06378ea80914d9b4d7014dd85d930f54c733193\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfb08df78497d5ae0eb39af9a06378ea80914d9b4d7014dd85d930f54c733193\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8mtkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:22Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:22 crc kubenswrapper[4732]: I0131 09:02:22.771810 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1dc686f-fa2d-4cd4-95d4-874df2dc3a1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b5cbd45e357205f8bfc648396c0f684d6788343f687504feb45618d9082e5cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b8ba07937855cb1fd0e1db567f90e08848fca583b7174153918e2d6356d722\\\",\\\"i
mage\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ce42a35726dd1fc6aea0eb2e5baa490de04566cd6d6298185260e896a8def7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c276b6fdf2c47bfa38c4cf312bf6b893388c4291d306738fd6c31738cbd7483b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c05af9bd69451273c25ddddf6b11cdd7b8f9b6e3e16d18f1c4a962321905696e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0131 09:01:18.226954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:01:18.227544 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1858903392/tls.crt::/tmp/serving-cert-1858903392/tls.key\\\\\\\"\\\\nI0131 09:01:23.404020 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:01:23.413560 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:01:23.413612 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:01:23.413655 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:01:23.413760 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:01:23.423398 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 09:01:23.423434 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:01:23.423441 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:01:23.423448 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:01:23.423452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0131 09:01:23.423448 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 09:01:23.423457 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:01:23.423476 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 09:01:23.426339 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83c2048db4772c1a5f60b21340f773ab5c651045d4dac9edc15e91a19d15dce9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:22Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:22 crc kubenswrapper[4732]: I0131 09:02:22.783810 4732 status_manager.go:875] "Failed to update status 
for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:22Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:22 crc kubenswrapper[4732]: I0131 09:02:22.795596 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4mxsr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e23192f-14db-41ef-af89-4a76e325d9c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://456969ec8d447fd7f9acd1803b222ebfb16b02c8dee1959dd936eec29aa1d617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6af50b44afec0e5f0757efda1ff5567845c4482609a82bafcbf74947c657c56\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:02:15Z\\\",\\\"message\\\":\\\"2026-01-31T09:01:29+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_648046fe-98ce-4b13-a3af-42227ab719e4\\\\n2026-01-31T09:01:29+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_648046fe-98ce-4b13-a3af-42227ab719e4 to /host/opt/cni/bin/\\\\n2026-01-31T09:01:30Z [verbose] multus-daemon started\\\\n2026-01-31T09:01:30Z [verbose] Readiness Indicator file check\\\\n2026-01-31T09:02:15Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsnx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4mxsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:22Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:22 crc kubenswrapper[4732]: I0131 09:02:22.811767 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t9kqf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2e6e0f4-2302-447f-a5e0-7db3d7b73cb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://038ff30d8a53b94c204f5dc0c72824fcb7f08a423eec2ce05554f23607d4dc7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c71dc89758bff193a85ceeaa97b21ba6c93389ef9cf380bd0bac89bddddf014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c71dc89758bff193a85ceeaa97b21ba6c93389ef9cf380bd0bac89bddddf014\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4374e8014d4e67ab8df14785ed6d1bd78b33567550758297fcc04796ea5c688\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4374e8014d4e67ab8df14785ed6d1bd78b33567550758297fcc04796ea5c688\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85484d53f8180c3bc83391004e8a7c6b432af09f04f597567e8ffe0b9c2e3d58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85484d53f8180c3bc83391004e8a7c6b432af09f04f597567e8ffe0b9c2e3d58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://111b50c40cf72f105ae5c74d668673853a783050a62e89127fa666a6f8d431d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://111b50c40cf72f105ae5c74d668673853a783050a62e89127fa666a6f8d431d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2458247a5e615b929a15e280bb9d5a65c46a991ea2cd83502a8c42b1ca0e299b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2458247a5e615b929a15e280bb9d5a65c46a991ea2cd83502a8c42b1ca0e299b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a6ff3cd5564994b28608dad1cd0848c771db950208888ac8a2213fa5dc3a168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a6ff3cd5564994b28608dad1cd0848c771db950208888ac8a2213fa5dc3a168\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t9kqf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:22Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:22 crc kubenswrapper[4732]: I0131 09:02:22.821523 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7fgvm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bd29a31-1a47-40da-afc5-6c4423067083\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47jtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47jtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7fgvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:22Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:22 crc kubenswrapper[4732]: I0131 09:02:22.822796 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:22 crc kubenswrapper[4732]: I0131 09:02:22.822914 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:22 crc kubenswrapper[4732]: I0131 09:02:22.822992 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Jan 31 09:02:22 crc kubenswrapper[4732]: I0131 09:02:22.823098 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:22 crc kubenswrapper[4732]: I0131 09:02:22.823173 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:22Z","lastTransitionTime":"2026-01-31T09:02:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:22 crc kubenswrapper[4732]: I0131 09:02:22.924792 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:22 crc kubenswrapper[4732]: I0131 09:02:22.924848 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:22 crc kubenswrapper[4732]: I0131 09:02:22.924864 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:22 crc kubenswrapper[4732]: I0131 09:02:22.924888 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:22 crc kubenswrapper[4732]: I0131 09:02:22.924906 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:22Z","lastTransitionTime":"2026-01-31T09:02:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:23 crc kubenswrapper[4732]: I0131 09:02:23.027123 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:23 crc kubenswrapper[4732]: I0131 09:02:23.027585 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:23 crc kubenswrapper[4732]: I0131 09:02:23.027694 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:23 crc kubenswrapper[4732]: I0131 09:02:23.027804 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:23 crc kubenswrapper[4732]: I0131 09:02:23.027898 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:23Z","lastTransitionTime":"2026-01-31T09:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:23 crc kubenswrapper[4732]: I0131 09:02:23.130844 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:23 crc kubenswrapper[4732]: I0131 09:02:23.130895 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:23 crc kubenswrapper[4732]: I0131 09:02:23.130907 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:23 crc kubenswrapper[4732]: I0131 09:02:23.130924 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:23 crc kubenswrapper[4732]: I0131 09:02:23.130937 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:23Z","lastTransitionTime":"2026-01-31T09:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:23 crc kubenswrapper[4732]: I0131 09:02:23.233909 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:23 crc kubenswrapper[4732]: I0131 09:02:23.233962 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:23 crc kubenswrapper[4732]: I0131 09:02:23.233975 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:23 crc kubenswrapper[4732]: I0131 09:02:23.233994 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:23 crc kubenswrapper[4732]: I0131 09:02:23.234007 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:23Z","lastTransitionTime":"2026-01-31T09:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:23 crc kubenswrapper[4732]: I0131 09:02:23.338236 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:23 crc kubenswrapper[4732]: I0131 09:02:23.338281 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:23 crc kubenswrapper[4732]: I0131 09:02:23.338293 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:23 crc kubenswrapper[4732]: I0131 09:02:23.338312 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:23 crc kubenswrapper[4732]: I0131 09:02:23.338325 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:23Z","lastTransitionTime":"2026-01-31T09:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:23 crc kubenswrapper[4732]: I0131 09:02:23.441693 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:23 crc kubenswrapper[4732]: I0131 09:02:23.441762 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:23 crc kubenswrapper[4732]: I0131 09:02:23.441784 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:23 crc kubenswrapper[4732]: I0131 09:02:23.441811 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:23 crc kubenswrapper[4732]: I0131 09:02:23.441830 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:23Z","lastTransitionTime":"2026-01-31T09:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:23 crc kubenswrapper[4732]: I0131 09:02:23.533517 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 15:11:52.917904119 +0000 UTC Jan 31 09:02:23 crc kubenswrapper[4732]: I0131 09:02:23.541939 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7fgvm" Jan 31 09:02:23 crc kubenswrapper[4732]: I0131 09:02:23.541985 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:02:23 crc kubenswrapper[4732]: I0131 09:02:23.542152 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:02:23 crc kubenswrapper[4732]: E0131 09:02:23.542390 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7fgvm" podUID="3bd29a31-1a47-40da-afc5-6c4423067083" Jan 31 09:02:23 crc kubenswrapper[4732]: I0131 09:02:23.542448 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:02:23 crc kubenswrapper[4732]: E0131 09:02:23.542625 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:02:23 crc kubenswrapper[4732]: E0131 09:02:23.542826 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:02:23 crc kubenswrapper[4732]: E0131 09:02:23.542947 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:02:23 crc kubenswrapper[4732]: I0131 09:02:23.549466 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:23 crc kubenswrapper[4732]: I0131 09:02:23.549510 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:23 crc kubenswrapper[4732]: I0131 09:02:23.549521 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:23 crc kubenswrapper[4732]: I0131 09:02:23.549541 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:23 crc kubenswrapper[4732]: I0131 09:02:23.549554 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:23Z","lastTransitionTime":"2026-01-31T09:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:23 crc kubenswrapper[4732]: I0131 09:02:23.652238 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:23 crc kubenswrapper[4732]: I0131 09:02:23.652312 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:23 crc kubenswrapper[4732]: I0131 09:02:23.652337 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:23 crc kubenswrapper[4732]: I0131 09:02:23.652369 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:23 crc kubenswrapper[4732]: I0131 09:02:23.652391 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:23Z","lastTransitionTime":"2026-01-31T09:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:23 crc kubenswrapper[4732]: I0131 09:02:23.755078 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:23 crc kubenswrapper[4732]: I0131 09:02:23.755117 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:23 crc kubenswrapper[4732]: I0131 09:02:23.755130 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:23 crc kubenswrapper[4732]: I0131 09:02:23.755146 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:23 crc kubenswrapper[4732]: I0131 09:02:23.755157 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:23Z","lastTransitionTime":"2026-01-31T09:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:23 crc kubenswrapper[4732]: I0131 09:02:23.858232 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:23 crc kubenswrapper[4732]: I0131 09:02:23.858277 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:23 crc kubenswrapper[4732]: I0131 09:02:23.858289 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:23 crc kubenswrapper[4732]: I0131 09:02:23.858308 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:23 crc kubenswrapper[4732]: I0131 09:02:23.858322 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:23Z","lastTransitionTime":"2026-01-31T09:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:23 crc kubenswrapper[4732]: I0131 09:02:23.960973 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:23 crc kubenswrapper[4732]: I0131 09:02:23.961000 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:23 crc kubenswrapper[4732]: I0131 09:02:23.961008 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:23 crc kubenswrapper[4732]: I0131 09:02:23.961022 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:23 crc kubenswrapper[4732]: I0131 09:02:23.961031 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:23Z","lastTransitionTime":"2026-01-31T09:02:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:24 crc kubenswrapper[4732]: I0131 09:02:24.063326 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:24 crc kubenswrapper[4732]: I0131 09:02:24.063367 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:24 crc kubenswrapper[4732]: I0131 09:02:24.063376 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:24 crc kubenswrapper[4732]: I0131 09:02:24.063392 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:24 crc kubenswrapper[4732]: I0131 09:02:24.063402 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:24Z","lastTransitionTime":"2026-01-31T09:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:24 crc kubenswrapper[4732]: I0131 09:02:24.165748 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:24 crc kubenswrapper[4732]: I0131 09:02:24.165777 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:24 crc kubenswrapper[4732]: I0131 09:02:24.165785 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:24 crc kubenswrapper[4732]: I0131 09:02:24.165820 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:24 crc kubenswrapper[4732]: I0131 09:02:24.165833 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:24Z","lastTransitionTime":"2026-01-31T09:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:24 crc kubenswrapper[4732]: I0131 09:02:24.268835 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:24 crc kubenswrapper[4732]: I0131 09:02:24.268902 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:24 crc kubenswrapper[4732]: I0131 09:02:24.268926 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:24 crc kubenswrapper[4732]: I0131 09:02:24.268957 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:24 crc kubenswrapper[4732]: I0131 09:02:24.268981 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:24Z","lastTransitionTime":"2026-01-31T09:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:24 crc kubenswrapper[4732]: I0131 09:02:24.372598 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:24 crc kubenswrapper[4732]: I0131 09:02:24.372695 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:24 crc kubenswrapper[4732]: I0131 09:02:24.372821 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:24 crc kubenswrapper[4732]: I0131 09:02:24.372847 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:24 crc kubenswrapper[4732]: I0131 09:02:24.372867 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:24Z","lastTransitionTime":"2026-01-31T09:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:24 crc kubenswrapper[4732]: I0131 09:02:24.475610 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:24 crc kubenswrapper[4732]: I0131 09:02:24.475653 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:24 crc kubenswrapper[4732]: I0131 09:02:24.475695 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:24 crc kubenswrapper[4732]: I0131 09:02:24.475718 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:24 crc kubenswrapper[4732]: I0131 09:02:24.475733 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:24Z","lastTransitionTime":"2026-01-31T09:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:24 crc kubenswrapper[4732]: I0131 09:02:24.534438 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-08 08:57:44.824440524 +0000 UTC Jan 31 09:02:24 crc kubenswrapper[4732]: I0131 09:02:24.578777 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:24 crc kubenswrapper[4732]: I0131 09:02:24.578820 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:24 crc kubenswrapper[4732]: I0131 09:02:24.578830 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:24 crc kubenswrapper[4732]: I0131 09:02:24.578846 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:24 crc kubenswrapper[4732]: I0131 09:02:24.578855 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:24Z","lastTransitionTime":"2026-01-31T09:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:24 crc kubenswrapper[4732]: I0131 09:02:24.682428 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:24 crc kubenswrapper[4732]: I0131 09:02:24.682517 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:24 crc kubenswrapper[4732]: I0131 09:02:24.682533 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:24 crc kubenswrapper[4732]: I0131 09:02:24.682557 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:24 crc kubenswrapper[4732]: I0131 09:02:24.682573 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:24Z","lastTransitionTime":"2026-01-31T09:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:24 crc kubenswrapper[4732]: I0131 09:02:24.785130 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:24 crc kubenswrapper[4732]: I0131 09:02:24.785195 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:24 crc kubenswrapper[4732]: I0131 09:02:24.785208 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:24 crc kubenswrapper[4732]: I0131 09:02:24.785227 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:24 crc kubenswrapper[4732]: I0131 09:02:24.785241 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:24Z","lastTransitionTime":"2026-01-31T09:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:24 crc kubenswrapper[4732]: I0131 09:02:24.888342 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:24 crc kubenswrapper[4732]: I0131 09:02:24.888385 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:24 crc kubenswrapper[4732]: I0131 09:02:24.888396 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:24 crc kubenswrapper[4732]: I0131 09:02:24.888414 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:24 crc kubenswrapper[4732]: I0131 09:02:24.888426 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:24Z","lastTransitionTime":"2026-01-31T09:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:24 crc kubenswrapper[4732]: I0131 09:02:24.991250 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:24 crc kubenswrapper[4732]: I0131 09:02:24.991310 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:24 crc kubenswrapper[4732]: I0131 09:02:24.991325 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:24 crc kubenswrapper[4732]: I0131 09:02:24.991348 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:24 crc kubenswrapper[4732]: I0131 09:02:24.991361 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:24Z","lastTransitionTime":"2026-01-31T09:02:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:25 crc kubenswrapper[4732]: I0131 09:02:25.093988 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:25 crc kubenswrapper[4732]: I0131 09:02:25.094048 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:25 crc kubenswrapper[4732]: I0131 09:02:25.094061 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:25 crc kubenswrapper[4732]: I0131 09:02:25.094080 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:25 crc kubenswrapper[4732]: I0131 09:02:25.094092 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:25Z","lastTransitionTime":"2026-01-31T09:02:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:25 crc kubenswrapper[4732]: I0131 09:02:25.197660 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:25 crc kubenswrapper[4732]: I0131 09:02:25.197727 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:25 crc kubenswrapper[4732]: I0131 09:02:25.197737 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:25 crc kubenswrapper[4732]: I0131 09:02:25.197756 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:25 crc kubenswrapper[4732]: I0131 09:02:25.197767 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:25Z","lastTransitionTime":"2026-01-31T09:02:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:25 crc kubenswrapper[4732]: I0131 09:02:25.300325 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:25 crc kubenswrapper[4732]: I0131 09:02:25.300409 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:25 crc kubenswrapper[4732]: I0131 09:02:25.300421 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:25 crc kubenswrapper[4732]: I0131 09:02:25.300439 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:25 crc kubenswrapper[4732]: I0131 09:02:25.300451 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:25Z","lastTransitionTime":"2026-01-31T09:02:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:25 crc kubenswrapper[4732]: I0131 09:02:25.402462 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:25 crc kubenswrapper[4732]: I0131 09:02:25.402499 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:25 crc kubenswrapper[4732]: I0131 09:02:25.402509 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:25 crc kubenswrapper[4732]: I0131 09:02:25.402523 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:25 crc kubenswrapper[4732]: I0131 09:02:25.402532 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:25Z","lastTransitionTime":"2026-01-31T09:02:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:25 crc kubenswrapper[4732]: I0131 09:02:25.505003 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:25 crc kubenswrapper[4732]: I0131 09:02:25.505076 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:25 crc kubenswrapper[4732]: I0131 09:02:25.505099 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:25 crc kubenswrapper[4732]: I0131 09:02:25.505130 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:25 crc kubenswrapper[4732]: I0131 09:02:25.505151 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:25Z","lastTransitionTime":"2026-01-31T09:02:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:25 crc kubenswrapper[4732]: I0131 09:02:25.534832 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 20:42:18.0853413 +0000 UTC Jan 31 09:02:25 crc kubenswrapper[4732]: I0131 09:02:25.542162 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:02:25 crc kubenswrapper[4732]: I0131 09:02:25.542201 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:02:25 crc kubenswrapper[4732]: I0131 09:02:25.542258 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:02:25 crc kubenswrapper[4732]: I0131 09:02:25.542169 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7fgvm" Jan 31 09:02:25 crc kubenswrapper[4732]: E0131 09:02:25.542353 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:02:25 crc kubenswrapper[4732]: E0131 09:02:25.542499 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7fgvm" podUID="3bd29a31-1a47-40da-afc5-6c4423067083" Jan 31 09:02:25 crc kubenswrapper[4732]: E0131 09:02:25.542612 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:02:25 crc kubenswrapper[4732]: E0131 09:02:25.542780 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:02:25 crc kubenswrapper[4732]: I0131 09:02:25.608094 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:25 crc kubenswrapper[4732]: I0131 09:02:25.608139 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:25 crc kubenswrapper[4732]: I0131 09:02:25.608150 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:25 crc kubenswrapper[4732]: I0131 09:02:25.608167 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:25 crc kubenswrapper[4732]: I0131 09:02:25.608178 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:25Z","lastTransitionTime":"2026-01-31T09:02:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:25 crc kubenswrapper[4732]: I0131 09:02:25.711486 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:25 crc kubenswrapper[4732]: I0131 09:02:25.711545 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:25 crc kubenswrapper[4732]: I0131 09:02:25.711559 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:25 crc kubenswrapper[4732]: I0131 09:02:25.711582 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:25 crc kubenswrapper[4732]: I0131 09:02:25.711596 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:25Z","lastTransitionTime":"2026-01-31T09:02:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:25 crc kubenswrapper[4732]: I0131 09:02:25.813968 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:25 crc kubenswrapper[4732]: I0131 09:02:25.814011 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:25 crc kubenswrapper[4732]: I0131 09:02:25.814022 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:25 crc kubenswrapper[4732]: I0131 09:02:25.814039 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:25 crc kubenswrapper[4732]: I0131 09:02:25.814051 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:25Z","lastTransitionTime":"2026-01-31T09:02:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:25 crc kubenswrapper[4732]: I0131 09:02:25.917567 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:25 crc kubenswrapper[4732]: I0131 09:02:25.917690 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:25 crc kubenswrapper[4732]: I0131 09:02:25.917716 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:25 crc kubenswrapper[4732]: I0131 09:02:25.917747 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:25 crc kubenswrapper[4732]: I0131 09:02:25.917764 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:25Z","lastTransitionTime":"2026-01-31T09:02:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:26 crc kubenswrapper[4732]: I0131 09:02:26.019487 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:26 crc kubenswrapper[4732]: I0131 09:02:26.019532 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:26 crc kubenswrapper[4732]: I0131 09:02:26.019544 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:26 crc kubenswrapper[4732]: I0131 09:02:26.019574 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:26 crc kubenswrapper[4732]: I0131 09:02:26.019587 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:26Z","lastTransitionTime":"2026-01-31T09:02:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:26 crc kubenswrapper[4732]: I0131 09:02:26.122406 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:26 crc kubenswrapper[4732]: I0131 09:02:26.122484 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:26 crc kubenswrapper[4732]: I0131 09:02:26.122510 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:26 crc kubenswrapper[4732]: I0131 09:02:26.122542 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:26 crc kubenswrapper[4732]: I0131 09:02:26.122630 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:26Z","lastTransitionTime":"2026-01-31T09:02:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:26 crc kubenswrapper[4732]: I0131 09:02:26.229067 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:26 crc kubenswrapper[4732]: I0131 09:02:26.229109 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:26 crc kubenswrapper[4732]: I0131 09:02:26.229117 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:26 crc kubenswrapper[4732]: I0131 09:02:26.229131 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:26 crc kubenswrapper[4732]: I0131 09:02:26.229141 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:26Z","lastTransitionTime":"2026-01-31T09:02:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:26 crc kubenswrapper[4732]: I0131 09:02:26.332358 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:26 crc kubenswrapper[4732]: I0131 09:02:26.332430 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:26 crc kubenswrapper[4732]: I0131 09:02:26.332450 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:26 crc kubenswrapper[4732]: I0131 09:02:26.332478 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:26 crc kubenswrapper[4732]: I0131 09:02:26.332498 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:26Z","lastTransitionTime":"2026-01-31T09:02:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:26 crc kubenswrapper[4732]: I0131 09:02:26.434784 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:26 crc kubenswrapper[4732]: I0131 09:02:26.434842 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:26 crc kubenswrapper[4732]: I0131 09:02:26.434858 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:26 crc kubenswrapper[4732]: I0131 09:02:26.434881 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:26 crc kubenswrapper[4732]: I0131 09:02:26.434896 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:26Z","lastTransitionTime":"2026-01-31T09:02:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:26 crc kubenswrapper[4732]: I0131 09:02:26.535860 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 07:02:56.16417558 +0000 UTC Jan 31 09:02:26 crc kubenswrapper[4732]: I0131 09:02:26.538686 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:26 crc kubenswrapper[4732]: I0131 09:02:26.538733 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:26 crc kubenswrapper[4732]: I0131 09:02:26.538747 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:26 crc kubenswrapper[4732]: I0131 09:02:26.538766 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:26 crc kubenswrapper[4732]: I0131 09:02:26.538778 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:26Z","lastTransitionTime":"2026-01-31T09:02:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:26 crc kubenswrapper[4732]: I0131 09:02:26.543896 4732 scope.go:117] "RemoveContainer" containerID="da2d4b479ec06e475664a1f5f1c4c052f87ad374f5616ea477c7a147ca896a16" Jan 31 09:02:26 crc kubenswrapper[4732]: I0131 09:02:26.643384 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:26 crc kubenswrapper[4732]: I0131 09:02:26.643901 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:26 crc kubenswrapper[4732]: I0131 09:02:26.643921 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:26 crc kubenswrapper[4732]: I0131 09:02:26.643947 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:26 crc kubenswrapper[4732]: I0131 09:02:26.643963 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:26Z","lastTransitionTime":"2026-01-31T09:02:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:26 crc kubenswrapper[4732]: I0131 09:02:26.747375 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:26 crc kubenswrapper[4732]: I0131 09:02:26.747442 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:26 crc kubenswrapper[4732]: I0131 09:02:26.747459 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:26 crc kubenswrapper[4732]: I0131 09:02:26.747484 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:26 crc kubenswrapper[4732]: I0131 09:02:26.747503 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:26Z","lastTransitionTime":"2026-01-31T09:02:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:26 crc kubenswrapper[4732]: I0131 09:02:26.850982 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:26 crc kubenswrapper[4732]: I0131 09:02:26.851024 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:26 crc kubenswrapper[4732]: I0131 09:02:26.851041 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:26 crc kubenswrapper[4732]: I0131 09:02:26.851064 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:26 crc kubenswrapper[4732]: I0131 09:02:26.851079 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:26Z","lastTransitionTime":"2026-01-31T09:02:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:26 crc kubenswrapper[4732]: I0131 09:02:26.953497 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:26 crc kubenswrapper[4732]: I0131 09:02:26.953527 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:26 crc kubenswrapper[4732]: I0131 09:02:26.953536 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:26 crc kubenswrapper[4732]: I0131 09:02:26.953549 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:26 crc kubenswrapper[4732]: I0131 09:02:26.953558 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:26Z","lastTransitionTime":"2026-01-31T09:02:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:27 crc kubenswrapper[4732]: I0131 09:02:27.024008 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8mtkt_82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8/ovnkube-controller/2.log" Jan 31 09:02:27 crc kubenswrapper[4732]: I0131 09:02:27.026653 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" event={"ID":"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8","Type":"ContainerStarted","Data":"b77508f48f518644f99bdda577737a5ed3b909f97a70c1c297c035609ba46fa1"} Jan 31 09:02:27 crc kubenswrapper[4732]: I0131 09:02:27.027173 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" Jan 31 09:02:27 crc kubenswrapper[4732]: I0131 09:02:27.040028 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nsgpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"533741c8-f72a-4834-ad02-d33fc939e529\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec4f6f1a4ba8fc3975dc3ef7e86d5b17b0488de7918fe672040f115f640b8569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjff8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nsgpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:27Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:27 crc kubenswrapper[4732]: I0131 09:02:27.053857 4732 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-bllbs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"80d87332-eaea-4007-a03e-a9a0f744563a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a371d8cbedbc18b4953520c3b135e102db92fc57883d31a8f78c510498ca41b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcdpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bllbs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:27Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:27 crc kubenswrapper[4732]: I0131 09:02:27.055582 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:27 crc kubenswrapper[4732]: I0131 09:02:27.055613 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:27 crc kubenswrapper[4732]: I0131 09:02:27.055623 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:27 crc kubenswrapper[4732]: I0131 09:02:27.055641 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:27 crc kubenswrapper[4732]: I0131 09:02:27.055652 4732 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:27Z","lastTransitionTime":"2026-01-31T09:02:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:27 crc kubenswrapper[4732]: I0131 09:02:27.081006 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8521ee1e84f540faa7b093ecd0bb7761ae38487a6c7abf52028fe820ea9a4c1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fde05e46badc11b2174b88e2590402436bb669bcbd017ceb4d11d25066b421b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\
":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b26417bd7817c984a59723e2231e3a43c554022aef67eacf300adee0166c3db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c79f404e5bc3d6dc83681d68605b1f26565813ccec34c6d36ee9f21686ca4f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://075418297d972e23003b3ce2016f7709d138316c9d08837dc484dadf4d7f4a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access
-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://792e702a9fa66ea68366e8f88d976f7e280223815c4b70258566c32faaed57a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b77508f48f518644f99bdda577737a5ed3b909f97a70c1c297c035609ba46fa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da2d4b479ec06e475664a1f5f1c4c052f87ad374f5616ea477c7a147ca896a16\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:01:56Z\\\",\\\"message\\\":\\\"31T09:01:56Z is after 2025-08-24T17:21:41Z]\\\\nI0131 09:01:56.451406 6413 services_controller.go:451] Built service openshift-service-ca-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-service-ca-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-service-ca-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.40\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0131 09:01:56.451418 6413 services_controller.go:452] Built service openshift-operator-lifecycle-manager/package-server-manager-metrics per-node LB for network=default: []services.LB{}\\\\nI0131 09:01:56.451426 6413 services_controller.go:452] Built service openshift-service-ca-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI0131 09:01:56.451433 6413 services_controller.go:453] Built 
service openshift-service-ca-operator/metrics template LB for network=default: []services.LB{}\\\\nI0131 09:01:56.451434 6413 ser\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cd519944f0ae0ee9920a6a54de3dd40c73a65d0c907f625c57d7648a6d0a715\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"
hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb08df78497d5ae0eb39af9a06378ea80914d9b4d7014dd85d930f54c733193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfb08df78497d5ae0eb39af9a06378ea80914d9b4d7014dd85d930f54c733193\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8mtkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:27Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:27 crc kubenswrapper[4732]: I0131 09:02:27.105314 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:27Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:27 crc kubenswrapper[4732]: I0131 09:02:27.118543 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:27Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:27 crc kubenswrapper[4732]: I0131 09:02:27.132709 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4mxsr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e23192f-14db-41ef-af89-4a76e325d9c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://456969ec8d447fd7f9acd1803b222ebfb16b02c8dee1959dd936eec29aa1d617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6af50b44afec0e5f0757efda1ff5567845c4482609a82bafcbf74947c657c56\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:02:15Z\\\",\\\"message\\\":\\\"2026-01-31T09:01:29+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_648046fe-98ce-4b13-a3af-42227ab719e4\\\\n2026-01-31T09:01:29+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_648046fe-98ce-4b13-a3af-42227ab719e4 to /host/opt/cni/bin/\\\\n2026-01-31T09:01:30Z [verbose] multus-daemon started\\\\n2026-01-31T09:01:30Z [verbose] Readiness Indicator file check\\\\n2026-01-31T09:02:15Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsnx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4mxsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:27Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:27 crc kubenswrapper[4732]: I0131 09:02:27.146406 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t9kqf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2e6e0f4-2302-447f-a5e0-7db3d7b73cb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://038ff30d8a53b94c204f5dc0c72824fcb7f08a423eec2ce05554f23607d4dc7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c71dc89758bff193a85ceeaa97b21ba6c93389ef9cf380bd0bac89bddddf014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c71dc89758bff193a85ceeaa97b21ba6c93389ef9cf380bd0bac89bddddf014\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4374e8014d4e67ab8df14785ed6d1bd78b33567550758297fcc04796ea5c688\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4374e8014d4e67ab8df14785ed6d1bd78b33567550758297fcc04796ea5c688\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85484d53f8180c3bc83391004e8a7c6b432af09f04f597567e8ffe0b9c2e3d58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85484d53f8180c3bc83391004e8a7c6b432af09f04f597567e8ffe0b9c2e3d58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://111b50c40cf72f105ae5c74d668673853a783050a62e89127fa666a6f8d431d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://111b50c40cf72f105ae5c74d668673853a783050a62e89127fa666a6f8d431d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2458247a5e615b929a15e280bb9d5a65c46a991ea2cd83502a8c42b1ca0e299b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2458247a5e615b929a15e280bb9d5a65c46a991ea2cd83502a8c42b1ca0e299b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a6ff3cd5564994b28608dad1cd0848c771db950208888ac8a2213fa5dc3a168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a6ff3cd5564994b28608dad1cd0848c771db950208888ac8a2213fa5dc3a168\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t9kqf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:27Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:27 crc kubenswrapper[4732]: I0131 09:02:27.155827 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7fgvm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bd29a31-1a47-40da-afc5-6c4423067083\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47jtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47jtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7fgvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:27Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:27 crc kubenswrapper[4732]: I0131 09:02:27.157486 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:27 crc kubenswrapper[4732]: I0131 09:02:27.157517 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:27 crc kubenswrapper[4732]: I0131 09:02:27.157528 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Jan 31 09:02:27 crc kubenswrapper[4732]: I0131 09:02:27.157546 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:27 crc kubenswrapper[4732]: I0131 09:02:27.157557 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:27Z","lastTransitionTime":"2026-01-31T09:02:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:27 crc kubenswrapper[4732]: I0131 09:02:27.167929 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1dc686f-fa2d-4cd4-95d4-874df2dc3a1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b5cbd45e357205f8bfc648396c0f684d6788343f687504feb45618d9082e5cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b8ba07937855cb1fd0e1db567f90e08848fca583b7174153918e2d6356d722\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ce42a35726dd1fc6aea0eb2e5baa490de04566cd6d6298185260e896a8def7\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c276b6fdf2c47bfa38c4cf312bf6b893388c4291d306738fd6c31738cbd7483b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c05af9bd69451273c25ddddf6b11cdd7b8f9b6e3e16d18f1c4a962321905696e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0131 09:01:18.226954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:01:18.227544 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1858903392/tls.crt::/tmp/serving-cert-1858903392/tls.key\\\\\\\"\\\\nI0131 09:01:23.404020 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:01:23.413560 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:01:23.413612 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:01:23.413655 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:01:23.413760 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:01:23.423398 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 09:01:23.423434 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:01:23.423441 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:01:23.423448 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:01:23.423452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0131 09:01:23.423448 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 09:01:23.423457 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:01:23.423476 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 09:01:23.426339 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83c2048db4772c1a5f60b21340f773ab5c651045d4dac9edc15e91a19d15dce9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:27Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:27 crc kubenswrapper[4732]: I0131 09:02:27.221860 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:27Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:27 crc kubenswrapper[4732]: I0131 09:02:27.236330 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c063f211d6d06c3f8b94d316d68da346d077c6e85197754ec7c77ad736fd1217\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:27Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:27 crc kubenswrapper[4732]: I0131 09:02:27.249279 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d790207-d357-4b47-87bf-5b505e061820\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eaa2b856e1dedd7398d0c513f742fa5925c4748bf44eb15022adbb4b62354c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2bw2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad1f906b34fddd34efd9d099479cd1fe7404da4ab11f3f571f9e3120167505a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2bw2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jnbt8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:27Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:27 crc kubenswrapper[4732]: I0131 09:02:27.259739 4732 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:27 crc kubenswrapper[4732]: I0131 09:02:27.259776 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:27 crc kubenswrapper[4732]: I0131 09:02:27.259788 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:27 crc kubenswrapper[4732]: I0131 09:02:27.259805 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:27 crc kubenswrapper[4732]: I0131 09:02:27.259816 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:27Z","lastTransitionTime":"2026-01-31T09:02:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:27 crc kubenswrapper[4732]: I0131 09:02:27.263263 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bb621da-40b8-4a07-a7bd-06800007bc59\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dadc6987ef6ef2593fa8aa0ec3903dc3cfea907f73fb68dca2c141362577bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e897d26ac7b103c21a9cb176f1a90a11098a9a2f6a4cd28f697a90ee7c9f6f9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a31769db7f40442ff410b053055e413b11d7dba7d48dfce853ca38e7b9f7595e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://407516efcd2436b964fddea3bdc778826cde289422139bc4577c9ba8c0c43675\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://407516efcd2436b964fddea3bdc778826cde289422139bc4577c9ba8c0c43675\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:27Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:27 crc kubenswrapper[4732]: I0131 09:02:27.281525 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58b600ed-9e96-4296-bc41-dda5d2205921\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f14a98662959fe8132486d2f648fb443a407e90160ee8c6eac7e2a3d6835b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://771af18934041722ccdbde47137c83e9d93c5e7a07b1e9b26e18354bcd2fda4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2bea5befd94ee4d3b6501273e8b8365102a35cc07b97740fc893f8d70a9d134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9fd5264b5a2dcec6f8b363208d5654f2d71996
60f04be408ee7ab3913d542e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://194a43d5e1409abb031f1b950dfc753454b45008067d784642945ea0dfb087e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://382f1fa043705a0098e27ea34beeee0b88276aa355a7de07bc29b070949b81e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382f1fa043705a0098e27ea34beeee0b88276aa355a7de07bc29b070949b81e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b16d2732d6d6381045b88446a322622a7b1271f08438cb60e57c05c52ef7faf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b16d2732d6d6381045b88446a322622a7b1271f08438cb60e57c05c52ef7faf8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://273f5652d37e2be568bc388a7f9e96c66d0ce9201678b46736986815bd25d6bf\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://273f5652d37e2be568bc388a7f9e96c66d0ce9201678b46736986815bd25d6bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:27Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:27 crc kubenswrapper[4732]: I0131 09:02:27.294128 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62650e77e5971d617a77ae7461d51276fc76e3699dc4f86ebbb9b63ab0d86b60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:27Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:27 crc kubenswrapper[4732]: I0131 09:02:27.309542 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7423f7d2a82ae276cff0f5a2730b0de7abbc5b28a8bd8983d23fe44c2e9ca3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b075971e7f28f40b90caae8b3681dd283860371a6b3f82c92c421a87a27ccb0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-01-31T09:02:27Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:27 crc kubenswrapper[4732]: I0131 09:02:27.322622 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gchqk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0313609d-3507-4db5-a190-9dbf59d73e6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b303be38f64266f4e019d30e8b988945133fc76a47e305429ed048d68cdeac76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-phqth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8a2d9e1c8db61a2418e981340a7cb999983d4a10e79977507abdf3bdb471939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-phqth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gchqk\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:27Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:27 crc kubenswrapper[4732]: I0131 09:02:27.337105 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1add5e75-7e0e-4248-9431-256dc477beeb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7bd726ccae341dfff351d4170276b74d4b6b4f34806b7e1fb23ab517201928d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd0e75b86e540472dde92c0b2c29d3af05e052a4e6a01fd03cd16cce663bd5da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6678cf179a4e13991aa6a082456b3497a4665e0999912b8002fb84a0b6ca2ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55d09d44253b8c77894abc55d80843b79509bf7213c0bf8b4d69059d53e76e5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:27Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:27 crc kubenswrapper[4732]: I0131 09:02:27.347471 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ec8fd54-25d6-41f1-9a8b-0e2823c951b7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87eb9563b2ca7420cbb70ccd32aa77f2a82c769d57eceacd99a4d94cb1c3a0d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://328c8ed55e178646a8bd3d914985f8171b6413b7b007f3bca609cfb432a227f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://328c8ed55e178646a8bd3d914985f8171b6413b7b007f3bca609cfb432a227f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:27Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:27 crc kubenswrapper[4732]: I0131 09:02:27.362173 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:27 crc kubenswrapper[4732]: I0131 09:02:27.362219 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 31 09:02:27 crc kubenswrapper[4732]: I0131 09:02:27.362228 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:27 crc kubenswrapper[4732]: I0131 09:02:27.362243 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:27 crc kubenswrapper[4732]: I0131 09:02:27.362253 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:27Z","lastTransitionTime":"2026-01-31T09:02:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:27 crc kubenswrapper[4732]: I0131 09:02:27.433321 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:02:27 crc kubenswrapper[4732]: I0131 09:02:27.433430 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:02:27 crc kubenswrapper[4732]: E0131 09:02:27.433470 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:03:31.433446721 +0000 UTC m=+149.739322925 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:02:27 crc kubenswrapper[4732]: I0131 09:02:27.433512 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:02:27 crc kubenswrapper[4732]: E0131 09:02:27.433584 4732 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 31 09:02:27 crc kubenswrapper[4732]: E0131 09:02:27.433679 4732 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 31 09:02:27 crc kubenswrapper[4732]: E0131 09:02:27.433708 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-31 09:03:31.433642967 +0000 UTC m=+149.739519181 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 31 09:02:27 crc kubenswrapper[4732]: E0131 09:02:27.434331 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-31 09:03:31.433786882 +0000 UTC m=+149.739663086 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 31 09:02:27 crc kubenswrapper[4732]: I0131 09:02:27.465117 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:27 crc kubenswrapper[4732]: I0131 09:02:27.465516 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:27 crc kubenswrapper[4732]: I0131 09:02:27.465582 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:27 crc kubenswrapper[4732]: I0131 09:02:27.465653 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:27 crc kubenswrapper[4732]: I0131 09:02:27.465772 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:27Z","lastTransitionTime":"2026-01-31T09:02:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:27 crc kubenswrapper[4732]: I0131 09:02:27.534411 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:02:27 crc kubenswrapper[4732]: I0131 09:02:27.534527 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:02:27 crc kubenswrapper[4732]: E0131 09:02:27.534764 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 31 09:02:27 crc kubenswrapper[4732]: E0131 09:02:27.534793 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 31 09:02:27 crc kubenswrapper[4732]: E0131 09:02:27.534817 4732 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 09:02:27 crc kubenswrapper[4732]: E0131 09:02:27.534896 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2026-01-31 09:03:31.534869559 +0000 UTC m=+149.840745813 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 09:02:27 crc kubenswrapper[4732]: E0131 09:02:27.535194 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 31 09:02:27 crc kubenswrapper[4732]: E0131 09:02:27.535294 4732 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 31 09:02:27 crc kubenswrapper[4732]: E0131 09:02:27.535361 4732 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 09:02:27 crc kubenswrapper[4732]: E0131 09:02:27.535518 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-31 09:03:31.53549355 +0000 UTC m=+149.841369754 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 09:02:27 crc kubenswrapper[4732]: I0131 09:02:27.537023 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 19:23:53.138489304 +0000 UTC Jan 31 09:02:27 crc kubenswrapper[4732]: I0131 09:02:27.542417 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:02:27 crc kubenswrapper[4732]: I0131 09:02:27.542437 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:02:27 crc kubenswrapper[4732]: I0131 09:02:27.542478 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:02:27 crc kubenswrapper[4732]: E0131 09:02:27.542592 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:02:27 crc kubenswrapper[4732]: I0131 09:02:27.542611 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7fgvm" Jan 31 09:02:27 crc kubenswrapper[4732]: E0131 09:02:27.542783 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:02:27 crc kubenswrapper[4732]: E0131 09:02:27.542716 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:02:27 crc kubenswrapper[4732]: E0131 09:02:27.543010 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7fgvm" podUID="3bd29a31-1a47-40da-afc5-6c4423067083" Jan 31 09:02:27 crc kubenswrapper[4732]: I0131 09:02:27.568459 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:27 crc kubenswrapper[4732]: I0131 09:02:27.568520 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:27 crc kubenswrapper[4732]: I0131 09:02:27.568534 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:27 crc kubenswrapper[4732]: I0131 09:02:27.568554 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:27 crc kubenswrapper[4732]: I0131 09:02:27.568565 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:27Z","lastTransitionTime":"2026-01-31T09:02:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:27 crc kubenswrapper[4732]: I0131 09:02:27.671014 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:27 crc kubenswrapper[4732]: I0131 09:02:27.671060 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:27 crc kubenswrapper[4732]: I0131 09:02:27.671071 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:27 crc kubenswrapper[4732]: I0131 09:02:27.671090 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:27 crc kubenswrapper[4732]: I0131 09:02:27.671102 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:27Z","lastTransitionTime":"2026-01-31T09:02:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:27 crc kubenswrapper[4732]: I0131 09:02:27.774124 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:27 crc kubenswrapper[4732]: I0131 09:02:27.774213 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:27 crc kubenswrapper[4732]: I0131 09:02:27.774251 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:27 crc kubenswrapper[4732]: I0131 09:02:27.774283 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:27 crc kubenswrapper[4732]: I0131 09:02:27.774305 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:27Z","lastTransitionTime":"2026-01-31T09:02:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:27 crc kubenswrapper[4732]: I0131 09:02:27.877912 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:27 crc kubenswrapper[4732]: I0131 09:02:27.877973 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:27 crc kubenswrapper[4732]: I0131 09:02:27.877984 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:27 crc kubenswrapper[4732]: I0131 09:02:27.878016 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:27 crc kubenswrapper[4732]: I0131 09:02:27.878028 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:27Z","lastTransitionTime":"2026-01-31T09:02:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:27 crc kubenswrapper[4732]: I0131 09:02:27.981331 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:27 crc kubenswrapper[4732]: I0131 09:02:27.981795 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:27 crc kubenswrapper[4732]: I0131 09:02:27.981924 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:27 crc kubenswrapper[4732]: I0131 09:02:27.982008 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:27 crc kubenswrapper[4732]: I0131 09:02:27.982075 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:27Z","lastTransitionTime":"2026-01-31T09:02:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:28 crc kubenswrapper[4732]: I0131 09:02:28.085793 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:28 crc kubenswrapper[4732]: I0131 09:02:28.085844 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:28 crc kubenswrapper[4732]: I0131 09:02:28.085859 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:28 crc kubenswrapper[4732]: I0131 09:02:28.085878 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:28 crc kubenswrapper[4732]: I0131 09:02:28.085894 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:28Z","lastTransitionTime":"2026-01-31T09:02:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:28 crc kubenswrapper[4732]: I0131 09:02:28.189312 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:28 crc kubenswrapper[4732]: I0131 09:02:28.189375 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:28 crc kubenswrapper[4732]: I0131 09:02:28.189391 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:28 crc kubenswrapper[4732]: I0131 09:02:28.189413 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:28 crc kubenswrapper[4732]: I0131 09:02:28.189428 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:28Z","lastTransitionTime":"2026-01-31T09:02:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:28 crc kubenswrapper[4732]: I0131 09:02:28.292753 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:28 crc kubenswrapper[4732]: I0131 09:02:28.292850 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:28 crc kubenswrapper[4732]: I0131 09:02:28.292899 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:28 crc kubenswrapper[4732]: I0131 09:02:28.292927 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:28 crc kubenswrapper[4732]: I0131 09:02:28.292945 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:28Z","lastTransitionTime":"2026-01-31T09:02:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:28 crc kubenswrapper[4732]: I0131 09:02:28.395887 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:28 crc kubenswrapper[4732]: I0131 09:02:28.396281 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:28 crc kubenswrapper[4732]: I0131 09:02:28.396444 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:28 crc kubenswrapper[4732]: I0131 09:02:28.396588 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:28 crc kubenswrapper[4732]: I0131 09:02:28.396755 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:28Z","lastTransitionTime":"2026-01-31T09:02:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:28 crc kubenswrapper[4732]: I0131 09:02:28.499262 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:28 crc kubenswrapper[4732]: I0131 09:02:28.499302 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:28 crc kubenswrapper[4732]: I0131 09:02:28.499312 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:28 crc kubenswrapper[4732]: I0131 09:02:28.499328 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:28 crc kubenswrapper[4732]: I0131 09:02:28.499338 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:28Z","lastTransitionTime":"2026-01-31T09:02:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:28 crc kubenswrapper[4732]: I0131 09:02:28.537378 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 01:52:42.318378592 +0000 UTC Jan 31 09:02:28 crc kubenswrapper[4732]: I0131 09:02:28.602033 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:28 crc kubenswrapper[4732]: I0131 09:02:28.602102 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:28 crc kubenswrapper[4732]: I0131 09:02:28.602119 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:28 crc kubenswrapper[4732]: I0131 09:02:28.602144 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:28 crc kubenswrapper[4732]: I0131 09:02:28.602162 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:28Z","lastTransitionTime":"2026-01-31T09:02:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:28 crc kubenswrapper[4732]: I0131 09:02:28.705061 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:28 crc kubenswrapper[4732]: I0131 09:02:28.705121 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:28 crc kubenswrapper[4732]: I0131 09:02:28.705136 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:28 crc kubenswrapper[4732]: I0131 09:02:28.705159 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:28 crc kubenswrapper[4732]: I0131 09:02:28.705178 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:28Z","lastTransitionTime":"2026-01-31T09:02:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:28 crc kubenswrapper[4732]: I0131 09:02:28.807902 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:28 crc kubenswrapper[4732]: I0131 09:02:28.807976 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:28 crc kubenswrapper[4732]: I0131 09:02:28.807989 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:28 crc kubenswrapper[4732]: I0131 09:02:28.808019 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:28 crc kubenswrapper[4732]: I0131 09:02:28.808032 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:28Z","lastTransitionTime":"2026-01-31T09:02:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:28 crc kubenswrapper[4732]: I0131 09:02:28.910728 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:28 crc kubenswrapper[4732]: I0131 09:02:28.910792 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:28 crc kubenswrapper[4732]: I0131 09:02:28.910813 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:28 crc kubenswrapper[4732]: I0131 09:02:28.910839 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:28 crc kubenswrapper[4732]: I0131 09:02:28.910860 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:28Z","lastTransitionTime":"2026-01-31T09:02:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:29 crc kubenswrapper[4732]: I0131 09:02:29.013848 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:29 crc kubenswrapper[4732]: I0131 09:02:29.013896 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:29 crc kubenswrapper[4732]: I0131 09:02:29.013905 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:29 crc kubenswrapper[4732]: I0131 09:02:29.013923 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:29 crc kubenswrapper[4732]: I0131 09:02:29.013933 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:29Z","lastTransitionTime":"2026-01-31T09:02:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:29 crc kubenswrapper[4732]: I0131 09:02:29.034532 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8mtkt_82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8/ovnkube-controller/3.log" Jan 31 09:02:29 crc kubenswrapper[4732]: I0131 09:02:29.035143 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8mtkt_82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8/ovnkube-controller/2.log" Jan 31 09:02:29 crc kubenswrapper[4732]: I0131 09:02:29.037553 4732 generic.go:334] "Generic (PLEG): container finished" podID="82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8" containerID="b77508f48f518644f99bdda577737a5ed3b909f97a70c1c297c035609ba46fa1" exitCode=1 Jan 31 09:02:29 crc kubenswrapper[4732]: I0131 09:02:29.037598 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" event={"ID":"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8","Type":"ContainerDied","Data":"b77508f48f518644f99bdda577737a5ed3b909f97a70c1c297c035609ba46fa1"} Jan 31 09:02:29 crc kubenswrapper[4732]: I0131 09:02:29.037644 4732 scope.go:117] "RemoveContainer" containerID="da2d4b479ec06e475664a1f5f1c4c052f87ad374f5616ea477c7a147ca896a16" Jan 31 09:02:29 crc kubenswrapper[4732]: I0131 09:02:29.038295 4732 scope.go:117] "RemoveContainer" containerID="b77508f48f518644f99bdda577737a5ed3b909f97a70c1c297c035609ba46fa1" Jan 31 09:02:29 crc kubenswrapper[4732]: E0131 09:02:29.038478 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-8mtkt_openshift-ovn-kubernetes(82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" podUID="82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8" Jan 31 09:02:29 crc kubenswrapper[4732]: I0131 09:02:29.055974 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c063f211d6d06c3f8b94d316d68da346d077c6e85197754ec7c77ad736fd1217\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:29 crc kubenswrapper[4732]: I0131 09:02:29.068228 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d790207-d357-4b47-87bf-5b505e061820\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5eaa2b856e1dedd7398d0c513f742fa5925c4748bf44eb15022adbb4b62354c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2bw2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad1f906b34fddd34efd9d099479cd1fe7404da4ab11f3f571f9e3120167505a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2bw2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-jnbt8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:29 crc kubenswrapper[4732]: I0131 09:02:29.080308 4732 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gchqk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0313609d-3507-4db5-a190-9dbf59d73e6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b303be38f64266f4e019d30e8b988945133fc76a47e305429ed048d68cdeac76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-phqth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8a2d9e1c8db61a2418e981340a7cb999983d4a10e79977507abdf3bdb471939\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-phqth\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gchqk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:29 crc kubenswrapper[4732]: I0131 09:02:29.090746 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1add5e75-7e0e-4248-9431-256dc477beeb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7bd726ccae341dfff351d4170276b74d4b6b4f34806b7e1fb23ab517201928d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd0e75b86e540472dde92c0b2c29d3af05e052a4e6a01fd03cd16cce663bd5da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6678cf179a4e13991aa6a082456b3497a4665e0999912b8002fb84a0b6ca2ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55d09d44253b8c77894abc55d80843b79509bf7213c0bf8b4d69059d53e76e5a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:29 crc kubenswrapper[4732]: I0131 09:02:29.100617 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ec8fd54-25d6-41f1-9a8b-0e2823c951b7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87eb9563b2ca7420cbb70ccd32aa77f2a82c769d57eceacd99a4d94cb1c3a0d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://328c8ed55e178646a8bd3d914985f8171b6413b7b007f3bca609cfb432a227f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://328c8ed55e178646a8bd3d914985f8171b6413b7b007f3bca609cfb432a227f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:29 crc kubenswrapper[4732]: I0131 09:02:29.113071 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bb621da-40b8-4a07-a7bd-06800007bc59\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dadc6987ef6ef2593fa8aa0ec3903dc3cfea907f73fb68dca2c141362577bb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e897d26ac7b103c21a9cb176f1a90a11098a9a2f6a4cd28f697a90ee7c9f6f9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a31769db7f40442ff410b053055e413b11d7dba7d48dfce853ca38e7b9f7595e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://407516efcd2436b964fddea3bdc778826cde289422139bc4577c9ba8c0c43675\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://407516efcd2436b964fddea3bdc778826cde289422139bc4577c9ba8c0c43675\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:29 crc kubenswrapper[4732]: I0131 09:02:29.116972 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:29 crc kubenswrapper[4732]: I0131 09:02:29.117020 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:29 crc kubenswrapper[4732]: I0131 09:02:29.117032 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:29 crc kubenswrapper[4732]: I0131 09:02:29.117053 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:29 crc kubenswrapper[4732]: I0131 09:02:29.117065 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:29Z","lastTransitionTime":"2026-01-31T09:02:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:29 crc kubenswrapper[4732]: I0131 09:02:29.137511 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58b600ed-9e96-4296-bc41-dda5d2205921\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f14a98662959fe8132486d2f648fb443a407e90160ee8c6eac7e2a3d6835b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://771af18934041722ccdbde47137c83e9d93c5e7a07b1e9b26e18354bcd2fda4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2bea5befd94ee4d3b6501273e8b8365102a35cc07b97740fc893f8d70a9d134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9fd5264b5a2dcec6f8b363208d5654f2d7199660f04be408ee7ab3913d542e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://194a43d5e1409abb031f1b950dfc753454b45008067d784642945ea0dfb087e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://382f1fa043705a0098e27ea34beeee0b88276aa355a7de07bc29b070949b81e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://382f1fa043705a0098e27ea34beeee0b88276aa355a7de07bc29b070949b81e2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b16d2732d6d6381045b88446a322622a7b1271f08438cb60e57c05c52ef7faf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b16d2732d6d6381045b88446a322622a7b1271f08438cb60e57c05c52ef7faf8\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://273f5652d37e2be568bc388a7f9e96c66d0ce9201678b46736986815bd25d6bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://273f5652d37e2be568bc388a7f9e96c66d0ce9201678b46736986815bd25d6bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:29 crc kubenswrapper[4732]: I0131 09:02:29.151606 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://62650e77e5971d617a77ae7461d51276fc76e3699dc4f86ebbb9b63ab0d86b60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:29 crc kubenswrapper[4732]: I0131 09:02:29.163055 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7423f7d2a82ae276cff0f5a2730b0de7abbc5b28a8bd8983d23fe44c2e9ca3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b075971e7f28f40b90caae8b3681dd283860371a6b3f82c92c421a87a27ccb0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:29 crc kubenswrapper[4732]: I0131 09:02:29.178524 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:29 crc kubenswrapper[4732]: I0131 09:02:29.189583 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:29 crc kubenswrapper[4732]: I0131 09:02:29.199033 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nsgpk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"533741c8-f72a-4834-ad02-d33fc939e529\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec4f6f1a4ba8fc3975dc3ef7e86d5b17b0488de7918fe672040f115f640b8569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjff8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.
11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nsgpk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:29 crc kubenswrapper[4732]: I0131 09:02:29.209801 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bllbs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"80d87332-eaea-4007-a03e-a9a0f744563a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a371d8cbedbc18b4953520c3b135e102db92fc57883d31a8f78c510498ca41b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcdpz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bllbs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:29 crc kubenswrapper[4732]: I0131 09:02:29.219968 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:29 crc kubenswrapper[4732]: I0131 09:02:29.220531 4732 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:29 crc kubenswrapper[4732]: I0131 09:02:29.220651 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:29 crc kubenswrapper[4732]: I0131 09:02:29.220806 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:29 crc kubenswrapper[4732]: I0131 09:02:29.220921 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:29Z","lastTransitionTime":"2026-01-31T09:02:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:29 crc kubenswrapper[4732]: I0131 09:02:29.229233 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8521ee1e84f540faa7b093ecd0bb7761ae38487a6c7abf52028fe820ea9a4c1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fde05e46badc11b2174b88e2590402436bb669bcbd017ceb4d11d25066b421b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b26417bd7817c984a59723e2231e3a43c554022aef67eacf300adee0166c3db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c79f404e5bc3d6dc83681d68605b1f26565813ccec34c6d36ee9f21686ca4f20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://075418297d972e23003b3ce2016f7709d138316c9d08837dc484dadf4d7f4a81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://792e702a9fa66ea68366e8f88d976f7e280223815c4b70258566c32faaed57a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b77508f48f518644f99bdda577737a5ed3b909f9
7a70c1c297c035609ba46fa1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da2d4b479ec06e475664a1f5f1c4c052f87ad374f5616ea477c7a147ca896a16\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:01:56Z\\\",\\\"message\\\":\\\"31T09:01:56Z is after 2025-08-24T17:21:41Z]\\\\nI0131 09:01:56.451406 6413 services_controller.go:451] Built service openshift-service-ca-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-service-ca-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-service-ca-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.40\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0131 09:01:56.451418 6413 services_controller.go:452] Built service openshift-operator-lifecycle-manager/package-server-manager-metrics per-node LB for network=default: []services.LB{}\\\\nI0131 09:01:56.451426 6413 services_controller.go:452] Built service openshift-service-ca-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI0131 09:01:56.451433 6413 services_controller.go:453] Built service openshift-service-ca-operator/metrics template LB for network=default: []services.LB{}\\\\nI0131 09:01:56.451434 6413 ser\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b77508f48f518644f99bdda577737a5ed3b909f97a70c1c297c035609ba46fa1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:02:28Z\\\",\\\"message\\\":\\\"k=default: []services.lbConfig(nil)\\\\nI0131 09:02:27.497102 6910 services_controller.go:445] Built service openshift-marketplace/certified-operators LB template configs for network=default: []services.lbConfig(nil)\\\\nI0131 09:02:27.496963 6910 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nF0131 09:02:27.497126 6910 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:27Z is after 2025-08-24T17:21:41Z]\\\\nI0131 09:02:27.497135 6910 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nI013\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:02:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cd519944f0ae0ee9920a6a54de3dd40c73a65d0c907f625c57d7648a6d0a715\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\
\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfb08df78497d5ae0eb39af9a06378ea80914d9b4d7014dd85d930f54c733193\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfb08df78497d5ae0eb39af9a06378ea80914d9b4d7014dd85d930f54c733193\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jktvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-8mtkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:29 crc kubenswrapper[4732]: I0131 09:02:29.243851 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1dc686f-fa2d-4cd4-95d4-874df2dc3a1c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b5cbd45e357205f8bfc648396c0f684d6788343f687504feb45618d9082e5cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b8ba07937855cb1fd0e1db567f90e08848fca583b7174153918e2d6356d722\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89ce42a35726dd1fc6aea0eb2e5baa490de04566cd6d6298185260e896a8def7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c276b6fdf2c47bfa38c4cf312bf6b893388c4291d306738fd6c31738cbd7483b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c05af9bd69451273c25ddddf6b11cdd7b8f9b6e3e16d18f1c4a962321905696e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0131 09:01:18.226954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0131 09:01:18.227544 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1858903392/tls.crt::/tmp/serving-cert-1858903392/tls.key\\\\\\\"\\\\nI0131 09:01:23.404020 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0131 09:01:23.413560 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0131 09:01:23.413612 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0131 09:01:23.413655 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0131 09:01:23.413760 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0131 09:01:23.423398 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0131 09:01:23.423434 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:01:23.423441 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0131 09:01:23.423448 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0131 09:01:23.423452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI0131 09:01:23.423448 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0131 09:01:23.423457 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0131 09:01:23.423476 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0131 09:01:23.426339 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://83c2048db4772c1a5f60b21340f773ab5c651045d4dac9edc15e91a19d15dce9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:05Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:02Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:29 crc kubenswrapper[4732]: I0131 09:02:29.255196 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:29 crc kubenswrapper[4732]: I0131 09:02:29.268212 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4mxsr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e23192f-14db-41ef-af89-4a76e325d9c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:02:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://456969ec8d447fd7f9acd1803b222ebfb16b02c8dee1959dd936eec29aa1d617\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6af50b44afec0e5f0757efda1ff5567845c4482609a82bafcbf74947c657c56\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T09:02:15Z\\\",\\\"message\\\":\\\"2026-01-31T09:01:29+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_648046fe-98ce-4b13-a3af-42227ab719e4\\\\n2026-01-31T09:01:29+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_648046fe-98ce-4b13-a3af-42227ab719e4 to /host/opt/cni/bin/\\\\n2026-01-31T09:01:30Z [verbose] multus-daemon started\\\\n2026-01-31T09:01:30Z [verbose] Readiness Indicator file check\\\\n2026-01-31T09:02:15Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:02:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsnx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4mxsr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:29 crc kubenswrapper[4732]: I0131 09:02:29.285435 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t9kqf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2e6e0f4-2302-447f-a5e0-7db3d7b73cb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://038ff30d8a53b94c204f5dc0c72824fcb7f08a423eec2ce05554f23607d4dc7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T09:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c71dc89758bff193a85ceeaa97b21ba6c93389ef9cf380bd0bac89bddddf014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0c71dc89758bff193a85ceeaa97b21ba6c93389ef9cf380bd0bac89bddddf014\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4374e8014d4e67ab8df14785ed6d1bd78b33567550758297fcc04796ea5c688\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4374e8014d4e67ab8df14785ed6d1bd78b33567550758297fcc04796ea5c688\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85484d53f8180c3bc83391004e8a7c6b432af09f04f597567e8ffe0b9c2e3d58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85484d53f8180c3bc83391004e8a7c6b432af09f04f597567e8ffe0b9c2e3d58\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://111b50c40cf72f105ae5c74d668673853a783050a62e89127fa666a6f8d431d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://111b50c40cf72f105ae5c74d668673853a783050a62e89127fa666a6f8d431d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2458247a5e615b929a15e280bb9d5a65c46a991ea2cd83502a8c42b1ca0e299b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2458247a5e615b929a15e280bb9d5a65c46a991ea2cd83502a8c42b1ca0e299b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a6ff3cd5564994b28608dad1cd0848c771db950208888ac8a2213fa5dc3a168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a6ff3cd5564994b28608dad1cd0848c771db950208888ac8a2213fa5dc3a168\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T09:01:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T09:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jhxt6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t9kqf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:29 crc kubenswrapper[4732]: I0131 09:02:29.301367 4732 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7fgvm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bd29a31-1a47-40da-afc5-6c4423067083\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T09:01:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47jtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-47jtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T09:01:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7fgvm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T09:02:29Z is after 2025-08-24T17:21:41Z" Jan 31 09:02:29 crc kubenswrapper[4732]: I0131 09:02:29.324416 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:29 crc kubenswrapper[4732]: I0131 09:02:29.324478 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:29 crc kubenswrapper[4732]: I0131 09:02:29.324490 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Jan 31 09:02:29 crc kubenswrapper[4732]: I0131 09:02:29.324511 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:29 crc kubenswrapper[4732]: I0131 09:02:29.324524 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:29Z","lastTransitionTime":"2026-01-31T09:02:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:29 crc kubenswrapper[4732]: I0131 09:02:29.426803 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:29 crc kubenswrapper[4732]: I0131 09:02:29.426847 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:29 crc kubenswrapper[4732]: I0131 09:02:29.426858 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:29 crc kubenswrapper[4732]: I0131 09:02:29.426874 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:29 crc kubenswrapper[4732]: I0131 09:02:29.426885 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:29Z","lastTransitionTime":"2026-01-31T09:02:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:29 crc kubenswrapper[4732]: I0131 09:02:29.529793 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:29 crc kubenswrapper[4732]: I0131 09:02:29.529876 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:29 crc kubenswrapper[4732]: I0131 09:02:29.529897 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:29 crc kubenswrapper[4732]: I0131 09:02:29.529928 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:29 crc kubenswrapper[4732]: I0131 09:02:29.529945 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:29Z","lastTransitionTime":"2026-01-31T09:02:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:29 crc kubenswrapper[4732]: I0131 09:02:29.538042 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 12:29:29.645079764 +0000 UTC Jan 31 09:02:29 crc kubenswrapper[4732]: I0131 09:02:29.542460 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:02:29 crc kubenswrapper[4732]: I0131 09:02:29.542522 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7fgvm" Jan 31 09:02:29 crc kubenswrapper[4732]: I0131 09:02:29.542614 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:02:29 crc kubenswrapper[4732]: I0131 09:02:29.542460 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:02:29 crc kubenswrapper[4732]: E0131 09:02:29.542716 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:02:29 crc kubenswrapper[4732]: E0131 09:02:29.542850 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7fgvm" podUID="3bd29a31-1a47-40da-afc5-6c4423067083" Jan 31 09:02:29 crc kubenswrapper[4732]: E0131 09:02:29.543006 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:02:29 crc kubenswrapper[4732]: E0131 09:02:29.543084 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:02:29 crc kubenswrapper[4732]: I0131 09:02:29.633388 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:29 crc kubenswrapper[4732]: I0131 09:02:29.633939 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:29 crc kubenswrapper[4732]: I0131 09:02:29.633950 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:29 crc kubenswrapper[4732]: I0131 09:02:29.633974 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:29 crc kubenswrapper[4732]: I0131 09:02:29.633992 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:29Z","lastTransitionTime":"2026-01-31T09:02:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:29 crc kubenswrapper[4732]: I0131 09:02:29.737146 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:29 crc kubenswrapper[4732]: I0131 09:02:29.737200 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:29 crc kubenswrapper[4732]: I0131 09:02:29.737216 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:29 crc kubenswrapper[4732]: I0131 09:02:29.737238 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:29 crc kubenswrapper[4732]: I0131 09:02:29.737253 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:29Z","lastTransitionTime":"2026-01-31T09:02:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:29 crc kubenswrapper[4732]: I0131 09:02:29.840364 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:29 crc kubenswrapper[4732]: I0131 09:02:29.840418 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:29 crc kubenswrapper[4732]: I0131 09:02:29.840430 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:29 crc kubenswrapper[4732]: I0131 09:02:29.840449 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:29 crc kubenswrapper[4732]: I0131 09:02:29.840461 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:29Z","lastTransitionTime":"2026-01-31T09:02:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:29 crc kubenswrapper[4732]: I0131 09:02:29.942741 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:29 crc kubenswrapper[4732]: I0131 09:02:29.942784 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:29 crc kubenswrapper[4732]: I0131 09:02:29.942793 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:29 crc kubenswrapper[4732]: I0131 09:02:29.942809 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:29 crc kubenswrapper[4732]: I0131 09:02:29.942820 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:29Z","lastTransitionTime":"2026-01-31T09:02:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:30 crc kubenswrapper[4732]: I0131 09:02:30.044036 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8mtkt_82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8/ovnkube-controller/3.log" Jan 31 09:02:30 crc kubenswrapper[4732]: I0131 09:02:30.044520 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:30 crc kubenswrapper[4732]: I0131 09:02:30.044577 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:30 crc kubenswrapper[4732]: I0131 09:02:30.044824 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:30 crc kubenswrapper[4732]: I0131 09:02:30.044854 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:30 crc kubenswrapper[4732]: I0131 09:02:30.044872 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:30Z","lastTransitionTime":"2026-01-31T09:02:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:30 crc kubenswrapper[4732]: I0131 09:02:30.149175 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:30 crc kubenswrapper[4732]: I0131 09:02:30.149239 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:30 crc kubenswrapper[4732]: I0131 09:02:30.149262 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:30 crc kubenswrapper[4732]: I0131 09:02:30.149291 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:30 crc kubenswrapper[4732]: I0131 09:02:30.149312 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:30Z","lastTransitionTime":"2026-01-31T09:02:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:30 crc kubenswrapper[4732]: I0131 09:02:30.252461 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:30 crc kubenswrapper[4732]: I0131 09:02:30.252505 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:30 crc kubenswrapper[4732]: I0131 09:02:30.252516 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:30 crc kubenswrapper[4732]: I0131 09:02:30.252536 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:30 crc kubenswrapper[4732]: I0131 09:02:30.252550 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:30Z","lastTransitionTime":"2026-01-31T09:02:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:30 crc kubenswrapper[4732]: I0131 09:02:30.355697 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:30 crc kubenswrapper[4732]: I0131 09:02:30.355760 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:30 crc kubenswrapper[4732]: I0131 09:02:30.355777 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:30 crc kubenswrapper[4732]: I0131 09:02:30.355802 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:30 crc kubenswrapper[4732]: I0131 09:02:30.355820 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:30Z","lastTransitionTime":"2026-01-31T09:02:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:30 crc kubenswrapper[4732]: I0131 09:02:30.459984 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:30 crc kubenswrapper[4732]: I0131 09:02:30.460057 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:30 crc kubenswrapper[4732]: I0131 09:02:30.460081 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:30 crc kubenswrapper[4732]: I0131 09:02:30.460114 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:30 crc kubenswrapper[4732]: I0131 09:02:30.460139 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:30Z","lastTransitionTime":"2026-01-31T09:02:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:30 crc kubenswrapper[4732]: I0131 09:02:30.538462 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-08 10:22:17.819718267 +0000 UTC Jan 31 09:02:30 crc kubenswrapper[4732]: I0131 09:02:30.563351 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:30 crc kubenswrapper[4732]: I0131 09:02:30.563396 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:30 crc kubenswrapper[4732]: I0131 09:02:30.563404 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:30 crc kubenswrapper[4732]: I0131 09:02:30.563420 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:30 crc kubenswrapper[4732]: I0131 09:02:30.563432 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:30Z","lastTransitionTime":"2026-01-31T09:02:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:30 crc kubenswrapper[4732]: I0131 09:02:30.670287 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:30 crc kubenswrapper[4732]: I0131 09:02:30.670347 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:30 crc kubenswrapper[4732]: I0131 09:02:30.670363 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:30 crc kubenswrapper[4732]: I0131 09:02:30.670385 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:30 crc kubenswrapper[4732]: I0131 09:02:30.670402 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:30Z","lastTransitionTime":"2026-01-31T09:02:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:30 crc kubenswrapper[4732]: I0131 09:02:30.773596 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:30 crc kubenswrapper[4732]: I0131 09:02:30.773636 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:30 crc kubenswrapper[4732]: I0131 09:02:30.773646 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:30 crc kubenswrapper[4732]: I0131 09:02:30.773679 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:30 crc kubenswrapper[4732]: I0131 09:02:30.773692 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:30Z","lastTransitionTime":"2026-01-31T09:02:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:30 crc kubenswrapper[4732]: I0131 09:02:30.876946 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:30 crc kubenswrapper[4732]: I0131 09:02:30.877019 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:30 crc kubenswrapper[4732]: I0131 09:02:30.877038 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:30 crc kubenswrapper[4732]: I0131 09:02:30.877069 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:30 crc kubenswrapper[4732]: I0131 09:02:30.877136 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:30Z","lastTransitionTime":"2026-01-31T09:02:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:30 crc kubenswrapper[4732]: I0131 09:02:30.980825 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:30 crc kubenswrapper[4732]: I0131 09:02:30.980899 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:30 crc kubenswrapper[4732]: I0131 09:02:30.980917 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:30 crc kubenswrapper[4732]: I0131 09:02:30.980954 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:30 crc kubenswrapper[4732]: I0131 09:02:30.980990 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:30Z","lastTransitionTime":"2026-01-31T09:02:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:31 crc kubenswrapper[4732]: I0131 09:02:31.083972 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:31 crc kubenswrapper[4732]: I0131 09:02:31.084073 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:31 crc kubenswrapper[4732]: I0131 09:02:31.084094 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:31 crc kubenswrapper[4732]: I0131 09:02:31.084128 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:31 crc kubenswrapper[4732]: I0131 09:02:31.084150 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:31Z","lastTransitionTime":"2026-01-31T09:02:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:31 crc kubenswrapper[4732]: I0131 09:02:31.187171 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:31 crc kubenswrapper[4732]: I0131 09:02:31.187219 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:31 crc kubenswrapper[4732]: I0131 09:02:31.187229 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:31 crc kubenswrapper[4732]: I0131 09:02:31.187246 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:31 crc kubenswrapper[4732]: I0131 09:02:31.187256 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:31Z","lastTransitionTime":"2026-01-31T09:02:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:31 crc kubenswrapper[4732]: I0131 09:02:31.290288 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:31 crc kubenswrapper[4732]: I0131 09:02:31.290335 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:31 crc kubenswrapper[4732]: I0131 09:02:31.290346 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:31 crc kubenswrapper[4732]: I0131 09:02:31.290361 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:31 crc kubenswrapper[4732]: I0131 09:02:31.290369 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:31Z","lastTransitionTime":"2026-01-31T09:02:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:31 crc kubenswrapper[4732]: I0131 09:02:31.392844 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:31 crc kubenswrapper[4732]: I0131 09:02:31.392918 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:31 crc kubenswrapper[4732]: I0131 09:02:31.392931 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:31 crc kubenswrapper[4732]: I0131 09:02:31.392950 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:31 crc kubenswrapper[4732]: I0131 09:02:31.392961 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:31Z","lastTransitionTime":"2026-01-31T09:02:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:31 crc kubenswrapper[4732]: I0131 09:02:31.496196 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:31 crc kubenswrapper[4732]: I0131 09:02:31.496263 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:31 crc kubenswrapper[4732]: I0131 09:02:31.496277 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:31 crc kubenswrapper[4732]: I0131 09:02:31.496812 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:31 crc kubenswrapper[4732]: I0131 09:02:31.496867 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:31Z","lastTransitionTime":"2026-01-31T09:02:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:31 crc kubenswrapper[4732]: I0131 09:02:31.539553 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 22:45:44.036308111 +0000 UTC Jan 31 09:02:31 crc kubenswrapper[4732]: I0131 09:02:31.542118 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:02:31 crc kubenswrapper[4732]: I0131 09:02:31.542116 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7fgvm" Jan 31 09:02:31 crc kubenswrapper[4732]: I0131 09:02:31.542278 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:02:31 crc kubenswrapper[4732]: I0131 09:02:31.542334 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:02:31 crc kubenswrapper[4732]: E0131 09:02:31.542358 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:02:31 crc kubenswrapper[4732]: E0131 09:02:31.542483 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:02:31 crc kubenswrapper[4732]: E0131 09:02:31.542542 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:02:31 crc kubenswrapper[4732]: E0131 09:02:31.542592 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7fgvm" podUID="3bd29a31-1a47-40da-afc5-6c4423067083" Jan 31 09:02:31 crc kubenswrapper[4732]: I0131 09:02:31.603610 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:31 crc kubenswrapper[4732]: I0131 09:02:31.603642 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:31 crc kubenswrapper[4732]: I0131 09:02:31.603650 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:31 crc kubenswrapper[4732]: I0131 09:02:31.603682 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:31 crc kubenswrapper[4732]: I0131 09:02:31.603695 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:31Z","lastTransitionTime":"2026-01-31T09:02:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:31 crc kubenswrapper[4732]: I0131 09:02:31.706894 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:31 crc kubenswrapper[4732]: I0131 09:02:31.706965 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:31 crc kubenswrapper[4732]: I0131 09:02:31.706990 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:31 crc kubenswrapper[4732]: I0131 09:02:31.707022 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:31 crc kubenswrapper[4732]: I0131 09:02:31.707044 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:31Z","lastTransitionTime":"2026-01-31T09:02:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:31 crc kubenswrapper[4732]: I0131 09:02:31.809008 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:31 crc kubenswrapper[4732]: I0131 09:02:31.809050 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:31 crc kubenswrapper[4732]: I0131 09:02:31.809058 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:31 crc kubenswrapper[4732]: I0131 09:02:31.809073 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:31 crc kubenswrapper[4732]: I0131 09:02:31.809081 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:31Z","lastTransitionTime":"2026-01-31T09:02:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 09:02:31 crc kubenswrapper[4732]: I0131 09:02:31.840905 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 09:02:31 crc kubenswrapper[4732]: I0131 09:02:31.840968 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 09:02:31 crc kubenswrapper[4732]: I0131 09:02:31.840991 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 09:02:31 crc kubenswrapper[4732]: I0131 09:02:31.841020 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 09:02:31 crc kubenswrapper[4732]: I0131 09:02:31.841042 4732 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T09:02:31Z","lastTransitionTime":"2026-01-31T09:02:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 09:02:31 crc kubenswrapper[4732]: I0131 09:02:31.881484 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-xjsnq"] Jan 31 09:02:31 crc kubenswrapper[4732]: I0131 09:02:31.882100 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xjsnq" Jan 31 09:02:31 crc kubenswrapper[4732]: I0131 09:02:31.885917 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Jan 31 09:02:31 crc kubenswrapper[4732]: I0131 09:02:31.887802 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Jan 31 09:02:31 crc kubenswrapper[4732]: I0131 09:02:31.889543 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Jan 31 09:02:31 crc kubenswrapper[4732]: I0131 09:02:31.889995 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Jan 31 09:02:31 crc kubenswrapper[4732]: I0131 09:02:31.914616 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-4mxsr" podStartSLOduration=64.914582304 podStartE2EDuration="1m4.914582304s" podCreationTimestamp="2026-01-31 09:01:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:02:31.912377571 +0000 UTC m=+90.218253805" watchObservedRunningTime="2026-01-31 09:02:31.914582304 +0000 UTC m=+90.220458548" Jan 31 09:02:31 crc kubenswrapper[4732]: I0131 09:02:31.940436 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-t9kqf" podStartSLOduration=64.940404321 podStartE2EDuration="1m4.940404321s" podCreationTimestamp="2026-01-31 09:01:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:02:31.939565044 +0000 UTC m=+90.245441318" watchObservedRunningTime="2026-01-31 09:02:31.940404321 +0000 UTC m=+90.246280575" Jan 31 09:02:31 crc kubenswrapper[4732]: I0131 09:02:31.983273 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/d4f59b7e-7610-44a3-ae37-6c095081e3e5-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-xjsnq\" (UID: \"d4f59b7e-7610-44a3-ae37-6c095081e3e5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xjsnq" Jan 31 09:02:31 crc kubenswrapper[4732]: I0131 09:02:31.983361 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/d4f59b7e-7610-44a3-ae37-6c095081e3e5-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-xjsnq\" (UID: \"d4f59b7e-7610-44a3-ae37-6c095081e3e5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xjsnq" Jan 31 09:02:31 crc kubenswrapper[4732]: I0131 09:02:31.983447 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4f59b7e-7610-44a3-ae37-6c095081e3e5-serving-cert\") pod 
\"cluster-version-operator-5c965bbfc6-xjsnq\" (UID: \"d4f59b7e-7610-44a3-ae37-6c095081e3e5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xjsnq" Jan 31 09:02:31 crc kubenswrapper[4732]: I0131 09:02:31.983485 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d4f59b7e-7610-44a3-ae37-6c095081e3e5-service-ca\") pod \"cluster-version-operator-5c965bbfc6-xjsnq\" (UID: \"d4f59b7e-7610-44a3-ae37-6c095081e3e5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xjsnq" Jan 31 09:02:31 crc kubenswrapper[4732]: I0131 09:02:31.983519 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d4f59b7e-7610-44a3-ae37-6c095081e3e5-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-xjsnq\" (UID: \"d4f59b7e-7610-44a3-ae37-6c095081e3e5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xjsnq" Jan 31 09:02:32 crc kubenswrapper[4732]: I0131 09:02:32.009988 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=69.009963794 podStartE2EDuration="1m9.009963794s" podCreationTimestamp="2026-01-31 09:01:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:02:31.992645146 +0000 UTC m=+90.298521370" watchObservedRunningTime="2026-01-31 09:02:32.009963794 +0000 UTC m=+90.315839998" Jan 31 09:02:32 crc kubenswrapper[4732]: I0131 09:02:32.081105 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" podStartSLOduration=65.081083709 podStartE2EDuration="1m5.081083709s" podCreationTimestamp="2026-01-31 09:01:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:02:32.049790441 +0000 UTC m=+90.355666645" watchObservedRunningTime="2026-01-31 09:02:32.081083709 +0000 UTC m=+90.386959913" Jan 31 09:02:32 crc kubenswrapper[4732]: I0131 09:02:32.084546 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4f59b7e-7610-44a3-ae37-6c095081e3e5-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-xjsnq\" (UID: \"d4f59b7e-7610-44a3-ae37-6c095081e3e5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xjsnq" Jan 31 09:02:32 crc kubenswrapper[4732]: I0131 09:02:32.084598 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d4f59b7e-7610-44a3-ae37-6c095081e3e5-service-ca\") pod \"cluster-version-operator-5c965bbfc6-xjsnq\" (UID: \"d4f59b7e-7610-44a3-ae37-6c095081e3e5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xjsnq" Jan 31 09:02:32 crc kubenswrapper[4732]: I0131 09:02:32.084638 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d4f59b7e-7610-44a3-ae37-6c095081e3e5-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-xjsnq\" (UID: \"d4f59b7e-7610-44a3-ae37-6c095081e3e5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xjsnq" Jan 31 09:02:32 crc kubenswrapper[4732]: 
I0131 09:02:32.084715 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/d4f59b7e-7610-44a3-ae37-6c095081e3e5-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-xjsnq\" (UID: \"d4f59b7e-7610-44a3-ae37-6c095081e3e5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xjsnq" Jan 31 09:02:32 crc kubenswrapper[4732]: I0131 09:02:32.084754 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/d4f59b7e-7610-44a3-ae37-6c095081e3e5-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-xjsnq\" (UID: \"d4f59b7e-7610-44a3-ae37-6c095081e3e5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xjsnq" Jan 31 09:02:32 crc kubenswrapper[4732]: I0131 09:02:32.084870 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/d4f59b7e-7610-44a3-ae37-6c095081e3e5-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-xjsnq\" (UID: \"d4f59b7e-7610-44a3-ae37-6c095081e3e5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xjsnq" Jan 31 09:02:32 crc kubenswrapper[4732]: I0131 09:02:32.086036 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/d4f59b7e-7610-44a3-ae37-6c095081e3e5-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-xjsnq\" (UID: \"d4f59b7e-7610-44a3-ae37-6c095081e3e5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xjsnq" Jan 31 09:02:32 crc kubenswrapper[4732]: I0131 09:02:32.086295 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d4f59b7e-7610-44a3-ae37-6c095081e3e5-service-ca\") pod \"cluster-version-operator-5c965bbfc6-xjsnq\" (UID: \"d4f59b7e-7610-44a3-ae37-6c095081e3e5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xjsnq" Jan 31 09:02:32 crc kubenswrapper[4732]: I0131 09:02:32.101961 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4f59b7e-7610-44a3-ae37-6c095081e3e5-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-xjsnq\" (UID: \"d4f59b7e-7610-44a3-ae37-6c095081e3e5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xjsnq" Jan 31 09:02:32 crc kubenswrapper[4732]: I0131 09:02:32.109382 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d4f59b7e-7610-44a3-ae37-6c095081e3e5-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-xjsnq\" (UID: \"d4f59b7e-7610-44a3-ae37-6c095081e3e5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xjsnq" Jan 31 09:02:32 crc kubenswrapper[4732]: I0131 09:02:32.119544 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=40.11952233 podStartE2EDuration="40.11952233s" podCreationTimestamp="2026-01-31 09:01:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:02:32.081858973 +0000 UTC m=+90.387735177" watchObservedRunningTime="2026-01-31 09:02:32.11952233 +0000 UTC m=+90.425398534" Jan 31 09:02:32 crc 
kubenswrapper[4732]: I0131 09:02:32.120785 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=69.12077372 podStartE2EDuration="1m9.12077372s" podCreationTimestamp="2026-01-31 09:01:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:02:32.119256521 +0000 UTC m=+90.425132755" watchObservedRunningTime="2026-01-31 09:02:32.12077372 +0000 UTC m=+90.426649924" Jan 31 09:02:32 crc kubenswrapper[4732]: I0131 09:02:32.177538 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=65.177510103 podStartE2EDuration="1m5.177510103s" podCreationTimestamp="2026-01-31 09:01:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:02:32.177299206 +0000 UTC m=+90.483175420" watchObservedRunningTime="2026-01-31 09:02:32.177510103 +0000 UTC m=+90.483386307" Jan 31 09:02:32 crc kubenswrapper[4732]: I0131 09:02:32.177709 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gchqk" podStartSLOduration=65.177703249 podStartE2EDuration="1m5.177703249s" podCreationTimestamp="2026-01-31 09:01:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:02:32.159656337 +0000 UTC m=+90.465532561" watchObservedRunningTime="2026-01-31 09:02:32.177703249 +0000 UTC m=+90.483579443" Jan 31 09:02:32 crc kubenswrapper[4732]: I0131 09:02:32.189633 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=22.18961286 podStartE2EDuration="22.18961286s" podCreationTimestamp="2026-01-31 09:02:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:02:32.189187176 +0000 UTC m=+90.495063390" watchObservedRunningTime="2026-01-31 09:02:32.18961286 +0000 UTC m=+90.495489064" Jan 31 09:02:32 crc kubenswrapper[4732]: I0131 09:02:32.200148 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-nsgpk" podStartSLOduration=65.200119735 podStartE2EDuration="1m5.200119735s" podCreationTimestamp="2026-01-31 09:01:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:02:32.199682831 +0000 UTC m=+90.505559045" watchObservedRunningTime="2026-01-31 09:02:32.200119735 +0000 UTC m=+90.505995949" Jan 31 09:02:32 crc kubenswrapper[4732]: I0131 09:02:32.205203 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xjsnq" Jan 31 09:02:32 crc kubenswrapper[4732]: I0131 09:02:32.235758 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-bllbs" podStartSLOduration=65.235738494 podStartE2EDuration="1m5.235738494s" podCreationTimestamp="2026-01-31 09:01:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:02:32.214535508 +0000 UTC m=+90.520411712" watchObservedRunningTime="2026-01-31 09:02:32.235738494 +0000 UTC m=+90.541614698" Jan 31 09:02:32 crc kubenswrapper[4732]: I0131 09:02:32.540062 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 15:28:32.573923905 +0000 UTC Jan 31 09:02:32 crc kubenswrapper[4732]: I0131 09:02:32.540405 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Jan 31 09:02:32 crc kubenswrapper[4732]: I0131 09:02:32.547092 4732 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Jan 31 09:02:33 crc kubenswrapper[4732]: I0131 09:02:33.058855 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xjsnq" event={"ID":"d4f59b7e-7610-44a3-ae37-6c095081e3e5","Type":"ContainerStarted","Data":"6e2403264654c050102a6e3e43b9cbc23944d62891391025f3d1bf30dbbc5f7f"} Jan 31 09:02:33 crc kubenswrapper[4732]: I0131 09:02:33.058965 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xjsnq" event={"ID":"d4f59b7e-7610-44a3-ae37-6c095081e3e5","Type":"ContainerStarted","Data":"9de86783807d0f864b334778fdf819f6f83e3fbfcf1762d7db3114c467445704"} Jan 31 09:02:33 crc kubenswrapper[4732]: I0131 09:02:33.085129 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xjsnq" podStartSLOduration=66.085102261 podStartE2EDuration="1m6.085102261s" podCreationTimestamp="2026-01-31 09:01:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:02:33.080649385 +0000 UTC m=+91.386525589" watchObservedRunningTime="2026-01-31 09:02:33.085102261 +0000 UTC m=+91.390978475" Jan 31 09:02:33 crc kubenswrapper[4732]: I0131 09:02:33.542289 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:02:33 crc kubenswrapper[4732]: E0131 09:02:33.542444 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:02:33 crc kubenswrapper[4732]: I0131 09:02:33.542473 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:02:33 crc kubenswrapper[4732]: I0131 09:02:33.542513 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:02:33 crc kubenswrapper[4732]: E0131 09:02:33.542606 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:02:33 crc kubenswrapper[4732]: E0131 09:02:33.542760 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:02:33 crc kubenswrapper[4732]: I0131 09:02:33.542812 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7fgvm" Jan 31 09:02:33 crc kubenswrapper[4732]: E0131 09:02:33.542936 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7fgvm" podUID="3bd29a31-1a47-40da-afc5-6c4423067083" Jan 31 09:02:35 crc kubenswrapper[4732]: I0131 09:02:35.542173 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:02:35 crc kubenswrapper[4732]: I0131 09:02:35.542269 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:02:35 crc kubenswrapper[4732]: E0131 09:02:35.542322 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:02:35 crc kubenswrapper[4732]: I0131 09:02:35.542381 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:02:35 crc kubenswrapper[4732]: E0131 09:02:35.542612 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:02:35 crc kubenswrapper[4732]: I0131 09:02:35.543035 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7fgvm" Jan 31 09:02:35 crc kubenswrapper[4732]: E0131 09:02:35.543184 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7fgvm" podUID="3bd29a31-1a47-40da-afc5-6c4423067083" Jan 31 09:02:35 crc kubenswrapper[4732]: E0131 09:02:35.543496 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:02:37 crc kubenswrapper[4732]: I0131 09:02:37.542139 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7fgvm" Jan 31 09:02:37 crc kubenswrapper[4732]: I0131 09:02:37.542139 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:02:37 crc kubenswrapper[4732]: E0131 09:02:37.542334 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7fgvm" podUID="3bd29a31-1a47-40da-afc5-6c4423067083" Jan 31 09:02:37 crc kubenswrapper[4732]: I0131 09:02:37.542269 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:02:37 crc kubenswrapper[4732]: E0131 09:02:37.542506 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:02:37 crc kubenswrapper[4732]: E0131 09:02:37.542572 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:02:37 crc kubenswrapper[4732]: I0131 09:02:37.542824 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:02:37 crc kubenswrapper[4732]: E0131 09:02:37.542940 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:02:39 crc kubenswrapper[4732]: I0131 09:02:39.542461 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:02:39 crc kubenswrapper[4732]: I0131 09:02:39.542536 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7fgvm" Jan 31 09:02:39 crc kubenswrapper[4732]: I0131 09:02:39.542565 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:02:39 crc kubenswrapper[4732]: I0131 09:02:39.542628 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:02:39 crc kubenswrapper[4732]: E0131 09:02:39.542892 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7fgvm" podUID="3bd29a31-1a47-40da-afc5-6c4423067083" Jan 31 09:02:39 crc kubenswrapper[4732]: E0131 09:02:39.542942 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:02:39 crc kubenswrapper[4732]: E0131 09:02:39.543087 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:02:39 crc kubenswrapper[4732]: E0131 09:02:39.543237 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:02:41 crc kubenswrapper[4732]: I0131 09:02:41.542388 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7fgvm" Jan 31 09:02:41 crc kubenswrapper[4732]: I0131 09:02:41.542490 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:02:41 crc kubenswrapper[4732]: E0131 09:02:41.542540 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7fgvm" podUID="3bd29a31-1a47-40da-afc5-6c4423067083" Jan 31 09:02:41 crc kubenswrapper[4732]: I0131 09:02:41.542747 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:02:41 crc kubenswrapper[4732]: E0131 09:02:41.542742 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:02:41 crc kubenswrapper[4732]: I0131 09:02:41.542784 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:02:41 crc kubenswrapper[4732]: E0131 09:02:41.542825 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:02:41 crc kubenswrapper[4732]: E0131 09:02:41.542863 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:02:43 crc kubenswrapper[4732]: I0131 09:02:43.541885 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:02:43 crc kubenswrapper[4732]: I0131 09:02:43.541937 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7fgvm" Jan 31 09:02:43 crc kubenswrapper[4732]: I0131 09:02:43.541898 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:02:43 crc kubenswrapper[4732]: I0131 09:02:43.542015 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:02:43 crc kubenswrapper[4732]: E0131 09:02:43.542362 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7fgvm" podUID="3bd29a31-1a47-40da-afc5-6c4423067083" Jan 31 09:02:43 crc kubenswrapper[4732]: E0131 09:02:43.542532 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:02:43 crc kubenswrapper[4732]: E0131 09:02:43.542599 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:02:43 crc kubenswrapper[4732]: E0131 09:02:43.542708 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:02:44 crc kubenswrapper[4732]: I0131 09:02:44.543598 4732 scope.go:117] "RemoveContainer" containerID="b77508f48f518644f99bdda577737a5ed3b909f97a70c1c297c035609ba46fa1" Jan 31 09:02:44 crc kubenswrapper[4732]: E0131 09:02:44.543894 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-8mtkt_openshift-ovn-kubernetes(82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" podUID="82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8" Jan 31 09:02:45 crc kubenswrapper[4732]: I0131 09:02:45.541698 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:02:45 crc kubenswrapper[4732]: I0131 09:02:45.541831 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:02:45 crc kubenswrapper[4732]: E0131 09:02:45.541928 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:02:45 crc kubenswrapper[4732]: I0131 09:02:45.541739 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7fgvm" Jan 31 09:02:45 crc kubenswrapper[4732]: E0131 09:02:45.541991 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:02:45 crc kubenswrapper[4732]: I0131 09:02:45.541739 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:02:45 crc kubenswrapper[4732]: E0131 09:02:45.542046 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7fgvm" podUID="3bd29a31-1a47-40da-afc5-6c4423067083" Jan 31 09:02:45 crc kubenswrapper[4732]: E0131 09:02:45.542075 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:02:45 crc kubenswrapper[4732]: I0131 09:02:45.738406 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3bd29a31-1a47-40da-afc5-6c4423067083-metrics-certs\") pod \"network-metrics-daemon-7fgvm\" (UID: \"3bd29a31-1a47-40da-afc5-6c4423067083\") " pod="openshift-multus/network-metrics-daemon-7fgvm" Jan 31 09:02:45 crc kubenswrapper[4732]: E0131 09:02:45.738622 4732 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 31 09:02:45 crc kubenswrapper[4732]: E0131 09:02:45.738751 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3bd29a31-1a47-40da-afc5-6c4423067083-metrics-certs podName:3bd29a31-1a47-40da-afc5-6c4423067083 nodeName:}" failed. No retries permitted until 2026-01-31 09:03:49.738724886 +0000 UTC m=+168.044601090 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3bd29a31-1a47-40da-afc5-6c4423067083-metrics-certs") pod "network-metrics-daemon-7fgvm" (UID: "3bd29a31-1a47-40da-afc5-6c4423067083") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 31 09:02:47 crc kubenswrapper[4732]: I0131 09:02:47.541874 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:02:47 crc kubenswrapper[4732]: I0131 09:02:47.541958 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:02:47 crc kubenswrapper[4732]: I0131 09:02:47.541916 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7fgvm" Jan 31 09:02:47 crc kubenswrapper[4732]: I0131 09:02:47.541892 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:02:47 crc kubenswrapper[4732]: E0131 09:02:47.542087 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:02:47 crc kubenswrapper[4732]: E0131 09:02:47.542312 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:02:47 crc kubenswrapper[4732]: E0131 09:02:47.542506 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:02:47 crc kubenswrapper[4732]: E0131 09:02:47.542637 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7fgvm" podUID="3bd29a31-1a47-40da-afc5-6c4423067083" Jan 31 09:02:49 crc kubenswrapper[4732]: I0131 09:02:49.542816 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:02:49 crc kubenswrapper[4732]: I0131 09:02:49.542869 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7fgvm" Jan 31 09:02:49 crc kubenswrapper[4732]: I0131 09:02:49.542912 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:02:49 crc kubenswrapper[4732]: E0131 09:02:49.542971 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:02:49 crc kubenswrapper[4732]: I0131 09:02:49.542835 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:02:49 crc kubenswrapper[4732]: E0131 09:02:49.543138 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7fgvm" podUID="3bd29a31-1a47-40da-afc5-6c4423067083" Jan 31 09:02:49 crc kubenswrapper[4732]: E0131 09:02:49.543291 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:02:49 crc kubenswrapper[4732]: E0131 09:02:49.543329 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:02:51 crc kubenswrapper[4732]: I0131 09:02:51.542852 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:02:51 crc kubenswrapper[4732]: I0131 09:02:51.542901 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:02:51 crc kubenswrapper[4732]: I0131 09:02:51.542987 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:02:51 crc kubenswrapper[4732]: I0131 09:02:51.543041 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7fgvm" Jan 31 09:02:51 crc kubenswrapper[4732]: E0131 09:02:51.543099 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:02:51 crc kubenswrapper[4732]: E0131 09:02:51.543189 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:02:51 crc kubenswrapper[4732]: E0131 09:02:51.543413 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7fgvm" podUID="3bd29a31-1a47-40da-afc5-6c4423067083" Jan 31 09:02:51 crc kubenswrapper[4732]: E0131 09:02:51.543556 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:02:53 crc kubenswrapper[4732]: I0131 09:02:53.542546 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:02:53 crc kubenswrapper[4732]: I0131 09:02:53.542641 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7fgvm" Jan 31 09:02:53 crc kubenswrapper[4732]: I0131 09:02:53.542641 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:02:53 crc kubenswrapper[4732]: E0131 09:02:53.542858 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:02:53 crc kubenswrapper[4732]: I0131 09:02:53.542896 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:02:53 crc kubenswrapper[4732]: E0131 09:02:53.543210 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7fgvm" podUID="3bd29a31-1a47-40da-afc5-6c4423067083" Jan 31 09:02:53 crc kubenswrapper[4732]: E0131 09:02:53.543310 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:02:53 crc kubenswrapper[4732]: E0131 09:02:53.543406 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:02:55 crc kubenswrapper[4732]: I0131 09:02:55.541840 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:02:55 crc kubenswrapper[4732]: I0131 09:02:55.541893 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7fgvm" Jan 31 09:02:55 crc kubenswrapper[4732]: I0131 09:02:55.541911 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:02:55 crc kubenswrapper[4732]: I0131 09:02:55.541988 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:02:55 crc kubenswrapper[4732]: E0131 09:02:55.542093 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:02:55 crc kubenswrapper[4732]: E0131 09:02:55.542253 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:02:55 crc kubenswrapper[4732]: E0131 09:02:55.542428 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7fgvm" podUID="3bd29a31-1a47-40da-afc5-6c4423067083" Jan 31 09:02:55 crc kubenswrapper[4732]: E0131 09:02:55.542650 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:02:57 crc kubenswrapper[4732]: I0131 09:02:57.542562 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:02:57 crc kubenswrapper[4732]: I0131 09:02:57.542613 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:02:57 crc kubenswrapper[4732]: I0131 09:02:57.542623 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:02:57 crc kubenswrapper[4732]: I0131 09:02:57.542562 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7fgvm" Jan 31 09:02:57 crc kubenswrapper[4732]: E0131 09:02:57.542736 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:02:57 crc kubenswrapper[4732]: E0131 09:02:57.542885 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:02:57 crc kubenswrapper[4732]: E0131 09:02:57.542992 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:02:57 crc kubenswrapper[4732]: E0131 09:02:57.543252 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7fgvm" podUID="3bd29a31-1a47-40da-afc5-6c4423067083" Jan 31 09:02:59 crc kubenswrapper[4732]: I0131 09:02:59.542054 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7fgvm" Jan 31 09:02:59 crc kubenswrapper[4732]: E0131 09:02:59.542196 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7fgvm" podUID="3bd29a31-1a47-40da-afc5-6c4423067083" Jan 31 09:02:59 crc kubenswrapper[4732]: I0131 09:02:59.542172 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:02:59 crc kubenswrapper[4732]: I0131 09:02:59.542336 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:02:59 crc kubenswrapper[4732]: I0131 09:02:59.542443 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:02:59 crc kubenswrapper[4732]: E0131 09:02:59.542441 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:02:59 crc kubenswrapper[4732]: E0131 09:02:59.542552 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:02:59 crc kubenswrapper[4732]: E0131 09:02:59.542648 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:02:59 crc kubenswrapper[4732]: I0131 09:02:59.544637 4732 scope.go:117] "RemoveContainer" containerID="b77508f48f518644f99bdda577737a5ed3b909f97a70c1c297c035609ba46fa1" Jan 31 09:02:59 crc kubenswrapper[4732]: E0131 09:02:59.544981 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-8mtkt_openshift-ovn-kubernetes(82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" podUID="82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8" Jan 31 09:03:01 crc kubenswrapper[4732]: I0131 09:03:01.542655 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:03:01 crc kubenswrapper[4732]: I0131 09:03:01.542703 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7fgvm" Jan 31 09:03:01 crc kubenswrapper[4732]: I0131 09:03:01.542743 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:03:01 crc kubenswrapper[4732]: E0131 09:03:01.543356 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7fgvm" podUID="3bd29a31-1a47-40da-afc5-6c4423067083" Jan 31 09:03:01 crc kubenswrapper[4732]: I0131 09:03:01.542800 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:03:01 crc kubenswrapper[4732]: E0131 09:03:01.543159 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:03:01 crc kubenswrapper[4732]: E0131 09:03:01.543494 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:03:01 crc kubenswrapper[4732]: E0131 09:03:01.543635 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:03:02 crc kubenswrapper[4732]: I0131 09:03:02.159415 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-4mxsr_8e23192f-14db-41ef-af89-4a76e325d9c1/kube-multus/1.log" Jan 31 09:03:02 crc kubenswrapper[4732]: I0131 09:03:02.160004 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-4mxsr_8e23192f-14db-41ef-af89-4a76e325d9c1/kube-multus/0.log" Jan 31 09:03:02 crc kubenswrapper[4732]: I0131 09:03:02.160067 4732 generic.go:334] "Generic (PLEG): container finished" podID="8e23192f-14db-41ef-af89-4a76e325d9c1" containerID="456969ec8d447fd7f9acd1803b222ebfb16b02c8dee1959dd936eec29aa1d617" exitCode=1 Jan 31 09:03:02 crc kubenswrapper[4732]: I0131 09:03:02.160104 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4mxsr" event={"ID":"8e23192f-14db-41ef-af89-4a76e325d9c1","Type":"ContainerDied","Data":"456969ec8d447fd7f9acd1803b222ebfb16b02c8dee1959dd936eec29aa1d617"} Jan 31 09:03:02 crc kubenswrapper[4732]: I0131 09:03:02.160149 4732 scope.go:117] "RemoveContainer" containerID="e6af50b44afec0e5f0757efda1ff5567845c4482609a82bafcbf74947c657c56" Jan 31 09:03:02 crc kubenswrapper[4732]: I0131 09:03:02.160720 4732 scope.go:117] "RemoveContainer" containerID="456969ec8d447fd7f9acd1803b222ebfb16b02c8dee1959dd936eec29aa1d617" Jan 31 09:03:02 crc kubenswrapper[4732]: E0131 09:03:02.160918 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-4mxsr_openshift-multus(8e23192f-14db-41ef-af89-4a76e325d9c1)\"" pod="openshift-multus/multus-4mxsr" podUID="8e23192f-14db-41ef-af89-4a76e325d9c1" Jan 31 09:03:02 crc kubenswrapper[4732]: E0131 
09:03:02.518795 4732 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Jan 31 09:03:02 crc kubenswrapper[4732]: E0131 09:03:02.692310 4732 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 31 09:03:03 crc kubenswrapper[4732]: I0131 09:03:03.170265 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-4mxsr_8e23192f-14db-41ef-af89-4a76e325d9c1/kube-multus/1.log" Jan 31 09:03:03 crc kubenswrapper[4732]: I0131 09:03:03.543270 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:03:03 crc kubenswrapper[4732]: I0131 09:03:03.543340 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:03:03 crc kubenswrapper[4732]: E0131 09:03:03.543500 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:03:03 crc kubenswrapper[4732]: I0131 09:03:03.543897 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7fgvm" Jan 31 09:03:03 crc kubenswrapper[4732]: I0131 09:03:03.543921 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:03:03 crc kubenswrapper[4732]: E0131 09:03:03.544024 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:03:03 crc kubenswrapper[4732]: E0131 09:03:03.544257 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7fgvm" podUID="3bd29a31-1a47-40da-afc5-6c4423067083" Jan 31 09:03:03 crc kubenswrapper[4732]: E0131 09:03:03.544396 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:03:05 crc kubenswrapper[4732]: I0131 09:03:05.542460 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:03:05 crc kubenswrapper[4732]: I0131 09:03:05.542560 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:03:05 crc kubenswrapper[4732]: E0131 09:03:05.543609 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:03:05 crc kubenswrapper[4732]: I0131 09:03:05.542615 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:03:05 crc kubenswrapper[4732]: E0131 09:03:05.543787 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:03:05 crc kubenswrapper[4732]: I0131 09:03:05.542460 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7fgvm" Jan 31 09:03:05 crc kubenswrapper[4732]: E0131 09:03:05.543944 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:03:05 crc kubenswrapper[4732]: E0131 09:03:05.544099 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7fgvm" podUID="3bd29a31-1a47-40da-afc5-6c4423067083" Jan 31 09:03:07 crc kubenswrapper[4732]: I0131 09:03:07.542119 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7fgvm" Jan 31 09:03:07 crc kubenswrapper[4732]: I0131 09:03:07.542197 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:03:07 crc kubenswrapper[4732]: I0131 09:03:07.542910 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:03:07 crc kubenswrapper[4732]: E0131 09:03:07.543130 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7fgvm" podUID="3bd29a31-1a47-40da-afc5-6c4423067083" Jan 31 09:03:07 crc kubenswrapper[4732]: I0131 09:03:07.543207 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:03:07 crc kubenswrapper[4732]: E0131 09:03:07.543350 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:03:07 crc kubenswrapper[4732]: E0131 09:03:07.543400 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:03:07 crc kubenswrapper[4732]: E0131 09:03:07.543488 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:03:07 crc kubenswrapper[4732]: E0131 09:03:07.694218 4732 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 31 09:03:09 crc kubenswrapper[4732]: I0131 09:03:09.541705 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:03:09 crc kubenswrapper[4732]: I0131 09:03:09.541707 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:03:09 crc kubenswrapper[4732]: E0131 09:03:09.542172 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:03:09 crc kubenswrapper[4732]: I0131 09:03:09.541743 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:03:09 crc kubenswrapper[4732]: I0131 09:03:09.541738 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7fgvm" Jan 31 09:03:09 crc kubenswrapper[4732]: E0131 09:03:09.542241 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:03:09 crc kubenswrapper[4732]: E0131 09:03:09.542408 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7fgvm" podUID="3bd29a31-1a47-40da-afc5-6c4423067083" Jan 31 09:03:09 crc kubenswrapper[4732]: E0131 09:03:09.542576 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:03:11 crc kubenswrapper[4732]: I0131 09:03:11.541585 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:03:11 crc kubenswrapper[4732]: I0131 09:03:11.541687 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7fgvm" Jan 31 09:03:11 crc kubenswrapper[4732]: E0131 09:03:11.541765 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:03:11 crc kubenswrapper[4732]: E0131 09:03:11.541828 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7fgvm" podUID="3bd29a31-1a47-40da-afc5-6c4423067083" Jan 31 09:03:11 crc kubenswrapper[4732]: I0131 09:03:11.541704 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:03:11 crc kubenswrapper[4732]: I0131 09:03:11.541973 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:03:11 crc kubenswrapper[4732]: E0131 09:03:11.542082 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:03:11 crc kubenswrapper[4732]: E0131 09:03:11.542188 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:03:12 crc kubenswrapper[4732]: I0131 09:03:12.543801 4732 scope.go:117] "RemoveContainer" containerID="b77508f48f518644f99bdda577737a5ed3b909f97a70c1c297c035609ba46fa1" Jan 31 09:03:12 crc kubenswrapper[4732]: E0131 09:03:12.695918 4732 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 31 09:03:13 crc kubenswrapper[4732]: I0131 09:03:13.210124 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8mtkt_82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8/ovnkube-controller/3.log" Jan 31 09:03:13 crc kubenswrapper[4732]: I0131 09:03:13.214222 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" event={"ID":"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8","Type":"ContainerStarted","Data":"b950d29df580418b6e7082c9a672624c14f714944ccbf4cd1d3824810d885ac4"} Jan 31 09:03:13 crc kubenswrapper[4732]: I0131 09:03:13.214818 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" Jan 31 09:03:13 crc kubenswrapper[4732]: I0131 09:03:13.254960 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" podStartSLOduration=106.254926671 podStartE2EDuration="1m46.254926671s" podCreationTimestamp="2026-01-31 09:01:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:03:13.251969774 +0000 UTC m=+131.557845988" watchObservedRunningTime="2026-01-31 09:03:13.254926671 +0000 UTC m=+131.560802915" Jan 31 09:03:13 crc kubenswrapper[4732]: I0131 09:03:13.541140 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-7fgvm"] Jan 31 09:03:13 crc kubenswrapper[4732]: I0131 09:03:13.541637 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7fgvm" Jan 31 09:03:13 crc kubenswrapper[4732]: I0131 09:03:13.541797 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:03:13 crc kubenswrapper[4732]: E0131 09:03:13.541795 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7fgvm" podUID="3bd29a31-1a47-40da-afc5-6c4423067083" Jan 31 09:03:13 crc kubenswrapper[4732]: E0131 09:03:13.541893 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:03:13 crc kubenswrapper[4732]: I0131 09:03:13.541888 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:03:13 crc kubenswrapper[4732]: I0131 09:03:13.541941 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:03:13 crc kubenswrapper[4732]: E0131 09:03:13.542084 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:03:13 crc kubenswrapper[4732]: E0131 09:03:13.542183 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:03:14 crc kubenswrapper[4732]: I0131 09:03:14.217518 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7fgvm" Jan 31 09:03:14 crc kubenswrapper[4732]: E0131 09:03:14.217845 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7fgvm" podUID="3bd29a31-1a47-40da-afc5-6c4423067083" Jan 31 09:03:15 crc kubenswrapper[4732]: I0131 09:03:15.542299 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:03:15 crc kubenswrapper[4732]: I0131 09:03:15.542325 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:03:15 crc kubenswrapper[4732]: E0131 09:03:15.543457 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:03:15 crc kubenswrapper[4732]: I0131 09:03:15.542356 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:03:15 crc kubenswrapper[4732]: E0131 09:03:15.543502 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:03:15 crc kubenswrapper[4732]: E0131 09:03:15.543920 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:03:16 crc kubenswrapper[4732]: I0131 09:03:16.542384 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7fgvm" Jan 31 09:03:16 crc kubenswrapper[4732]: I0131 09:03:16.543036 4732 scope.go:117] "RemoveContainer" containerID="456969ec8d447fd7f9acd1803b222ebfb16b02c8dee1959dd936eec29aa1d617" Jan 31 09:03:16 crc kubenswrapper[4732]: E0131 09:03:16.543945 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7fgvm" podUID="3bd29a31-1a47-40da-afc5-6c4423067083" Jan 31 09:03:17 crc kubenswrapper[4732]: I0131 09:03:17.229917 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-4mxsr_8e23192f-14db-41ef-af89-4a76e325d9c1/kube-multus/1.log" Jan 31 09:03:17 crc kubenswrapper[4732]: I0131 09:03:17.230249 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4mxsr" event={"ID":"8e23192f-14db-41ef-af89-4a76e325d9c1","Type":"ContainerStarted","Data":"98e5c23e9a8bde55626defc76f20f8510954c4ef79d762950e0790a7de4dce4f"} Jan 31 09:03:17 crc kubenswrapper[4732]: I0131 09:03:17.541691 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:03:17 crc kubenswrapper[4732]: I0131 09:03:17.541725 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:03:17 crc kubenswrapper[4732]: E0131 09:03:17.541868 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:03:17 crc kubenswrapper[4732]: E0131 09:03:17.541930 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:03:17 crc kubenswrapper[4732]: I0131 09:03:17.542223 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:03:17 crc kubenswrapper[4732]: E0131 09:03:17.542464 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:03:17 crc kubenswrapper[4732]: E0131 09:03:17.697437 4732 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 31 09:03:18 crc kubenswrapper[4732]: I0131 09:03:18.542541 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7fgvm" Jan 31 09:03:18 crc kubenswrapper[4732]: E0131 09:03:18.542752 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7fgvm" podUID="3bd29a31-1a47-40da-afc5-6c4423067083" Jan 31 09:03:19 crc kubenswrapper[4732]: I0131 09:03:19.542199 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:03:19 crc kubenswrapper[4732]: I0131 09:03:19.542291 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:03:19 crc kubenswrapper[4732]: E0131 09:03:19.542395 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:03:19 crc kubenswrapper[4732]: E0131 09:03:19.542515 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:03:19 crc kubenswrapper[4732]: I0131 09:03:19.542225 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:03:19 crc kubenswrapper[4732]: E0131 09:03:19.542881 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:03:20 crc kubenswrapper[4732]: I0131 09:03:20.542316 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7fgvm" Jan 31 09:03:20 crc kubenswrapper[4732]: E0131 09:03:20.542569 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7fgvm" podUID="3bd29a31-1a47-40da-afc5-6c4423067083" Jan 31 09:03:21 crc kubenswrapper[4732]: I0131 09:03:21.541793 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:03:21 crc kubenswrapper[4732]: I0131 09:03:21.541838 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:03:21 crc kubenswrapper[4732]: E0131 09:03:21.541977 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 09:03:21 crc kubenswrapper[4732]: E0131 09:03:21.542227 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 09:03:21 crc kubenswrapper[4732]: I0131 09:03:21.542293 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:03:21 crc kubenswrapper[4732]: E0131 09:03:21.542519 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.542226 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7fgvm" Jan 31 09:03:22 crc kubenswrapper[4732]: E0131 09:03:22.543110 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7fgvm" podUID="3bd29a31-1a47-40da-afc5-6c4423067083" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.802734 4732 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.856626 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-tg4xc"] Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.857426 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-5s6dp"] Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.858083 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5s6dp" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.858853 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-tg4xc" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.863507 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ncqs4"] Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.863938 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-pxn6w"] Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.864187 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-hzs92"] Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.864499 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-sf8wd"] Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.866035 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ncqs4" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.866220 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-pxn6w" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.866851 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.867040 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.867327 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.867582 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.867643 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-85mvk"] Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.867981 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.868072 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85mvk" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.868334 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hzs92" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.868528 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sf8wd" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.869133 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.869425 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.869622 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.869742 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.874513 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-54nxd"] Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.874797 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.875478 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.875844 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-54nxd" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.876442 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-xprfh"] Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.877301 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.878783 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-xprfh" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.880217 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.883055 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.883280 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.889037 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vkrgj"] Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.889453 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.890522 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.891494 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.891979 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.892995 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.915586 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-76d6v"] Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.916129 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-76d6v" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.916698 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.917162 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.917302 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vkrgj" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.917409 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.917790 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.918290 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.918411 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.918477 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.918551 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.918702 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.918816 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.918909 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.918935 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.918989 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-r9cz8"] Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.919279 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-r9cz8" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.919540 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.919740 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.919819 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-rt2jr"] Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.920300 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-rt2jr" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.920391 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.920474 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.920604 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.920804 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.921098 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.921391 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.921707 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.922330 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.922648 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.923057 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.923251 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.923371 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.923424 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.923550 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.923718 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.923818 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.923892 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.923963 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.924060 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.923320 4732 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.924648 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.926561 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.926716 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.935047 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.935248 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.935887 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.936011 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.937006 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.937223 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.937833 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-8t8ks"] Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.938233 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.938528 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.938732 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.938979 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.939105 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-c8t6l"] Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.939747 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-c8t6l" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.939165 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.939242 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.939839 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-8t8ks" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.939745 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.940036 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.941391 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.945218 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.950456 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ncqs4"] Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.950880 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/62bbd0d6-eba8-4737-9608-3f7a3dd6a157-machine-approver-tls\") pod \"machine-approver-56656f9798-sf8wd\" (UID: \"62bbd0d6-eba8-4737-9608-3f7a3dd6a157\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sf8wd" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.950918 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/81b523ca-b564-45d4-bad5-f7e236f2e6d0-node-pullsecrets\") pod \"apiserver-76f77b778f-xprfh\" (UID: \"81b523ca-b564-45d4-bad5-f7e236f2e6d0\") " pod="openshift-apiserver/apiserver-76f77b778f-xprfh" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.950946 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/6377a401-b10b-455a-8906-f6706302b91f-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-85mvk\" (UID: \"6377a401-b10b-455a-8906-f6706302b91f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85mvk" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.950968 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r24m4\" (UniqueName: \"kubernetes.io/projected/922314ab-f199-4117-acab-bc641c1cda57-kube-api-access-r24m4\") pod \"openshift-apiserver-operator-796bbdcf4f-ncqs4\" (UID: \"922314ab-f199-4117-acab-bc641c1cda57\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ncqs4" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.950990 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/6377a401-b10b-455a-8906-f6706302b91f-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-85mvk\" (UID: \"6377a401-b10b-455a-8906-f6706302b91f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85mvk" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.951011 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/541ea3c2-891c-4c3e-81fd-9d340112c62b-config\") pod \"route-controller-manager-6576b87f9c-5s6dp\" (UID: \"541ea3c2-891c-4c3e-81fd-9d340112c62b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5s6dp" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.951044 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvpv9\" (UniqueName: \"kubernetes.io/projected/edb14eaf-7738-4139-9b1b-9557e7e37ffc-kube-api-access-hvpv9\") pod \"cluster-samples-operator-665b6dd947-vkrgj\" (UID: \"edb14eaf-7738-4139-9b1b-9557e7e37ffc\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vkrgj" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.951068 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/16c9233d-0b27-4994-bc3d-62d4ec86a4ec-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-r9cz8\" (UID: \"16c9233d-0b27-4994-bc3d-62d4ec86a4ec\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-r9cz8" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.951089 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lc5nb\" (UniqueName: \"kubernetes.io/projected/62bbd0d6-eba8-4737-9608-3f7a3dd6a157-kube-api-access-lc5nb\") pod \"machine-approver-56656f9798-sf8wd\" (UID: \"62bbd0d6-eba8-4737-9608-3f7a3dd6a157\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sf8wd" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.951110 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e9ce5ab1-4da9-47a4-83b9-3c7d8af6d55a-images\") pod \"machine-api-operator-5694c8668f-54nxd\" (UID: \"e9ce5ab1-4da9-47a4-83b9-3c7d8af6d55a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-54nxd" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.951132 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xf682\" (UniqueName: \"kubernetes.io/projected/16c9233d-0b27-4994-bc3d-62d4ec86a4ec-kube-api-access-xf682\") pod \"cluster-image-registry-operator-dc59b4c8b-r9cz8\" (UID: \"16c9233d-0b27-4994-bc3d-62d4ec86a4ec\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-r9cz8" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.951166 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/62bbd0d6-eba8-4737-9608-3f7a3dd6a157-auth-proxy-config\") pod \"machine-approver-56656f9798-sf8wd\" (UID: \"62bbd0d6-eba8-4737-9608-3f7a3dd6a157\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sf8wd" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.951189 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/499830ff-8add-4caf-b469-d1cbde569fb7-trusted-ca\") pod \"console-operator-58897d9998-76d6v\" (UID: \"499830ff-8add-4caf-b469-d1cbde569fb7\") " pod="openshift-console-operator/console-operator-58897d9998-76d6v" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.951211 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxjwq\" (UniqueName: \"kubernetes.io/projected/e9ce5ab1-4da9-47a4-83b9-3c7d8af6d55a-kube-api-access-lxjwq\") pod \"machine-api-operator-5694c8668f-54nxd\" (UID: \"e9ce5ab1-4da9-47a4-83b9-3c7d8af6d55a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-54nxd" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.951232 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/922314ab-f199-4117-acab-bc641c1cda57-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-ncqs4\" (UID: \"922314ab-f199-4117-acab-bc641c1cda57\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ncqs4" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.951254 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ch8m2\" (UniqueName: \"kubernetes.io/projected/576b5a44-3c4c-4905-8d89-caed3b1eb43f-kube-api-access-ch8m2\") pod \"openshift-config-operator-7777fb866f-hzs92\" (UID: \"576b5a44-3c4c-4905-8d89-caed3b1eb43f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hzs92" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.951277 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/16c9233d-0b27-4994-bc3d-62d4ec86a4ec-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-r9cz8\" (UID: \"16c9233d-0b27-4994-bc3d-62d4ec86a4ec\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-r9cz8" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.951723 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/219a04b6-e7bd-4138-bcc7-4f650537aa24-config\") pod \"controller-manager-879f6c89f-tg4xc\" (UID: \"219a04b6-e7bd-4138-bcc7-4f650537aa24\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tg4xc" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.951758 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62bbd0d6-eba8-4737-9608-3f7a3dd6a157-config\") pod \"machine-approver-56656f9798-sf8wd\" (UID: \"62bbd0d6-eba8-4737-9608-3f7a3dd6a157\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sf8wd" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.951785 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6377a401-b10b-455a-8906-f6706302b91f-audit-policies\") pod \"apiserver-7bbb656c7d-85mvk\" (UID: \"6377a401-b10b-455a-8906-f6706302b91f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85mvk" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.951811 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38051ff1-1715-41dd-aa28-53aea32c8e05-config\") pod \"authentication-operator-69f744f599-pxn6w\" (UID: \"38051ff1-1715-41dd-aa28-53aea32c8e05\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pxn6w" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.951835 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/6377a401-b10b-455a-8906-f6706302b91f-encryption-config\") pod \"apiserver-7bbb656c7d-85mvk\" (UID: \"6377a401-b10b-455a-8906-f6706302b91f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85mvk" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.951858 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/541ea3c2-891c-4c3e-81fd-9d340112c62b-client-ca\") pod \"route-controller-manager-6576b87f9c-5s6dp\" (UID: \"541ea3c2-891c-4c3e-81fd-9d340112c62b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5s6dp" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.951879 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhhgt\" (UniqueName: \"kubernetes.io/projected/541ea3c2-891c-4c3e-81fd-9d340112c62b-kube-api-access-jhhgt\") pod \"route-controller-manager-6576b87f9c-5s6dp\" (UID: \"541ea3c2-891c-4c3e-81fd-9d340112c62b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5s6dp" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.951901 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdzrp\" (UniqueName: \"kubernetes.io/projected/81e1781e-a935-4f3f-b2aa-9a0807f43c73-kube-api-access-wdzrp\") pod \"downloads-7954f5f757-rt2jr\" (UID: \"81e1781e-a935-4f3f-b2aa-9a0807f43c73\") " pod="openshift-console/downloads-7954f5f757-rt2jr" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.951924 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/81b523ca-b564-45d4-bad5-f7e236f2e6d0-audit\") pod \"apiserver-76f77b778f-xprfh\" (UID: \"81b523ca-b564-45d4-bad5-f7e236f2e6d0\") " pod="openshift-apiserver/apiserver-76f77b778f-xprfh" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.951943 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9mtk\" (UniqueName: \"kubernetes.io/projected/81b523ca-b564-45d4-bad5-f7e236f2e6d0-kube-api-access-s9mtk\") pod \"apiserver-76f77b778f-xprfh\" (UID: \"81b523ca-b564-45d4-bad5-f7e236f2e6d0\") " pod="openshift-apiserver/apiserver-76f77b778f-xprfh" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.951963 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/576b5a44-3c4c-4905-8d89-caed3b1eb43f-serving-cert\") pod \"openshift-config-operator-7777fb866f-hzs92\" (UID: \"576b5a44-3c4c-4905-8d89-caed3b1eb43f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hzs92" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.951997 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/81b523ca-b564-45d4-bad5-f7e236f2e6d0-etcd-serving-ca\") pod \"apiserver-76f77b778f-xprfh\" (UID: \"81b523ca-b564-45d4-bad5-f7e236f2e6d0\") " pod="openshift-apiserver/apiserver-76f77b778f-xprfh" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.952021 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/81b523ca-b564-45d4-bad5-f7e236f2e6d0-encryption-config\") pod \"apiserver-76f77b778f-xprfh\" (UID: \"81b523ca-b564-45d4-bad5-f7e236f2e6d0\") " pod="openshift-apiserver/apiserver-76f77b778f-xprfh" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.952041 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6377a401-b10b-455a-8906-f6706302b91f-serving-cert\") pod \"apiserver-7bbb656c7d-85mvk\" (UID: \"6377a401-b10b-455a-8906-f6706302b91f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85mvk" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.952072 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/219a04b6-e7bd-4138-bcc7-4f650537aa24-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-tg4xc\" (UID: \"219a04b6-e7bd-4138-bcc7-4f650537aa24\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tg4xc" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.952094 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/38051ff1-1715-41dd-aa28-53aea32c8e05-serving-cert\") pod \"authentication-operator-69f744f599-pxn6w\" (UID: \"38051ff1-1715-41dd-aa28-53aea32c8e05\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pxn6w" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.952116 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/576b5a44-3c4c-4905-8d89-caed3b1eb43f-available-featuregates\") pod \"openshift-config-operator-7777fb866f-hzs92\" (UID: \"576b5a44-3c4c-4905-8d89-caed3b1eb43f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hzs92" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.952137 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/81b523ca-b564-45d4-bad5-f7e236f2e6d0-serving-cert\") pod \"apiserver-76f77b778f-xprfh\" (UID: \"81b523ca-b564-45d4-bad5-f7e236f2e6d0\") " pod="openshift-apiserver/apiserver-76f77b778f-xprfh" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.952158 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6377a401-b10b-455a-8906-f6706302b91f-audit-dir\") pod \"apiserver-7bbb656c7d-85mvk\" (UID: \"6377a401-b10b-455a-8906-f6706302b91f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85mvk" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.952182 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bws5c\" (UniqueName: \"kubernetes.io/projected/219a04b6-e7bd-4138-bcc7-4f650537aa24-kube-api-access-bws5c\") pod 
\"controller-manager-879f6c89f-tg4xc\" (UID: \"219a04b6-e7bd-4138-bcc7-4f650537aa24\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tg4xc" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.953788 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/219a04b6-e7bd-4138-bcc7-4f650537aa24-client-ca\") pod \"controller-manager-879f6c89f-tg4xc\" (UID: \"219a04b6-e7bd-4138-bcc7-4f650537aa24\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tg4xc" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.953825 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p24t7\" (UniqueName: \"kubernetes.io/projected/38051ff1-1715-41dd-aa28-53aea32c8e05-kube-api-access-p24t7\") pod \"authentication-operator-69f744f599-pxn6w\" (UID: \"38051ff1-1715-41dd-aa28-53aea32c8e05\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pxn6w" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.953846 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6377a401-b10b-455a-8906-f6706302b91f-etcd-client\") pod \"apiserver-7bbb656c7d-85mvk\" (UID: \"6377a401-b10b-455a-8906-f6706302b91f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85mvk" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.953882 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/922314ab-f199-4117-acab-bc641c1cda57-config\") pod \"openshift-apiserver-operator-796bbdcf4f-ncqs4\" (UID: \"922314ab-f199-4117-acab-bc641c1cda57\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ncqs4" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.953903 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/81b523ca-b564-45d4-bad5-f7e236f2e6d0-trusted-ca-bundle\") pod \"apiserver-76f77b778f-xprfh\" (UID: \"81b523ca-b564-45d4-bad5-f7e236f2e6d0\") " pod="openshift-apiserver/apiserver-76f77b778f-xprfh" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.953921 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/81b523ca-b564-45d4-bad5-f7e236f2e6d0-audit-dir\") pod \"apiserver-76f77b778f-xprfh\" (UID: \"81b523ca-b564-45d4-bad5-f7e236f2e6d0\") " pod="openshift-apiserver/apiserver-76f77b778f-xprfh" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.954175 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81b523ca-b564-45d4-bad5-f7e236f2e6d0-config\") pod \"apiserver-76f77b778f-xprfh\" (UID: \"81b523ca-b564-45d4-bad5-f7e236f2e6d0\") " pod="openshift-apiserver/apiserver-76f77b778f-xprfh" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.954204 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/541ea3c2-891c-4c3e-81fd-9d340112c62b-serving-cert\") pod \"route-controller-manager-6576b87f9c-5s6dp\" (UID: \"541ea3c2-891c-4c3e-81fd-9d340112c62b\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5s6dp" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.954225 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/38051ff1-1715-41dd-aa28-53aea32c8e05-service-ca-bundle\") pod \"authentication-operator-69f744f599-pxn6w\" (UID: \"38051ff1-1715-41dd-aa28-53aea32c8e05\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pxn6w" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.954247 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/e9ce5ab1-4da9-47a4-83b9-3c7d8af6d55a-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-54nxd\" (UID: \"e9ce5ab1-4da9-47a4-83b9-3c7d8af6d55a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-54nxd" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.954765 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-hzs92"] Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.952040 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.954910 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/219a04b6-e7bd-4138-bcc7-4f650537aa24-serving-cert\") pod \"controller-manager-879f6c89f-tg4xc\" (UID: \"219a04b6-e7bd-4138-bcc7-4f650537aa24\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tg4xc" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.954974 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/16c9233d-0b27-4994-bc3d-62d4ec86a4ec-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-r9cz8\" (UID: \"16c9233d-0b27-4994-bc3d-62d4ec86a4ec\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-r9cz8" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.954993 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.952316 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.952884 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.955008 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/81b523ca-b564-45d4-bad5-f7e236f2e6d0-etcd-client\") pod \"apiserver-76f77b778f-xprfh\" (UID: \"81b523ca-b564-45d4-bad5-f7e236f2e6d0\") " pod="openshift-apiserver/apiserver-76f77b778f-xprfh" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.955196 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/38051ff1-1715-41dd-aa28-53aea32c8e05-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-pxn6w\" (UID: 
\"38051ff1-1715-41dd-aa28-53aea32c8e05\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pxn6w" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.955216 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.955223 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9ndf\" (UniqueName: \"kubernetes.io/projected/6377a401-b10b-455a-8906-f6706302b91f-kube-api-access-v9ndf\") pod \"apiserver-7bbb656c7d-85mvk\" (UID: \"6377a401-b10b-455a-8906-f6706302b91f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85mvk" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.954268 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.955249 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/499830ff-8add-4caf-b469-d1cbde569fb7-serving-cert\") pod \"console-operator-58897d9998-76d6v\" (UID: \"499830ff-8add-4caf-b469-d1cbde569fb7\") " pod="openshift-console-operator/console-operator-58897d9998-76d6v" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.955275 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/81b523ca-b564-45d4-bad5-f7e236f2e6d0-image-import-ca\") pod \"apiserver-76f77b778f-xprfh\" (UID: \"81b523ca-b564-45d4-bad5-f7e236f2e6d0\") " pod="openshift-apiserver/apiserver-76f77b778f-xprfh" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.954363 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.955297 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e9ce5ab1-4da9-47a4-83b9-3c7d8af6d55a-config\") pod \"machine-api-operator-5694c8668f-54nxd\" (UID: \"e9ce5ab1-4da9-47a4-83b9-3c7d8af6d55a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-54nxd" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.955327 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/edb14eaf-7738-4139-9b1b-9557e7e37ffc-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-vkrgj\" (UID: \"edb14eaf-7738-4139-9b1b-9557e7e37ffc\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vkrgj" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.954625 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.955378 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.954755 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.955442 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/499830ff-8add-4caf-b469-d1cbde569fb7-config\") pod \"console-operator-58897d9998-76d6v\" (UID: \"499830ff-8add-4caf-b469-d1cbde569fb7\") " pod="openshift-console-operator/console-operator-58897d9998-76d6v" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.955489 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jr859\" (UniqueName: \"kubernetes.io/projected/499830ff-8add-4caf-b469-d1cbde569fb7-kube-api-access-jr859\") pod \"console-operator-58897d9998-76d6v\" (UID: \"499830ff-8add-4caf-b469-d1cbde569fb7\") " pod="openshift-console-operator/console-operator-58897d9998-76d6v" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.956725 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.957332 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-tg4xc"] Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.969471 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.972269 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.972702 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.974558 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.975195 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.975427 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.976691 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.977123 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.977758 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-pxn6w"] Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.974886 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.989526 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.989903 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.991974 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.996105 4732 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-85mvk"] Jan 31 09:03:22 crc kubenswrapper[4732]: I0131 09:03:22.996351 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.000760 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-5s6dp"] Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.003515 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.009202 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-99dtb"] Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.009992 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-99dtb" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.011643 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.012377 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.012529 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-vpnm5"] Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.012912 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.013086 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-vpnm5" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.017341 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-fsss9"] Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.019076 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dv6hv"] Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.019468 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-fsss9" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.019542 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dv6hv" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.020405 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-76d6v"] Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.020635 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.020750 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zcd2h"] Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.021174 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zcd2h" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.021628 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-r9cz8"] Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.023756 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-44wvz"] Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.024447 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-44wvz" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.024965 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-54nxd"] Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.026164 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-c8t6l"] Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.028681 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-h5q9f"] Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.029191 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-h5q9f" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.030590 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7fqsl"] Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.031092 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v69mc"] Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.031707 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v69mc" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.032009 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7fqsl" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.033500 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-cx8cr"] Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.033902 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-cx8cr" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.069790 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-kgqfp"] Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.070162 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r24m4\" (UniqueName: \"kubernetes.io/projected/922314ab-f199-4117-acab-bc641c1cda57-kube-api-access-r24m4\") pod \"openshift-apiserver-operator-796bbdcf4f-ncqs4\" (UID: \"922314ab-f199-4117-acab-bc641c1cda57\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ncqs4" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.070196 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6377a401-b10b-455a-8906-f6706302b91f-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-85mvk\" (UID: \"6377a401-b10b-455a-8906-f6706302b91f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85mvk" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.070214 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/541ea3c2-891c-4c3e-81fd-9d340112c62b-config\") pod \"route-controller-manager-6576b87f9c-5s6dp\" (UID: \"541ea3c2-891c-4c3e-81fd-9d340112c62b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5s6dp" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.070243 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-c8t6l\" (UID: \"c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8t6l" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.070265 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvpv9\" (UniqueName: \"kubernetes.io/projected/edb14eaf-7738-4139-9b1b-9557e7e37ffc-kube-api-access-hvpv9\") pod \"cluster-samples-operator-665b6dd947-vkrgj\" (UID: \"edb14eaf-7738-4139-9b1b-9557e7e37ffc\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vkrgj" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.070284 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/16c9233d-0b27-4994-bc3d-62d4ec86a4ec-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-r9cz8\" (UID: \"16c9233d-0b27-4994-bc3d-62d4ec86a4ec\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-r9cz8" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.070301 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b35d0df8-53f0-4787-b0b4-c93be28f0127-console-serving-cert\") pod \"console-f9d7485db-8t8ks\" (UID: \"b35d0df8-53f0-4787-b0b4-c93be28f0127\") " pod="openshift-console/console-f9d7485db-8t8ks" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.070317 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/b35d0df8-53f0-4787-b0b4-c93be28f0127-oauth-serving-cert\") pod \"console-f9d7485db-8t8ks\" (UID: \"b35d0df8-53f0-4787-b0b4-c93be28f0127\") " pod="openshift-console/console-f9d7485db-8t8ks" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.070335 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lc5nb\" (UniqueName: \"kubernetes.io/projected/62bbd0d6-eba8-4737-9608-3f7a3dd6a157-kube-api-access-lc5nb\") pod \"machine-approver-56656f9798-sf8wd\" (UID: \"62bbd0d6-eba8-4737-9608-3f7a3dd6a157\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sf8wd" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.070373 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-4sk2n"] Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.070825 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4sk2n" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.071050 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-kgqfp" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.071623 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/541ea3c2-891c-4c3e-81fd-9d340112c62b-config\") pod \"route-controller-manager-6576b87f9c-5s6dp\" (UID: \"541ea3c2-891c-4c3e-81fd-9d340112c62b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5s6dp" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.071692 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-c8t6l\" (UID: \"c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8t6l" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.071712 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/639dacb9-2ea3-49d2-b5c4-996992c8e16a-etcd-ca\") pod \"etcd-operator-b45778765-fsss9\" (UID: \"639dacb9-2ea3-49d2-b5c4-996992c8e16a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fsss9" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.071728 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/639dacb9-2ea3-49d2-b5c4-996992c8e16a-serving-cert\") pod \"etcd-operator-b45778765-fsss9\" (UID: \"639dacb9-2ea3-49d2-b5c4-996992c8e16a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fsss9" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.071748 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xf682\" (UniqueName: \"kubernetes.io/projected/16c9233d-0b27-4994-bc3d-62d4ec86a4ec-kube-api-access-xf682\") pod \"cluster-image-registry-operator-dc59b4c8b-r9cz8\" (UID: \"16c9233d-0b27-4994-bc3d-62d4ec86a4ec\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-r9cz8" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.071767 4732 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e9ce5ab1-4da9-47a4-83b9-3c7d8af6d55a-images\") pod \"machine-api-operator-5694c8668f-54nxd\" (UID: \"e9ce5ab1-4da9-47a4-83b9-3c7d8af6d55a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-54nxd" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.071795 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/62bbd0d6-eba8-4737-9608-3f7a3dd6a157-auth-proxy-config\") pod \"machine-approver-56656f9798-sf8wd\" (UID: \"62bbd0d6-eba8-4737-9608-3f7a3dd6a157\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sf8wd" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.071811 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-c8t6l\" (UID: \"c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8t6l" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.071829 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-c8t6l\" (UID: \"c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8t6l" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.071849 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sz728\" (UniqueName: \"kubernetes.io/projected/b35d0df8-53f0-4787-b0b4-c93be28f0127-kube-api-access-sz728\") pod \"console-f9d7485db-8t8ks\" (UID: \"b35d0df8-53f0-4787-b0b4-c93be28f0127\") " pod="openshift-console/console-f9d7485db-8t8ks" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.071873 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/499830ff-8add-4caf-b469-d1cbde569fb7-trusted-ca\") pod \"console-operator-58897d9998-76d6v\" (UID: \"499830ff-8add-4caf-b469-d1cbde569fb7\") " pod="openshift-console-operator/console-operator-58897d9998-76d6v" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.071902 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxjwq\" (UniqueName: \"kubernetes.io/projected/e9ce5ab1-4da9-47a4-83b9-3c7d8af6d55a-kube-api-access-lxjwq\") pod \"machine-api-operator-5694c8668f-54nxd\" (UID: \"e9ce5ab1-4da9-47a4-83b9-3c7d8af6d55a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-54nxd" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.071924 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b35d0df8-53f0-4787-b0b4-c93be28f0127-trusted-ca-bundle\") pod \"console-f9d7485db-8t8ks\" (UID: \"b35d0df8-53f0-4787-b0b4-c93be28f0127\") " pod="openshift-console/console-f9d7485db-8t8ks" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.071947 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/922314ab-f199-4117-acab-bc641c1cda57-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-ncqs4\" (UID: \"922314ab-f199-4117-acab-bc641c1cda57\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ncqs4" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.071965 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/639dacb9-2ea3-49d2-b5c4-996992c8e16a-config\") pod \"etcd-operator-b45778765-fsss9\" (UID: \"639dacb9-2ea3-49d2-b5c4-996992c8e16a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fsss9" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.071983 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62bbd0d6-eba8-4737-9608-3f7a3dd6a157-config\") pod \"machine-approver-56656f9798-sf8wd\" (UID: \"62bbd0d6-eba8-4737-9608-3f7a3dd6a157\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sf8wd" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.072003 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ch8m2\" (UniqueName: \"kubernetes.io/projected/576b5a44-3c4c-4905-8d89-caed3b1eb43f-kube-api-access-ch8m2\") pod \"openshift-config-operator-7777fb866f-hzs92\" (UID: \"576b5a44-3c4c-4905-8d89-caed3b1eb43f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hzs92" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.072021 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/16c9233d-0b27-4994-bc3d-62d4ec86a4ec-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-r9cz8\" (UID: \"16c9233d-0b27-4994-bc3d-62d4ec86a4ec\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-r9cz8" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.072036 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/219a04b6-e7bd-4138-bcc7-4f650537aa24-config\") pod \"controller-manager-879f6c89f-tg4xc\" (UID: \"219a04b6-e7bd-4138-bcc7-4f650537aa24\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tg4xc" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.072053 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6377a401-b10b-455a-8906-f6706302b91f-audit-policies\") pod \"apiserver-7bbb656c7d-85mvk\" (UID: \"6377a401-b10b-455a-8906-f6706302b91f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85mvk" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.072079 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20-audit-policies\") pod \"oauth-openshift-558db77b4-c8t6l\" (UID: \"c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8t6l" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.072095 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20-audit-dir\") pod \"oauth-openshift-558db77b4-c8t6l\" (UID: 
\"c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8t6l" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.072110 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-c8t6l\" (UID: \"c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8t6l" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.072130 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-c8t6l\" (UID: \"c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8t6l" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.072150 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38051ff1-1715-41dd-aa28-53aea32c8e05-config\") pod \"authentication-operator-69f744f599-pxn6w\" (UID: \"38051ff1-1715-41dd-aa28-53aea32c8e05\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pxn6w" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.072164 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/6377a401-b10b-455a-8906-f6706302b91f-encryption-config\") pod \"apiserver-7bbb656c7d-85mvk\" (UID: \"6377a401-b10b-455a-8906-f6706302b91f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85mvk" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.072179 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/541ea3c2-891c-4c3e-81fd-9d340112c62b-client-ca\") pod \"route-controller-manager-6576b87f9c-5s6dp\" (UID: \"541ea3c2-891c-4c3e-81fd-9d340112c62b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5s6dp" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.072194 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-c8t6l\" (UID: \"c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8t6l" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.072211 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/81b523ca-b564-45d4-bad5-f7e236f2e6d0-audit\") pod \"apiserver-76f77b778f-xprfh\" (UID: \"81b523ca-b564-45d4-bad5-f7e236f2e6d0\") " pod="openshift-apiserver/apiserver-76f77b778f-xprfh" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.072228 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9mtk\" (UniqueName: \"kubernetes.io/projected/81b523ca-b564-45d4-bad5-f7e236f2e6d0-kube-api-access-s9mtk\") pod \"apiserver-76f77b778f-xprfh\" (UID: \"81b523ca-b564-45d4-bad5-f7e236f2e6d0\") " pod="openshift-apiserver/apiserver-76f77b778f-xprfh" 
Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.072247 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhhgt\" (UniqueName: \"kubernetes.io/projected/541ea3c2-891c-4c3e-81fd-9d340112c62b-kube-api-access-jhhgt\") pod \"route-controller-manager-6576b87f9c-5s6dp\" (UID: \"541ea3c2-891c-4c3e-81fd-9d340112c62b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5s6dp" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.072261 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdzrp\" (UniqueName: \"kubernetes.io/projected/81e1781e-a935-4f3f-b2aa-9a0807f43c73-kube-api-access-wdzrp\") pod \"downloads-7954f5f757-rt2jr\" (UID: \"81e1781e-a935-4f3f-b2aa-9a0807f43c73\") " pod="openshift-console/downloads-7954f5f757-rt2jr" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.072277 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cba3ef6a-8439-4317-bae9-01618d78512a-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-dv6hv\" (UID: \"cba3ef6a-8439-4317-bae9-01618d78512a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dv6hv" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.072304 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/576b5a44-3c4c-4905-8d89-caed3b1eb43f-serving-cert\") pod \"openshift-config-operator-7777fb866f-hzs92\" (UID: \"576b5a44-3c4c-4905-8d89-caed3b1eb43f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hzs92" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.072320 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/64e4fed2-f31d-4d3f-ad77-d55fdddacb4d-bound-sa-token\") pod \"ingress-operator-5b745b69d9-44wvz\" (UID: \"64e4fed2-f31d-4d3f-ad77-d55fdddacb4d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-44wvz" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.072334 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b35d0df8-53f0-4787-b0b4-c93be28f0127-console-oauth-config\") pod \"console-f9d7485db-8t8ks\" (UID: \"b35d0df8-53f0-4787-b0b4-c93be28f0127\") " pod="openshift-console/console-f9d7485db-8t8ks" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.072357 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/81b523ca-b564-45d4-bad5-f7e236f2e6d0-etcd-serving-ca\") pod \"apiserver-76f77b778f-xprfh\" (UID: \"81b523ca-b564-45d4-bad5-f7e236f2e6d0\") " pod="openshift-apiserver/apiserver-76f77b778f-xprfh" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.072371 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/81b523ca-b564-45d4-bad5-f7e236f2e6d0-encryption-config\") pod \"apiserver-76f77b778f-xprfh\" (UID: \"81b523ca-b564-45d4-bad5-f7e236f2e6d0\") " pod="openshift-apiserver/apiserver-76f77b778f-xprfh" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.072387 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/6377a401-b10b-455a-8906-f6706302b91f-serving-cert\") pod \"apiserver-7bbb656c7d-85mvk\" (UID: \"6377a401-b10b-455a-8906-f6706302b91f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85mvk" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.072402 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/639dacb9-2ea3-49d2-b5c4-996992c8e16a-etcd-client\") pod \"etcd-operator-b45778765-fsss9\" (UID: \"639dacb9-2ea3-49d2-b5c4-996992c8e16a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fsss9" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.072422 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/219a04b6-e7bd-4138-bcc7-4f650537aa24-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-tg4xc\" (UID: \"219a04b6-e7bd-4138-bcc7-4f650537aa24\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tg4xc" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.072436 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/38051ff1-1715-41dd-aa28-53aea32c8e05-serving-cert\") pod \"authentication-operator-69f744f599-pxn6w\" (UID: \"38051ff1-1715-41dd-aa28-53aea32c8e05\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pxn6w" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.072451 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cba3ef6a-8439-4317-bae9-01618d78512a-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-dv6hv\" (UID: \"cba3ef6a-8439-4317-bae9-01618d78512a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dv6hv" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.072469 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/576b5a44-3c4c-4905-8d89-caed3b1eb43f-available-featuregates\") pod \"openshift-config-operator-7777fb866f-hzs92\" (UID: \"576b5a44-3c4c-4905-8d89-caed3b1eb43f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hzs92" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.072486 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/81b523ca-b564-45d4-bad5-f7e236f2e6d0-serving-cert\") pod \"apiserver-76f77b778f-xprfh\" (UID: \"81b523ca-b564-45d4-bad5-f7e236f2e6d0\") " pod="openshift-apiserver/apiserver-76f77b778f-xprfh" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.072501 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6377a401-b10b-455a-8906-f6706302b91f-audit-dir\") pod \"apiserver-7bbb656c7d-85mvk\" (UID: \"6377a401-b10b-455a-8906-f6706302b91f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85mvk" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.072517 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20-v4-0-config-system-session\") pod 
\"oauth-openshift-558db77b4-c8t6l\" (UID: \"c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8t6l" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.072537 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bws5c\" (UniqueName: \"kubernetes.io/projected/219a04b6-e7bd-4138-bcc7-4f650537aa24-kube-api-access-bws5c\") pod \"controller-manager-879f6c89f-tg4xc\" (UID: \"219a04b6-e7bd-4138-bcc7-4f650537aa24\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tg4xc" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.072553 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqgz8\" (UniqueName: \"kubernetes.io/projected/8cc29c02-baeb-4f46-92d6-684343509ae1-kube-api-access-vqgz8\") pod \"dns-operator-744455d44c-vpnm5\" (UID: \"8cc29c02-baeb-4f46-92d6-684343509ae1\") " pod="openshift-dns-operator/dns-operator-744455d44c-vpnm5" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.072569 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/639dacb9-2ea3-49d2-b5c4-996992c8e16a-etcd-service-ca\") pod \"etcd-operator-b45778765-fsss9\" (UID: \"639dacb9-2ea3-49d2-b5c4-996992c8e16a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fsss9" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.072584 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxdbz\" (UniqueName: \"kubernetes.io/projected/639dacb9-2ea3-49d2-b5c4-996992c8e16a-kube-api-access-qxdbz\") pod \"etcd-operator-b45778765-fsss9\" (UID: \"639dacb9-2ea3-49d2-b5c4-996992c8e16a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fsss9" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.072609 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/922314ab-f199-4117-acab-bc641c1cda57-config\") pod \"openshift-apiserver-operator-796bbdcf4f-ncqs4\" (UID: \"922314ab-f199-4117-acab-bc641c1cda57\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ncqs4" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.072624 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/81b523ca-b564-45d4-bad5-f7e236f2e6d0-trusted-ca-bundle\") pod \"apiserver-76f77b778f-xprfh\" (UID: \"81b523ca-b564-45d4-bad5-f7e236f2e6d0\") " pod="openshift-apiserver/apiserver-76f77b778f-xprfh" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.072638 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/81b523ca-b564-45d4-bad5-f7e236f2e6d0-audit-dir\") pod \"apiserver-76f77b778f-xprfh\" (UID: \"81b523ca-b564-45d4-bad5-f7e236f2e6d0\") " pod="openshift-apiserver/apiserver-76f77b778f-xprfh" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.072654 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/219a04b6-e7bd-4138-bcc7-4f650537aa24-client-ca\") pod \"controller-manager-879f6c89f-tg4xc\" (UID: \"219a04b6-e7bd-4138-bcc7-4f650537aa24\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tg4xc" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 
09:03:23.072689 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p24t7\" (UniqueName: \"kubernetes.io/projected/38051ff1-1715-41dd-aa28-53aea32c8e05-kube-api-access-p24t7\") pod \"authentication-operator-69f744f599-pxn6w\" (UID: \"38051ff1-1715-41dd-aa28-53aea32c8e05\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pxn6w" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.072708 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6377a401-b10b-455a-8906-f6706302b91f-etcd-client\") pod \"apiserver-7bbb656c7d-85mvk\" (UID: \"6377a401-b10b-455a-8906-f6706302b91f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85mvk" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.072725 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvc79\" (UniqueName: \"kubernetes.io/projected/64e4fed2-f31d-4d3f-ad77-d55fdddacb4d-kube-api-access-nvc79\") pod \"ingress-operator-5b745b69d9-44wvz\" (UID: \"64e4fed2-f31d-4d3f-ad77-d55fdddacb4d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-44wvz" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.072745 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81b523ca-b564-45d4-bad5-f7e236f2e6d0-config\") pod \"apiserver-76f77b778f-xprfh\" (UID: \"81b523ca-b564-45d4-bad5-f7e236f2e6d0\") " pod="openshift-apiserver/apiserver-76f77b778f-xprfh" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.072760 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/541ea3c2-891c-4c3e-81fd-9d340112c62b-serving-cert\") pod \"route-controller-manager-6576b87f9c-5s6dp\" (UID: \"541ea3c2-891c-4c3e-81fd-9d340112c62b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5s6dp" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.072775 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/64e4fed2-f31d-4d3f-ad77-d55fdddacb4d-metrics-tls\") pod \"ingress-operator-5b745b69d9-44wvz\" (UID: \"64e4fed2-f31d-4d3f-ad77-d55fdddacb4d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-44wvz" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.072790 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b35d0df8-53f0-4787-b0b4-c93be28f0127-console-config\") pod \"console-f9d7485db-8t8ks\" (UID: \"b35d0df8-53f0-4787-b0b4-c93be28f0127\") " pod="openshift-console/console-f9d7485db-8t8ks" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.072808 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/219a04b6-e7bd-4138-bcc7-4f650537aa24-serving-cert\") pod \"controller-manager-879f6c89f-tg4xc\" (UID: \"219a04b6-e7bd-4138-bcc7-4f650537aa24\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tg4xc" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.072824 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/38051ff1-1715-41dd-aa28-53aea32c8e05-service-ca-bundle\") pod \"authentication-operator-69f744f599-pxn6w\" (UID: \"38051ff1-1715-41dd-aa28-53aea32c8e05\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pxn6w" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.072840 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/e9ce5ab1-4da9-47a4-83b9-3c7d8af6d55a-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-54nxd\" (UID: \"e9ce5ab1-4da9-47a4-83b9-3c7d8af6d55a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-54nxd" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.072857 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/16c9233d-0b27-4994-bc3d-62d4ec86a4ec-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-r9cz8\" (UID: \"16c9233d-0b27-4994-bc3d-62d4ec86a4ec\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-r9cz8" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.072874 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8cc29c02-baeb-4f46-92d6-684343509ae1-metrics-tls\") pod \"dns-operator-744455d44c-vpnm5\" (UID: \"8cc29c02-baeb-4f46-92d6-684343509ae1\") " pod="openshift-dns-operator/dns-operator-744455d44c-vpnm5" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.072890 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/81b523ca-b564-45d4-bad5-f7e236f2e6d0-etcd-client\") pod \"apiserver-76f77b778f-xprfh\" (UID: \"81b523ca-b564-45d4-bad5-f7e236f2e6d0\") " pod="openshift-apiserver/apiserver-76f77b778f-xprfh" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.072906 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/38051ff1-1715-41dd-aa28-53aea32c8e05-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-pxn6w\" (UID: \"38051ff1-1715-41dd-aa28-53aea32c8e05\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pxn6w" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.072923 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9ndf\" (UniqueName: \"kubernetes.io/projected/6377a401-b10b-455a-8906-f6706302b91f-kube-api-access-v9ndf\") pod \"apiserver-7bbb656c7d-85mvk\" (UID: \"6377a401-b10b-455a-8906-f6706302b91f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85mvk" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.072940 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-c8t6l\" (UID: \"c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8t6l" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.072961 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lsmw\" (UniqueName: 
\"kubernetes.io/projected/c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20-kube-api-access-9lsmw\") pod \"oauth-openshift-558db77b4-c8t6l\" (UID: \"c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8t6l" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.072983 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b35d0df8-53f0-4787-b0b4-c93be28f0127-service-ca\") pod \"console-f9d7485db-8t8ks\" (UID: \"b35d0df8-53f0-4787-b0b4-c93be28f0127\") " pod="openshift-console/console-f9d7485db-8t8ks" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.073002 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/499830ff-8add-4caf-b469-d1cbde569fb7-serving-cert\") pod \"console-operator-58897d9998-76d6v\" (UID: \"499830ff-8add-4caf-b469-d1cbde569fb7\") " pod="openshift-console-operator/console-operator-58897d9998-76d6v" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.073019 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/81b523ca-b564-45d4-bad5-f7e236f2e6d0-image-import-ca\") pod \"apiserver-76f77b778f-xprfh\" (UID: \"81b523ca-b564-45d4-bad5-f7e236f2e6d0\") " pod="openshift-apiserver/apiserver-76f77b778f-xprfh" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.073035 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cba3ef6a-8439-4317-bae9-01618d78512a-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-dv6hv\" (UID: \"cba3ef6a-8439-4317-bae9-01618d78512a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dv6hv" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.073056 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/edb14eaf-7738-4139-9b1b-9557e7e37ffc-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-vkrgj\" (UID: \"edb14eaf-7738-4139-9b1b-9557e7e37ffc\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vkrgj" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.073072 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e9ce5ab1-4da9-47a4-83b9-3c7d8af6d55a-config\") pod \"machine-api-operator-5694c8668f-54nxd\" (UID: \"e9ce5ab1-4da9-47a4-83b9-3c7d8af6d55a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-54nxd" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.073097 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/64e4fed2-f31d-4d3f-ad77-d55fdddacb4d-trusted-ca\") pod \"ingress-operator-5b745b69d9-44wvz\" (UID: \"64e4fed2-f31d-4d3f-ad77-d55fdddacb4d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-44wvz" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.073117 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/499830ff-8add-4caf-b469-d1cbde569fb7-config\") pod \"console-operator-58897d9998-76d6v\" (UID: \"499830ff-8add-4caf-b469-d1cbde569fb7\") " 
pod="openshift-console-operator/console-operator-58897d9998-76d6v" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.073133 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jr859\" (UniqueName: \"kubernetes.io/projected/499830ff-8add-4caf-b469-d1cbde569fb7-kube-api-access-jr859\") pod \"console-operator-58897d9998-76d6v\" (UID: \"499830ff-8add-4caf-b469-d1cbde569fb7\") " pod="openshift-console-operator/console-operator-58897d9998-76d6v" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.073150 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-c8t6l\" (UID: \"c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8t6l" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.073171 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/62bbd0d6-eba8-4737-9608-3f7a3dd6a157-machine-approver-tls\") pod \"machine-approver-56656f9798-sf8wd\" (UID: \"62bbd0d6-eba8-4737-9608-3f7a3dd6a157\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sf8wd" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.073186 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/81b523ca-b564-45d4-bad5-f7e236f2e6d0-node-pullsecrets\") pod \"apiserver-76f77b778f-xprfh\" (UID: \"81b523ca-b564-45d4-bad5-f7e236f2e6d0\") " pod="openshift-apiserver/apiserver-76f77b778f-xprfh" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.073202 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/6377a401-b10b-455a-8906-f6706302b91f-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-85mvk\" (UID: \"6377a401-b10b-455a-8906-f6706302b91f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85mvk" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.073218 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-c8t6l\" (UID: \"c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8t6l" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.078750 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-llpv8"] Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.080005 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e9ce5ab1-4da9-47a4-83b9-3c7d8af6d55a-images\") pod \"machine-api-operator-5694c8668f-54nxd\" (UID: \"e9ce5ab1-4da9-47a4-83b9-3c7d8af6d55a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-54nxd" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.080066 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6377a401-b10b-455a-8906-f6706302b91f-audit-policies\") pod \"apiserver-7bbb656c7d-85mvk\" (UID: 
\"6377a401-b10b-455a-8906-f6706302b91f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85mvk" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.081026 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/62bbd0d6-eba8-4737-9608-3f7a3dd6a157-auth-proxy-config\") pod \"machine-approver-56656f9798-sf8wd\" (UID: \"62bbd0d6-eba8-4737-9608-3f7a3dd6a157\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sf8wd" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.081120 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38051ff1-1715-41dd-aa28-53aea32c8e05-config\") pod \"authentication-operator-69f744f599-pxn6w\" (UID: \"38051ff1-1715-41dd-aa28-53aea32c8e05\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pxn6w" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.082584 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-8t8ks"] Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.083007 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/541ea3c2-891c-4c3e-81fd-9d340112c62b-client-ca\") pod \"route-controller-manager-6576b87f9c-5s6dp\" (UID: \"541ea3c2-891c-4c3e-81fd-9d340112c62b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5s6dp" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.084365 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-llpv8" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.085196 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/81b523ca-b564-45d4-bad5-f7e236f2e6d0-etcd-serving-ca\") pod \"apiserver-76f77b778f-xprfh\" (UID: \"81b523ca-b564-45d4-bad5-f7e236f2e6d0\") " pod="openshift-apiserver/apiserver-76f77b778f-xprfh" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.087105 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6377a401-b10b-455a-8906-f6706302b91f-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-85mvk\" (UID: \"6377a401-b10b-455a-8906-f6706302b91f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85mvk" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.089062 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/38051ff1-1715-41dd-aa28-53aea32c8e05-serving-cert\") pod \"authentication-operator-69f744f599-pxn6w\" (UID: \"38051ff1-1715-41dd-aa28-53aea32c8e05\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pxn6w" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.089455 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/81b523ca-b564-45d4-bad5-f7e236f2e6d0-audit-dir\") pod \"apiserver-76f77b778f-xprfh\" (UID: \"81b523ca-b564-45d4-bad5-f7e236f2e6d0\") " pod="openshift-apiserver/apiserver-76f77b778f-xprfh" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.091935 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: 
\"kubernetes.io/host-path/81b523ca-b564-45d4-bad5-f7e236f2e6d0-node-pullsecrets\") pod \"apiserver-76f77b778f-xprfh\" (UID: \"81b523ca-b564-45d4-bad5-f7e236f2e6d0\") " pod="openshift-apiserver/apiserver-76f77b778f-xprfh" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.092070 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62bbd0d6-eba8-4737-9608-3f7a3dd6a157-config\") pod \"machine-approver-56656f9798-sf8wd\" (UID: \"62bbd0d6-eba8-4737-9608-3f7a3dd6a157\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sf8wd" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.092712 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/6377a401-b10b-455a-8906-f6706302b91f-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-85mvk\" (UID: \"6377a401-b10b-455a-8906-f6706302b91f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85mvk" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.093298 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e9ce5ab1-4da9-47a4-83b9-3c7d8af6d55a-config\") pod \"machine-api-operator-5694c8668f-54nxd\" (UID: \"e9ce5ab1-4da9-47a4-83b9-3c7d8af6d55a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-54nxd" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.095152 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/81b523ca-b564-45d4-bad5-f7e236f2e6d0-audit\") pod \"apiserver-76f77b778f-xprfh\" (UID: \"81b523ca-b564-45d4-bad5-f7e236f2e6d0\") " pod="openshift-apiserver/apiserver-76f77b778f-xprfh" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.095228 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/81b523ca-b564-45d4-bad5-f7e236f2e6d0-image-import-ca\") pod \"apiserver-76f77b778f-xprfh\" (UID: \"81b523ca-b564-45d4-bad5-f7e236f2e6d0\") " pod="openshift-apiserver/apiserver-76f77b778f-xprfh" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.095315 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6377a401-b10b-455a-8906-f6706302b91f-audit-dir\") pod \"apiserver-7bbb656c7d-85mvk\" (UID: \"6377a401-b10b-455a-8906-f6706302b91f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85mvk" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.095593 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/16c9233d-0b27-4994-bc3d-62d4ec86a4ec-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-r9cz8\" (UID: \"16c9233d-0b27-4994-bc3d-62d4ec86a4ec\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-r9cz8" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.096438 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/576b5a44-3c4c-4905-8d89-caed3b1eb43f-serving-cert\") pod \"openshift-config-operator-7777fb866f-hzs92\" (UID: \"576b5a44-3c4c-4905-8d89-caed3b1eb43f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hzs92" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.096495 4732 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/576b5a44-3c4c-4905-8d89-caed3b1eb43f-available-featuregates\") pod \"openshift-config-operator-7777fb866f-hzs92\" (UID: \"576b5a44-3c4c-4905-8d89-caed3b1eb43f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hzs92" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.097634 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/81b523ca-b564-45d4-bad5-f7e236f2e6d0-encryption-config\") pod \"apiserver-76f77b778f-xprfh\" (UID: \"81b523ca-b564-45d4-bad5-f7e236f2e6d0\") " pod="openshift-apiserver/apiserver-76f77b778f-xprfh" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.099975 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/81b523ca-b564-45d4-bad5-f7e236f2e6d0-trusted-ca-bundle\") pod \"apiserver-76f77b778f-xprfh\" (UID: \"81b523ca-b564-45d4-bad5-f7e236f2e6d0\") " pod="openshift-apiserver/apiserver-76f77b778f-xprfh" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.101894 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/499830ff-8add-4caf-b469-d1cbde569fb7-serving-cert\") pod \"console-operator-58897d9998-76d6v\" (UID: \"499830ff-8add-4caf-b469-d1cbde569fb7\") " pod="openshift-console-operator/console-operator-58897d9998-76d6v" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.104106 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/219a04b6-e7bd-4138-bcc7-4f650537aa24-client-ca\") pod \"controller-manager-879f6c89f-tg4xc\" (UID: \"219a04b6-e7bd-4138-bcc7-4f650537aa24\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tg4xc" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.104841 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/541ea3c2-891c-4c3e-81fd-9d340112c62b-serving-cert\") pod \"route-controller-manager-6576b87f9c-5s6dp\" (UID: \"541ea3c2-891c-4c3e-81fd-9d340112c62b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5s6dp" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.105435 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/219a04b6-e7bd-4138-bcc7-4f650537aa24-config\") pod \"controller-manager-879f6c89f-tg4xc\" (UID: \"219a04b6-e7bd-4138-bcc7-4f650537aa24\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tg4xc" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.107067 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/922314ab-f199-4117-acab-bc641c1cda57-config\") pod \"openshift-apiserver-operator-796bbdcf4f-ncqs4\" (UID: \"922314ab-f199-4117-acab-bc641c1cda57\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ncqs4" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.108174 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/81b523ca-b564-45d4-bad5-f7e236f2e6d0-etcd-client\") pod \"apiserver-76f77b778f-xprfh\" (UID: \"81b523ca-b564-45d4-bad5-f7e236f2e6d0\") " pod="openshift-apiserver/apiserver-76f77b778f-xprfh" Jan 31 09:03:23 crc 
kubenswrapper[4732]: I0131 09:03:23.110045 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/499830ff-8add-4caf-b469-d1cbde569fb7-trusted-ca\") pod \"console-operator-58897d9998-76d6v\" (UID: \"499830ff-8add-4caf-b469-d1cbde569fb7\") " pod="openshift-console-operator/console-operator-58897d9998-76d6v" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.110094 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.110583 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.111202 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/38051ff1-1715-41dd-aa28-53aea32c8e05-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-pxn6w\" (UID: \"38051ff1-1715-41dd-aa28-53aea32c8e05\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pxn6w" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.111776 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/219a04b6-e7bd-4138-bcc7-4f650537aa24-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-tg4xc\" (UID: \"219a04b6-e7bd-4138-bcc7-4f650537aa24\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tg4xc" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.112282 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/62bbd0d6-eba8-4737-9608-3f7a3dd6a157-machine-approver-tls\") pod \"machine-approver-56656f9798-sf8wd\" (UID: \"62bbd0d6-eba8-4737-9608-3f7a3dd6a157\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sf8wd" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.112791 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.114534 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6377a401-b10b-455a-8906-f6706302b91f-etcd-client\") pod \"apiserver-7bbb656c7d-85mvk\" (UID: \"6377a401-b10b-455a-8906-f6706302b91f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85mvk" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.114567 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/16c9233d-0b27-4994-bc3d-62d4ec86a4ec-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-r9cz8\" (UID: \"16c9233d-0b27-4994-bc3d-62d4ec86a4ec\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-r9cz8" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.116696 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6377a401-b10b-455a-8906-f6706302b91f-serving-cert\") pod \"apiserver-7bbb656c7d-85mvk\" (UID: \"6377a401-b10b-455a-8906-f6706302b91f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85mvk" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.117812 4732 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6fwnh"] Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.118648 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/38051ff1-1715-41dd-aa28-53aea32c8e05-service-ca-bundle\") pod \"authentication-operator-69f744f599-pxn6w\" (UID: \"38051ff1-1715-41dd-aa28-53aea32c8e05\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pxn6w" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.119832 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81b523ca-b564-45d4-bad5-f7e236f2e6d0-config\") pod \"apiserver-76f77b778f-xprfh\" (UID: \"81b523ca-b564-45d4-bad5-f7e236f2e6d0\") " pod="openshift-apiserver/apiserver-76f77b778f-xprfh" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.120272 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6fwnh" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.120531 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wb8wq"] Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.122055 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.122131 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/81b523ca-b564-45d4-bad5-f7e236f2e6d0-serving-cert\") pod \"apiserver-76f77b778f-xprfh\" (UID: \"81b523ca-b564-45d4-bad5-f7e236f2e6d0\") " pod="openshift-apiserver/apiserver-76f77b778f-xprfh" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.122065 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-wqf9f"] Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.122244 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/e9ce5ab1-4da9-47a4-83b9-3c7d8af6d55a-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-54nxd\" (UID: \"e9ce5ab1-4da9-47a4-83b9-3c7d8af6d55a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-54nxd" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.122567 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wb8wq" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.123029 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.123031 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/6377a401-b10b-455a-8906-f6706302b91f-encryption-config\") pod \"apiserver-7bbb656c7d-85mvk\" (UID: \"6377a401-b10b-455a-8906-f6706302b91f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85mvk" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.123642 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/219a04b6-e7bd-4138-bcc7-4f650537aa24-serving-cert\") pod \"controller-manager-879f6c89f-tg4xc\" (UID: \"219a04b6-e7bd-4138-bcc7-4f650537aa24\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tg4xc" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.123702 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-wq7bg"] Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.124067 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-wqf9f" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.131902 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/922314ab-f199-4117-acab-bc641c1cda57-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-ncqs4\" (UID: \"922314ab-f199-4117-acab-bc641c1cda57\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ncqs4" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.132075 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vkrgj"] Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.132193 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-wq7bg" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.132282 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/edb14eaf-7738-4139-9b1b-9557e7e37ffc-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-vkrgj\" (UID: \"edb14eaf-7738-4139-9b1b-9557e7e37ffc\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vkrgj" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.135827 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-bzk95"] Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.137176 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-99dtb"] Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.137272 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-bzk95" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.140703 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-vpnm5"] Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.140779 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ljds4"] Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.141580 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-ljds4" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.142186 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bpncg"] Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.143133 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bpncg" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.145857 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29497500-p2gn7"] Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.146544 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29497500-p2gn7" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.155366 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.156198 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-fsss9"] Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.158533 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-hn5wx"] Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.159282 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-hn5wx" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.160624 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.163755 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/499830ff-8add-4caf-b469-d1cbde569fb7-config\") pod \"console-operator-58897d9998-76d6v\" (UID: \"499830ff-8add-4caf-b469-d1cbde569fb7\") " pod="openshift-console-operator/console-operator-58897d9998-76d6v" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.170703 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4qmgq"] Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.171421 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4qmgq" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.173207 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8xgh7"] Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.173836 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8xgh7" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.173859 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/64e4fed2-f31d-4d3f-ad77-d55fdddacb4d-trusted-ca\") pod \"ingress-operator-5b745b69d9-44wvz\" (UID: \"64e4fed2-f31d-4d3f-ad77-d55fdddacb4d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-44wvz" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.173915 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-c8t6l\" (UID: \"c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8t6l" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.173961 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-c8t6l\" (UID: \"c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8t6l" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.174007 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-c8t6l\" (UID: \"c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8t6l" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.174064 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b35d0df8-53f0-4787-b0b4-c93be28f0127-console-serving-cert\") pod \"console-f9d7485db-8t8ks\" (UID: \"b35d0df8-53f0-4787-b0b4-c93be28f0127\") " pod="openshift-console/console-f9d7485db-8t8ks" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.175042 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-c8t6l\" (UID: \"c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8t6l" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.175331 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dv6hv"] Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.175779 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/b35d0df8-53f0-4787-b0b4-c93be28f0127-oauth-serving-cert\") pod \"console-f9d7485db-8t8ks\" (UID: \"b35d0df8-53f0-4787-b0b4-c93be28f0127\") " pod="openshift-console/console-f9d7485db-8t8ks" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.176012 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-c8t6l\" (UID: \"c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8t6l" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.176036 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/639dacb9-2ea3-49d2-b5c4-996992c8e16a-etcd-ca\") pod \"etcd-operator-b45778765-fsss9\" (UID: \"639dacb9-2ea3-49d2-b5c4-996992c8e16a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fsss9" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.176069 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/639dacb9-2ea3-49d2-b5c4-996992c8e16a-serving-cert\") pod \"etcd-operator-b45778765-fsss9\" (UID: \"639dacb9-2ea3-49d2-b5c4-996992c8e16a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fsss9" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.176168 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-c8t6l\" (UID: \"c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8t6l" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.176255 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-c8t6l\" (UID: \"c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8t6l" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.176282 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sz728\" (UniqueName: \"kubernetes.io/projected/b35d0df8-53f0-4787-b0b4-c93be28f0127-kube-api-access-sz728\") pod \"console-f9d7485db-8t8ks\" (UID: \"b35d0df8-53f0-4787-b0b4-c93be28f0127\") " pod="openshift-console/console-f9d7485db-8t8ks" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.176580 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b35d0df8-53f0-4787-b0b4-c93be28f0127-oauth-serving-cert\") pod \"console-f9d7485db-8t8ks\" (UID: \"b35d0df8-53f0-4787-b0b4-c93be28f0127\") " pod="openshift-console/console-f9d7485db-8t8ks" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.176736 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b35d0df8-53f0-4787-b0b4-c93be28f0127-trusted-ca-bundle\") pod \"console-f9d7485db-8t8ks\" (UID: \"b35d0df8-53f0-4787-b0b4-c93be28f0127\") " pod="openshift-console/console-f9d7485db-8t8ks" Jan 31 09:03:23 crc 
kubenswrapper[4732]: I0131 09:03:23.176791 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/639dacb9-2ea3-49d2-b5c4-996992c8e16a-config\") pod \"etcd-operator-b45778765-fsss9\" (UID: \"639dacb9-2ea3-49d2-b5c4-996992c8e16a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fsss9" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.176830 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20-audit-dir\") pod \"oauth-openshift-558db77b4-c8t6l\" (UID: \"c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8t6l" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.176854 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-c8t6l\" (UID: \"c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8t6l" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.176882 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20-audit-policies\") pod \"oauth-openshift-558db77b4-c8t6l\" (UID: \"c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8t6l" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.176910 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-c8t6l\" (UID: \"c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8t6l" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.176988 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-c8t6l\" (UID: \"c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8t6l" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.177038 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cba3ef6a-8439-4317-bae9-01618d78512a-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-dv6hv\" (UID: \"cba3ef6a-8439-4317-bae9-01618d78512a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dv6hv" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.177071 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/64e4fed2-f31d-4d3f-ad77-d55fdddacb4d-bound-sa-token\") pod \"ingress-operator-5b745b69d9-44wvz\" (UID: \"64e4fed2-f31d-4d3f-ad77-d55fdddacb4d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-44wvz" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.177435 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b35d0df8-53f0-4787-b0b4-c93be28f0127-console-oauth-config\") pod \"console-f9d7485db-8t8ks\" (UID: \"b35d0df8-53f0-4787-b0b4-c93be28f0127\") " pod="openshift-console/console-f9d7485db-8t8ks" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.177476 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/639dacb9-2ea3-49d2-b5c4-996992c8e16a-etcd-client\") pod \"etcd-operator-b45778765-fsss9\" (UID: \"639dacb9-2ea3-49d2-b5c4-996992c8e16a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fsss9" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.177503 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cba3ef6a-8439-4317-bae9-01618d78512a-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-dv6hv\" (UID: \"cba3ef6a-8439-4317-bae9-01618d78512a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dv6hv" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.177586 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b35d0df8-53f0-4787-b0b4-c93be28f0127-trusted-ca-bundle\") pod \"console-f9d7485db-8t8ks\" (UID: \"b35d0df8-53f0-4787-b0b4-c93be28f0127\") " pod="openshift-console/console-f9d7485db-8t8ks" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.177587 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-c8t6l\" (UID: \"c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8t6l" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.177679 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqgz8\" (UniqueName: \"kubernetes.io/projected/8cc29c02-baeb-4f46-92d6-684343509ae1-kube-api-access-vqgz8\") pod \"dns-operator-744455d44c-vpnm5\" (UID: \"8cc29c02-baeb-4f46-92d6-684343509ae1\") " pod="openshift-dns-operator/dns-operator-744455d44c-vpnm5" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.177681 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20-audit-policies\") pod \"oauth-openshift-558db77b4-c8t6l\" (UID: \"c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8t6l" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.177699 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/639dacb9-2ea3-49d2-b5c4-996992c8e16a-etcd-service-ca\") pod \"etcd-operator-b45778765-fsss9\" (UID: \"639dacb9-2ea3-49d2-b5c4-996992c8e16a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fsss9" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.177718 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxdbz\" (UniqueName: \"kubernetes.io/projected/639dacb9-2ea3-49d2-b5c4-996992c8e16a-kube-api-access-qxdbz\") pod \"etcd-operator-b45778765-fsss9\" (UID: \"639dacb9-2ea3-49d2-b5c4-996992c8e16a\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-fsss9" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.177757 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvc79\" (UniqueName: \"kubernetes.io/projected/64e4fed2-f31d-4d3f-ad77-d55fdddacb4d-kube-api-access-nvc79\") pod \"ingress-operator-5b745b69d9-44wvz\" (UID: \"64e4fed2-f31d-4d3f-ad77-d55fdddacb4d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-44wvz" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.177778 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/64e4fed2-f31d-4d3f-ad77-d55fdddacb4d-metrics-tls\") pod \"ingress-operator-5b745b69d9-44wvz\" (UID: \"64e4fed2-f31d-4d3f-ad77-d55fdddacb4d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-44wvz" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.177794 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b35d0df8-53f0-4787-b0b4-c93be28f0127-console-config\") pod \"console-f9d7485db-8t8ks\" (UID: \"b35d0df8-53f0-4787-b0b4-c93be28f0127\") " pod="openshift-console/console-f9d7485db-8t8ks" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.177816 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8cc29c02-baeb-4f46-92d6-684343509ae1-metrics-tls\") pod \"dns-operator-744455d44c-vpnm5\" (UID: \"8cc29c02-baeb-4f46-92d6-684343509ae1\") " pod="openshift-dns-operator/dns-operator-744455d44c-vpnm5" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.177859 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-c8t6l\" (UID: \"c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8t6l" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.177877 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lsmw\" (UniqueName: \"kubernetes.io/projected/c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20-kube-api-access-9lsmw\") pod \"oauth-openshift-558db77b4-c8t6l\" (UID: \"c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8t6l" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.177895 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b35d0df8-53f0-4787-b0b4-c93be28f0127-service-ca\") pod \"console-f9d7485db-8t8ks\" (UID: \"b35d0df8-53f0-4787-b0b4-c93be28f0127\") " pod="openshift-console/console-f9d7485db-8t8ks" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.177928 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cba3ef6a-8439-4317-bae9-01618d78512a-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-dv6hv\" (UID: \"cba3ef6a-8439-4317-bae9-01618d78512a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dv6hv" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.178013 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-c8t6l\" (UID: \"c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8t6l" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.177118 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20-audit-dir\") pod \"oauth-openshift-558db77b4-c8t6l\" (UID: \"c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8t6l" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.177207 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-c8t6l\" (UID: \"c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8t6l" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.177077 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-c8t6l\" (UID: \"c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8t6l" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.178822 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b35d0df8-53f0-4787-b0b4-c93be28f0127-console-config\") pod \"console-f9d7485db-8t8ks\" (UID: \"b35d0df8-53f0-4787-b0b4-c93be28f0127\") " pod="openshift-console/console-f9d7485db-8t8ks" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.179494 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-c8t6l\" (UID: \"c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8t6l" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.181084 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-c8t6l\" (UID: \"c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8t6l" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.181092 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.181598 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-c8t6l\" (UID: \"c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8t6l" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.181768 4732 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-c8t6l\" (UID: \"c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8t6l" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.183134 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b35d0df8-53f0-4787-b0b4-c93be28f0127-console-serving-cert\") pod \"console-f9d7485db-8t8ks\" (UID: \"b35d0df8-53f0-4787-b0b4-c93be28f0127\") " pod="openshift-console/console-f9d7485db-8t8ks" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.183705 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-xprfh"] Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.184346 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-c8t6l\" (UID: \"c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8t6l" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.184480 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b35d0df8-53f0-4787-b0b4-c93be28f0127-console-oauth-config\") pod \"console-f9d7485db-8t8ks\" (UID: \"b35d0df8-53f0-4787-b0b4-c93be28f0127\") " pod="openshift-console/console-f9d7485db-8t8ks" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.185341 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-c8t6l\" (UID: \"c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8t6l" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.185383 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-c7j22"] Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.187120 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-f78bs"] Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.187575 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-c7j22" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.188524 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-f78bs" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.189368 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-c8t6l\" (UID: \"c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8t6l" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.190517 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8cc29c02-baeb-4f46-92d6-684343509ae1-metrics-tls\") pod \"dns-operator-744455d44c-vpnm5\" (UID: \"8cc29c02-baeb-4f46-92d6-684343509ae1\") " pod="openshift-dns-operator/dns-operator-744455d44c-vpnm5" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.192400 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v69mc"] Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.193750 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/639dacb9-2ea3-49d2-b5c4-996992c8e16a-serving-cert\") pod \"etcd-operator-b45778765-fsss9\" (UID: \"639dacb9-2ea3-49d2-b5c4-996992c8e16a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fsss9" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.194376 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-44wvz"] Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.195697 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-llpv8"] Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.197255 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-wqf9f"] Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.198298 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b35d0df8-53f0-4787-b0b4-c93be28f0127-service-ca\") pod \"console-f9d7485db-8t8ks\" (UID: \"b35d0df8-53f0-4787-b0b4-c93be28f0127\") " pod="openshift-console/console-f9d7485db-8t8ks" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.198978 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ljds4"] Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.200631 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8xgh7"] Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.201702 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.202905 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6fwnh"] Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.204678 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-c7j22"] Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.206430 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-bzk95"] Jan 31 09:03:23 crc 
kubenswrapper[4732]: I0131 09:03:23.207914 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zcd2h"] Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.209817 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7fqsl"] Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.210914 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/639dacb9-2ea3-49d2-b5c4-996992c8e16a-etcd-client\") pod \"etcd-operator-b45778765-fsss9\" (UID: \"639dacb9-2ea3-49d2-b5c4-996992c8e16a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fsss9" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.211417 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-rt2jr"] Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.212623 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-kgqfp"] Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.214284 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-cx8cr"] Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.215538 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-4sk2n"] Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.217578 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bpncg"] Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.218585 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wb8wq"] Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.219689 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-wq7bg"] Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.220787 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-f78bs"] Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.221449 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.223847 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29497500-p2gn7"] Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.224881 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-dmhxf"] Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.228424 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-dmhxf" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.228956 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4qmgq"] Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.230101 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/639dacb9-2ea3-49d2-b5c4-996992c8e16a-config\") pod \"etcd-operator-b45778765-fsss9\" (UID: \"639dacb9-2ea3-49d2-b5c4-996992c8e16a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fsss9" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.233853 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-hn5wx"] Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.242169 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.247248 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/639dacb9-2ea3-49d2-b5c4-996992c8e16a-etcd-ca\") pod \"etcd-operator-b45778765-fsss9\" (UID: \"639dacb9-2ea3-49d2-b5c4-996992c8e16a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fsss9" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.262173 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.269025 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/639dacb9-2ea3-49d2-b5c4-996992c8e16a-etcd-service-ca\") pod \"etcd-operator-b45778765-fsss9\" (UID: \"639dacb9-2ea3-49d2-b5c4-996992c8e16a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fsss9" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.280794 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.301345 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.320914 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.330092 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cba3ef6a-8439-4317-bae9-01618d78512a-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-dv6hv\" (UID: \"cba3ef6a-8439-4317-bae9-01618d78512a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dv6hv" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.341097 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.348796 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cba3ef6a-8439-4317-bae9-01618d78512a-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-dv6hv\" (UID: 
\"cba3ef6a-8439-4317-bae9-01618d78512a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dv6hv" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.361393 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.381559 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.401580 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.422105 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.441501 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.461904 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.472076 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/64e4fed2-f31d-4d3f-ad77-d55fdddacb4d-metrics-tls\") pod \"ingress-operator-5b745b69d9-44wvz\" (UID: \"64e4fed2-f31d-4d3f-ad77-d55fdddacb4d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-44wvz" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.481608 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.501470 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.526853 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.542281 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.542281 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.542299 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.553187 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.555812 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/64e4fed2-f31d-4d3f-ad77-d55fdddacb4d-trusted-ca\") pod \"ingress-operator-5b745b69d9-44wvz\" (UID: \"64e4fed2-f31d-4d3f-ad77-d55fdddacb4d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-44wvz" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.581296 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.602238 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.622876 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.641412 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.661477 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.681316 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.701794 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.720806 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.742595 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.762389 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.782884 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.802950 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.822791 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.842474 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.862047 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.881607 
4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.901993 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.938837 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r24m4\" (UniqueName: \"kubernetes.io/projected/922314ab-f199-4117-acab-bc641c1cda57-kube-api-access-r24m4\") pod \"openshift-apiserver-operator-796bbdcf4f-ncqs4\" (UID: \"922314ab-f199-4117-acab-bc641c1cda57\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ncqs4" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.956121 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvpv9\" (UniqueName: \"kubernetes.io/projected/edb14eaf-7738-4139-9b1b-9557e7e37ffc-kube-api-access-hvpv9\") pod \"cluster-samples-operator-665b6dd947-vkrgj\" (UID: \"edb14eaf-7738-4139-9b1b-9557e7e37ffc\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vkrgj" Jan 31 09:03:23 crc kubenswrapper[4732]: I0131 09:03:23.975923 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/16c9233d-0b27-4994-bc3d-62d4ec86a4ec-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-r9cz8\" (UID: \"16c9233d-0b27-4994-bc3d-62d4ec86a4ec\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-r9cz8" Jan 31 09:03:24 crc kubenswrapper[4732]: I0131 09:03:24.011075 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Jan 31 09:03:24 crc kubenswrapper[4732]: I0131 09:03:24.012319 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lc5nb\" (UniqueName: \"kubernetes.io/projected/62bbd0d6-eba8-4737-9608-3f7a3dd6a157-kube-api-access-lc5nb\") pod \"machine-approver-56656f9798-sf8wd\" (UID: \"62bbd0d6-eba8-4737-9608-3f7a3dd6a157\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sf8wd" Jan 31 09:03:24 crc kubenswrapper[4732]: I0131 09:03:24.023199 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Jan 31 09:03:24 crc kubenswrapper[4732]: I0131 09:03:24.036403 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sf8wd" Jan 31 09:03:24 crc kubenswrapper[4732]: I0131 09:03:24.042692 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Jan 31 09:03:24 crc kubenswrapper[4732]: I0131 09:03:24.062126 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Jan 31 09:03:24 crc kubenswrapper[4732]: I0131 09:03:24.097562 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdzrp\" (UniqueName: \"kubernetes.io/projected/81e1781e-a935-4f3f-b2aa-9a0807f43c73-kube-api-access-wdzrp\") pod \"downloads-7954f5f757-rt2jr\" (UID: \"81e1781e-a935-4f3f-b2aa-9a0807f43c73\") " pod="openshift-console/downloads-7954f5f757-rt2jr" Jan 31 09:03:24 crc kubenswrapper[4732]: I0131 09:03:24.100100 4732 request.go:700] Waited for 1.015520274s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-api/serviceaccounts/machine-api-operator/token Jan 31 09:03:24 crc kubenswrapper[4732]: I0131 09:03:24.105349 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vkrgj" Jan 31 09:03:24 crc kubenswrapper[4732]: I0131 09:03:24.118375 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxjwq\" (UniqueName: \"kubernetes.io/projected/e9ce5ab1-4da9-47a4-83b9-3c7d8af6d55a-kube-api-access-lxjwq\") pod \"machine-api-operator-5694c8668f-54nxd\" (UID: \"e9ce5ab1-4da9-47a4-83b9-3c7d8af6d55a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-54nxd" Jan 31 09:03:24 crc kubenswrapper[4732]: I0131 09:03:24.121333 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Jan 31 09:03:24 crc kubenswrapper[4732]: I0131 09:03:24.125708 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-rt2jr" Jan 31 09:03:24 crc kubenswrapper[4732]: I0131 09:03:24.141961 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Jan 31 09:03:24 crc kubenswrapper[4732]: I0131 09:03:24.171848 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ncqs4" Jan 31 09:03:24 crc kubenswrapper[4732]: I0131 09:03:24.175375 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p24t7\" (UniqueName: \"kubernetes.io/projected/38051ff1-1715-41dd-aa28-53aea32c8e05-kube-api-access-p24t7\") pod \"authentication-operator-69f744f599-pxn6w\" (UID: \"38051ff1-1715-41dd-aa28-53aea32c8e05\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pxn6w" Jan 31 09:03:24 crc kubenswrapper[4732]: I0131 09:03:24.197955 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jr859\" (UniqueName: \"kubernetes.io/projected/499830ff-8add-4caf-b469-d1cbde569fb7-kube-api-access-jr859\") pod \"console-operator-58897d9998-76d6v\" (UID: \"499830ff-8add-4caf-b469-d1cbde569fb7\") " pod="openshift-console-operator/console-operator-58897d9998-76d6v" Jan 31 09:03:24 crc kubenswrapper[4732]: I0131 09:03:24.209939 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-pxn6w" Jan 31 09:03:24 crc kubenswrapper[4732]: I0131 09:03:24.226164 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ch8m2\" (UniqueName: \"kubernetes.io/projected/576b5a44-3c4c-4905-8d89-caed3b1eb43f-kube-api-access-ch8m2\") pod \"openshift-config-operator-7777fb866f-hzs92\" (UID: \"576b5a44-3c4c-4905-8d89-caed3b1eb43f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-hzs92" Jan 31 09:03:24 crc kubenswrapper[4732]: I0131 09:03:24.243164 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bws5c\" (UniqueName: \"kubernetes.io/projected/219a04b6-e7bd-4138-bcc7-4f650537aa24-kube-api-access-bws5c\") pod \"controller-manager-879f6c89f-tg4xc\" (UID: \"219a04b6-e7bd-4138-bcc7-4f650537aa24\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tg4xc" Jan 31 09:03:24 crc kubenswrapper[4732]: I0131 09:03:24.243708 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Jan 31 09:03:24 crc kubenswrapper[4732]: I0131 09:03:24.269805 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hzs92" Jan 31 09:03:24 crc kubenswrapper[4732]: I0131 09:03:24.275876 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sf8wd" event={"ID":"62bbd0d6-eba8-4737-9608-3f7a3dd6a157","Type":"ContainerStarted","Data":"cbd42d42f8482009793ac43a3ebdd0114adc30dd90602a90c07790e0ec9ac076"} Jan 31 09:03:24 crc kubenswrapper[4732]: I0131 09:03:24.278273 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhhgt\" (UniqueName: \"kubernetes.io/projected/541ea3c2-891c-4c3e-81fd-9d340112c62b-kube-api-access-jhhgt\") pod \"route-controller-manager-6576b87f9c-5s6dp\" (UID: \"541ea3c2-891c-4c3e-81fd-9d340112c62b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5s6dp" Jan 31 09:03:24 crc kubenswrapper[4732]: I0131 09:03:24.300966 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9mtk\" (UniqueName: \"kubernetes.io/projected/81b523ca-b564-45d4-bad5-f7e236f2e6d0-kube-api-access-s9mtk\") pod \"apiserver-76f77b778f-xprfh\" (UID: \"81b523ca-b564-45d4-bad5-f7e236f2e6d0\") " pod="openshift-apiserver/apiserver-76f77b778f-xprfh" Jan 31 09:03:24 crc kubenswrapper[4732]: I0131 09:03:24.322391 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9ndf\" (UniqueName: \"kubernetes.io/projected/6377a401-b10b-455a-8906-f6706302b91f-kube-api-access-v9ndf\") pod \"apiserver-7bbb656c7d-85mvk\" (UID: \"6377a401-b10b-455a-8906-f6706302b91f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85mvk" Jan 31 09:03:24 crc kubenswrapper[4732]: I0131 09:03:24.351910 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-54nxd" Jan 31 09:03:24 crc kubenswrapper[4732]: I0131 09:03:24.355351 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xf682\" (UniqueName: \"kubernetes.io/projected/16c9233d-0b27-4994-bc3d-62d4ec86a4ec-kube-api-access-xf682\") pod \"cluster-image-registry-operator-dc59b4c8b-r9cz8\" (UID: \"16c9233d-0b27-4994-bc3d-62d4ec86a4ec\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-r9cz8" Jan 31 09:03:24 crc kubenswrapper[4732]: I0131 09:03:24.361594 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Jan 31 09:03:24 crc kubenswrapper[4732]: I0131 09:03:24.374076 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-xprfh" Jan 31 09:03:24 crc kubenswrapper[4732]: I0131 09:03:24.380937 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Jan 31 09:03:24 crc kubenswrapper[4732]: I0131 09:03:24.385554 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-76d6v" Jan 31 09:03:24 crc kubenswrapper[4732]: I0131 09:03:24.401899 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Jan 31 09:03:24 crc kubenswrapper[4732]: I0131 09:03:24.414094 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-r9cz8" Jan 31 09:03:24 crc kubenswrapper[4732]: I0131 09:03:24.416105 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5s6dp" Jan 31 09:03:24 crc kubenswrapper[4732]: I0131 09:03:24.422301 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Jan 31 09:03:24 crc kubenswrapper[4732]: I0131 09:03:24.437736 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-tg4xc" Jan 31 09:03:24 crc kubenswrapper[4732]: I0131 09:03:24.440890 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ncqs4"] Jan 31 09:03:24 crc kubenswrapper[4732]: I0131 09:03:24.447182 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Jan 31 09:03:24 crc kubenswrapper[4732]: I0131 09:03:24.449351 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-rt2jr"] Jan 31 09:03:24 crc kubenswrapper[4732]: I0131 09:03:24.450551 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-pxn6w"] Jan 31 09:03:24 crc kubenswrapper[4732]: I0131 09:03:24.451102 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vkrgj"] Jan 31 09:03:24 crc kubenswrapper[4732]: I0131 09:03:24.461425 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Jan 31 09:03:24 crc kubenswrapper[4732]: W0131 09:03:24.461634 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81e1781e_a935_4f3f_b2aa_9a0807f43c73.slice/crio-69d539fbc0bd1b9a5ce51d70edcc880d02dc1cb8c8cb69d25652009a0ba0e8ef WatchSource:0}: Error finding container 69d539fbc0bd1b9a5ce51d70edcc880d02dc1cb8c8cb69d25652009a0ba0e8ef: Status 404 returned error can't find the container with id 69d539fbc0bd1b9a5ce51d70edcc880d02dc1cb8c8cb69d25652009a0ba0e8ef Jan 31 09:03:24 crc kubenswrapper[4732]: I0131 09:03:24.473136 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-hzs92"] Jan 31 09:03:24 crc kubenswrapper[4732]: I0131 09:03:24.488570 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Jan 31 09:03:24 crc kubenswrapper[4732]: I0131 09:03:24.502398 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Jan 31 09:03:24 crc kubenswrapper[4732]: I0131 09:03:24.517848 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85mvk" Jan 31 09:03:24 crc kubenswrapper[4732]: I0131 09:03:24.521729 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Jan 31 09:03:24 crc kubenswrapper[4732]: I0131 09:03:24.541699 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Jan 31 09:03:24 crc kubenswrapper[4732]: I0131 09:03:24.544970 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7fgvm" Jan 31 09:03:24 crc kubenswrapper[4732]: I0131 09:03:24.560627 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-54nxd"] Jan 31 09:03:24 crc kubenswrapper[4732]: I0131 09:03:24.563213 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Jan 31 09:03:24 crc kubenswrapper[4732]: I0131 09:03:24.581761 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Jan 31 09:03:24 crc kubenswrapper[4732]: I0131 09:03:24.605243 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Jan 31 09:03:24 crc kubenswrapper[4732]: I0131 09:03:24.622193 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Jan 31 09:03:24 crc kubenswrapper[4732]: I0131 09:03:24.643074 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Jan 31 09:03:24 crc kubenswrapper[4732]: I0131 09:03:24.663415 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Jan 31 09:03:24 crc kubenswrapper[4732]: I0131 09:03:24.664732 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-76d6v"] Jan 31 09:03:24 crc kubenswrapper[4732]: I0131 09:03:24.682256 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Jan 31 09:03:24 crc kubenswrapper[4732]: W0131 09:03:24.687052 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod499830ff_8add_4caf_b469_d1cbde569fb7.slice/crio-641c54243ba760d90f524d9c623ae4bf9adfd23728fc476c0d4306671ed6afb3 WatchSource:0}: Error finding container 641c54243ba760d90f524d9c623ae4bf9adfd23728fc476c0d4306671ed6afb3: Status 404 returned error can't find the container with id 641c54243ba760d90f524d9c623ae4bf9adfd23728fc476c0d4306671ed6afb3 Jan 31 09:03:24 crc kubenswrapper[4732]: I0131 09:03:24.721248 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Jan 31 09:03:24 crc kubenswrapper[4732]: I0131 09:03:24.722486 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Jan 31 09:03:24 crc kubenswrapper[4732]: I0131 09:03:24.733876 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-tg4xc"] Jan 31 09:03:24 crc kubenswrapper[4732]: I0131 09:03:24.741277 4732 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Jan 31 09:03:24 crc kubenswrapper[4732]: I0131 09:03:24.761321 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Jan 31 09:03:24 crc kubenswrapper[4732]: W0131 09:03:24.765987 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod219a04b6_e7bd_4138_bcc7_4f650537aa24.slice/crio-371f900003f95ceb406307c3e3064fc3164d972cb97c8e25812cb0e57334cb37 WatchSource:0}: Error finding container 371f900003f95ceb406307c3e3064fc3164d972cb97c8e25812cb0e57334cb37: Status 404 returned error can't find the container with id 371f900003f95ceb406307c3e3064fc3164d972cb97c8e25812cb0e57334cb37 Jan 31 09:03:24 crc kubenswrapper[4732]: I0131 09:03:24.782630 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Jan 31 09:03:24 crc kubenswrapper[4732]: I0131 09:03:24.802101 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 31 09:03:24 crc kubenswrapper[4732]: I0131 09:03:24.821835 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 31 09:03:24 crc kubenswrapper[4732]: I0131 09:03:24.841915 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Jan 31 09:03:24 crc kubenswrapper[4732]: I0131 09:03:24.845848 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-xprfh"] Jan 31 09:03:24 crc kubenswrapper[4732]: I0131 09:03:24.861837 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Jan 31 09:03:24 crc kubenswrapper[4732]: I0131 09:03:24.881545 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Jan 31 09:03:24 crc kubenswrapper[4732]: I0131 09:03:24.884888 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-85mvk"] Jan 31 09:03:24 crc kubenswrapper[4732]: I0131 09:03:24.900564 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-5s6dp"] Jan 31 09:03:24 crc kubenswrapper[4732]: I0131 09:03:24.901579 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Jan 31 09:03:24 crc kubenswrapper[4732]: I0131 09:03:24.902028 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-r9cz8"] Jan 31 09:03:24 crc kubenswrapper[4732]: I0131 09:03:24.922118 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Jan 31 09:03:24 crc kubenswrapper[4732]: I0131 09:03:24.940971 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Jan 31 09:03:24 crc kubenswrapper[4732]: W0131 09:03:24.960498 4732 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6377a401_b10b_455a_8906_f6706302b91f.slice/crio-d339f7b2a8861419aa4238e205cfaa9dcdfd934fadeaf613a8eadba1a515f45e WatchSource:0}: Error finding container d339f7b2a8861419aa4238e205cfaa9dcdfd934fadeaf613a8eadba1a515f45e: Status 404 returned error can't find the container with id d339f7b2a8861419aa4238e205cfaa9dcdfd934fadeaf613a8eadba1a515f45e Jan 31 09:03:24 crc kubenswrapper[4732]: I0131 09:03:24.961692 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Jan 31 09:03:24 crc kubenswrapper[4732]: W0131 09:03:24.962881 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod541ea3c2_891c_4c3e_81fd_9d340112c62b.slice/crio-2ecd69d7eb9bd214f9aff82913ddef7756830d15321e64e3fd947685beb0e5f0 WatchSource:0}: Error finding container 2ecd69d7eb9bd214f9aff82913ddef7756830d15321e64e3fd947685beb0e5f0: Status 404 returned error can't find the container with id 2ecd69d7eb9bd214f9aff82913ddef7756830d15321e64e3fd947685beb0e5f0 Jan 31 09:03:24 crc kubenswrapper[4732]: I0131 09:03:24.981722 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.002055 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.021491 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.043745 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.061384 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.081638 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.119531 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sz728\" (UniqueName: \"kubernetes.io/projected/b35d0df8-53f0-4787-b0b4-c93be28f0127-kube-api-access-sz728\") pod \"console-f9d7485db-8t8ks\" (UID: \"b35d0df8-53f0-4787-b0b4-c93be28f0127\") " pod="openshift-console/console-f9d7485db-8t8ks" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.119705 4732 request.go:700] Waited for 1.941511015s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-dns-operator/serviceaccounts/dns-operator/token Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.136453 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqgz8\" (UniqueName: \"kubernetes.io/projected/8cc29c02-baeb-4f46-92d6-684343509ae1-kube-api-access-vqgz8\") pod \"dns-operator-744455d44c-vpnm5\" (UID: \"8cc29c02-baeb-4f46-92d6-684343509ae1\") " pod="openshift-dns-operator/dns-operator-744455d44c-vpnm5" Jan 31 09:03:25 crc kubenswrapper[4732]: 
I0131 09:03:25.160720 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxdbz\" (UniqueName: \"kubernetes.io/projected/639dacb9-2ea3-49d2-b5c4-996992c8e16a-kube-api-access-qxdbz\") pod \"etcd-operator-b45778765-fsss9\" (UID: \"639dacb9-2ea3-49d2-b5c4-996992c8e16a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fsss9" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.181269 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvc79\" (UniqueName: \"kubernetes.io/projected/64e4fed2-f31d-4d3f-ad77-d55fdddacb4d-kube-api-access-nvc79\") pod \"ingress-operator-5b745b69d9-44wvz\" (UID: \"64e4fed2-f31d-4d3f-ad77-d55fdddacb4d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-44wvz" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.197271 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/64e4fed2-f31d-4d3f-ad77-d55fdddacb4d-bound-sa-token\") pod \"ingress-operator-5b745b69d9-44wvz\" (UID: \"64e4fed2-f31d-4d3f-ad77-d55fdddacb4d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-44wvz" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.216065 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lsmw\" (UniqueName: \"kubernetes.io/projected/c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20-kube-api-access-9lsmw\") pod \"oauth-openshift-558db77b4-c8t6l\" (UID: \"c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20\") " pod="openshift-authentication/oauth-openshift-558db77b4-c8t6l" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.234795 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cba3ef6a-8439-4317-bae9-01618d78512a-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-dv6hv\" (UID: \"cba3ef6a-8439-4317-bae9-01618d78512a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dv6hv" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.241596 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.261379 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.280876 4732 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.283505 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hzs92" event={"ID":"576b5a44-3c4c-4905-8d89-caed3b1eb43f","Type":"ContainerStarted","Data":"5eb2173f2f321ecef00d9ddbca74e917d0284c4b6d0c5c9a6a07b3f57416a7ef"} Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.283551 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hzs92" event={"ID":"576b5a44-3c4c-4905-8d89-caed3b1eb43f","Type":"ContainerStarted","Data":"57238662a5c119eb0c282562328b4dd2ebd61b348f8f10d57093c2429537bfd7"} Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.285443 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-xprfh" 
event={"ID":"81b523ca-b564-45d4-bad5-f7e236f2e6d0","Type":"ContainerStarted","Data":"b938d3308a52bdac0bd04510a4849e86ea98820183607b27ed9d059b6063330b"} Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.287385 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-tg4xc" event={"ID":"219a04b6-e7bd-4138-bcc7-4f650537aa24","Type":"ContainerStarted","Data":"371f900003f95ceb406307c3e3064fc3164d972cb97c8e25812cb0e57334cb37"} Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.295291 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5s6dp" event={"ID":"541ea3c2-891c-4c3e-81fd-9d340112c62b","Type":"ContainerStarted","Data":"2ecd69d7eb9bd214f9aff82913ddef7756830d15321e64e3fd947685beb0e5f0"} Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.298200 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ncqs4" event={"ID":"922314ab-f199-4117-acab-bc641c1cda57","Type":"ContainerStarted","Data":"2a30e968ee3788a3dfb596a6a4ef6e3604ffde568b635c079c4305de05cec711"} Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.298232 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ncqs4" event={"ID":"922314ab-f199-4117-acab-bc641c1cda57","Type":"ContainerStarted","Data":"97ea83c954c01d03343eaa1002c3f9f954bc16cc777328cd201bd741063a8167"} Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.299681 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-r9cz8" event={"ID":"16c9233d-0b27-4994-bc3d-62d4ec86a4ec","Type":"ContainerStarted","Data":"3aa19a38da94888e1dd8da8784f9963d0705154dc8c4811ec56c194d07b436e7"} Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.301005 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.301282 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-54nxd" event={"ID":"e9ce5ab1-4da9-47a4-83b9-3c7d8af6d55a","Type":"ContainerStarted","Data":"dd6a20990ae3fd22e8557c81b6269326055d1e8ccf74975eda50405b7a5efb1f"} Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.301314 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-54nxd" event={"ID":"e9ce5ab1-4da9-47a4-83b9-3c7d8af6d55a","Type":"ContainerStarted","Data":"59a0a262020b0038e7bf24dab0914d3a017967ef6e3d45740704beb5029adfeb"} Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.302878 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sf8wd" event={"ID":"62bbd0d6-eba8-4737-9608-3f7a3dd6a157","Type":"ContainerStarted","Data":"ac6d8de23a15f21cf46244c3780fb205834aa9873e224185ee55a963dedc4550"} Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.303884 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-pxn6w" event={"ID":"38051ff1-1715-41dd-aa28-53aea32c8e05","Type":"ContainerStarted","Data":"16afdb2e2edaf902175b439aaa6cd5fe6fd9707673a88258a80788c960c9e523"} Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.303914 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-authentication-operator/authentication-operator-69f744f599-pxn6w" event={"ID":"38051ff1-1715-41dd-aa28-53aea32c8e05","Type":"ContainerStarted","Data":"6c603e1663f65b4e996a658c7c5999f24b23f3d861e541eb09c170b40d6fee9b"} Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.306451 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85mvk" event={"ID":"6377a401-b10b-455a-8906-f6706302b91f","Type":"ContainerStarted","Data":"d339f7b2a8861419aa4238e205cfaa9dcdfd934fadeaf613a8eadba1a515f45e"} Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.307808 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vkrgj" event={"ID":"edb14eaf-7738-4139-9b1b-9557e7e37ffc","Type":"ContainerStarted","Data":"dbeb278dd30abd13a8b2ea913fbd368c73fcf7210a7b11439c474d77cb50a386"} Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.309637 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-rt2jr" event={"ID":"81e1781e-a935-4f3f-b2aa-9a0807f43c73","Type":"ContainerStarted","Data":"40c260cfcf19a820277cc939bbd3087f26b08b3f91cab04be0242fa82609ec1b"} Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.309683 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-rt2jr" event={"ID":"81e1781e-a935-4f3f-b2aa-9a0807f43c73","Type":"ContainerStarted","Data":"69d539fbc0bd1b9a5ce51d70edcc880d02dc1cb8c8cb69d25652009a0ba0e8ef"} Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.309892 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-rt2jr" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.310756 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-76d6v" event={"ID":"499830ff-8add-4caf-b469-d1cbde569fb7","Type":"ContainerStarted","Data":"e422637523d7d6fb77c3fa0f61232c3f748e50892051351e56744b31456a2e86"} Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.310784 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-76d6v" event={"ID":"499830ff-8add-4caf-b469-d1cbde569fb7","Type":"ContainerStarted","Data":"641c54243ba760d90f524d9c623ae4bf9adfd23728fc476c0d4306671ed6afb3"} Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.310942 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-76d6v" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.311512 4732 patch_prober.go:28] interesting pod/downloads-7954f5f757-rt2jr container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.28:8080/\": dial tcp 10.217.0.28:8080: connect: connection refused" start-of-body= Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.311560 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-rt2jr" podUID="81e1781e-a935-4f3f-b2aa-9a0807f43c73" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.28:8080/\": dial tcp 10.217.0.28:8080: connect: connection refused" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.311879 4732 patch_prober.go:28] interesting pod/console-operator-58897d9998-76d6v container/console-operator namespace/openshift-console-operator: Readiness probe status=failure 
output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.311905 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-76d6v" podUID="499830ff-8add-4caf-b469-d1cbde569fb7" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.325050 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.333109 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-c8t6l" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.340961 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.343889 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-8t8ks" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.358645 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-vpnm5" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.361224 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.366866 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-fsss9" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.374544 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dv6hv" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.381818 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.389878 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-44wvz" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.403768 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.422723 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.443020 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.461307 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.490914 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.531845 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4ac602fa-14af-4ae0-a538-d73e938db036-installation-pull-secrets\") pod \"image-registry-697d97f7c8-99dtb\" (UID: \"4ac602fa-14af-4ae0-a538-d73e938db036\") " pod="openshift-image-registry/image-registry-697d97f7c8-99dtb" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.531896 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/075f442e-a691-4856-a6ea-e21f1dcbcb20-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-zcd2h\" (UID: \"075f442e-a691-4856-a6ea-e21f1dcbcb20\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zcd2h" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.531945 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4ac602fa-14af-4ae0-a538-d73e938db036-registry-tls\") pod \"image-registry-697d97f7c8-99dtb\" (UID: \"4ac602fa-14af-4ae0-a538-d73e938db036\") " pod="openshift-image-registry/image-registry-697d97f7c8-99dtb" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.531971 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnpjp\" (UniqueName: \"kubernetes.io/projected/075f442e-a691-4856-a6ea-e21f1dcbcb20-kube-api-access-hnpjp\") pod \"kube-storage-version-migrator-operator-b67b599dd-zcd2h\" (UID: \"075f442e-a691-4856-a6ea-e21f1dcbcb20\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zcd2h" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.532046 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4ac602fa-14af-4ae0-a538-d73e938db036-bound-sa-token\") pod \"image-registry-697d97f7c8-99dtb\" (UID: \"4ac602fa-14af-4ae0-a538-d73e938db036\") " pod="openshift-image-registry/image-registry-697d97f7c8-99dtb" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.532261 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/4ac602fa-14af-4ae0-a538-d73e938db036-trusted-ca\") pod \"image-registry-697d97f7c8-99dtb\" (UID: \"4ac602fa-14af-4ae0-a538-d73e938db036\") " pod="openshift-image-registry/image-registry-697d97f7c8-99dtb" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.532386 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4ac602fa-14af-4ae0-a538-d73e938db036-ca-trust-extracted\") pod \"image-registry-697d97f7c8-99dtb\" (UID: \"4ac602fa-14af-4ae0-a538-d73e938db036\") " pod="openshift-image-registry/image-registry-697d97f7c8-99dtb" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.532471 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bb2p9\" (UniqueName: \"kubernetes.io/projected/4ac602fa-14af-4ae0-a538-d73e938db036-kube-api-access-bb2p9\") pod \"image-registry-697d97f7c8-99dtb\" (UID: \"4ac602fa-14af-4ae0-a538-d73e938db036\") " pod="openshift-image-registry/image-registry-697d97f7c8-99dtb" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.532559 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-99dtb\" (UID: \"4ac602fa-14af-4ae0-a538-d73e938db036\") " pod="openshift-image-registry/image-registry-697d97f7c8-99dtb" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.532607 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4ac602fa-14af-4ae0-a538-d73e938db036-registry-certificates\") pod \"image-registry-697d97f7c8-99dtb\" (UID: \"4ac602fa-14af-4ae0-a538-d73e938db036\") " pod="openshift-image-registry/image-registry-697d97f7c8-99dtb" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.532633 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/075f442e-a691-4856-a6ea-e21f1dcbcb20-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-zcd2h\" (UID: \"075f442e-a691-4856-a6ea-e21f1dcbcb20\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zcd2h" Jan 31 09:03:25 crc kubenswrapper[4732]: E0131 09:03:25.533037 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:03:26.033015598 +0000 UTC m=+144.338891792 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-99dtb" (UID: "4ac602fa-14af-4ae0-a538-d73e938db036") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.541511 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.562346 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.634042 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:03:25 crc kubenswrapper[4732]: E0131 09:03:25.634447 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:03:26.13442105 +0000 UTC m=+144.440297254 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.634499 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/075f442e-a691-4856-a6ea-e21f1dcbcb20-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-zcd2h\" (UID: \"075f442e-a691-4856-a6ea-e21f1dcbcb20\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zcd2h" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.634541 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7c0ada8b-e2dc-418c-a43e-33789285388f-auth-proxy-config\") pod \"machine-config-operator-74547568cd-4sk2n\" (UID: \"7c0ada8b-e2dc-418c-a43e-33789285388f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4sk2n" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.635281 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/075f442e-a691-4856-a6ea-e21f1dcbcb20-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-zcd2h\" (UID: \"075f442e-a691-4856-a6ea-e21f1dcbcb20\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zcd2h" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.635335 4732 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6a680afa-dc56-4bf8-808c-b1c947c8fbf0-apiservice-cert\") pod \"packageserver-d55dfcdfc-6fwnh\" (UID: \"6a680afa-dc56-4bf8-808c-b1c947c8fbf0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6fwnh" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.635459 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j96n4\" (UniqueName: \"kubernetes.io/projected/498d64fc-0d0f-43c6-aaae-bd3c5f0d7873-kube-api-access-j96n4\") pod \"control-plane-machine-set-operator-78cbb6b69f-v69mc\" (UID: \"498d64fc-0d0f-43c6-aaae-bd3c5f0d7873\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v69mc" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.635543 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4ac602fa-14af-4ae0-a538-d73e938db036-bound-sa-token\") pod \"image-registry-697d97f7c8-99dtb\" (UID: \"4ac602fa-14af-4ae0-a538-d73e938db036\") " pod="openshift-image-registry/image-registry-697d97f7c8-99dtb" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.635724 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/89cec875-cd1f-4867-8b4b-ca72c57c974b-plugins-dir\") pod \"csi-hostpathplugin-c7j22\" (UID: \"89cec875-cd1f-4867-8b4b-ca72c57c974b\") " pod="hostpath-provisioner/csi-hostpathplugin-c7j22" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.635844 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12b99fdc-6d61-46e1-b093-b1b92efce54c-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-8xgh7\" (UID: \"12b99fdc-6d61-46e1-b093-b1b92efce54c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8xgh7" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.635875 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/6a680afa-dc56-4bf8-808c-b1c947c8fbf0-tmpfs\") pod \"packageserver-d55dfcdfc-6fwnh\" (UID: \"6a680afa-dc56-4bf8-808c-b1c947c8fbf0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6fwnh" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.635919 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/33169e52-3fee-462c-b341-46563ddbf5aa-serving-cert\") pod \"service-ca-operator-777779d784-bzk95\" (UID: \"33169e52-3fee-462c-b341-46563ddbf5aa\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-bzk95" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.636009 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s48bl\" (UniqueName: \"kubernetes.io/projected/56a9a21e-28d2-4386-9fe0-947c3a39ab6a-kube-api-access-s48bl\") pod \"migrator-59844c95c7-cx8cr\" (UID: \"56a9a21e-28d2-4386-9fe0-947c3a39ab6a\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-cx8cr" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.636053 4732 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/8814e7c8-5104-40f7-9761-4feedc15697b-stats-auth\") pod \"router-default-5444994796-h5q9f\" (UID: \"8814e7c8-5104-40f7-9761-4feedc15697b\") " pod="openshift-ingress/router-default-5444994796-h5q9f" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.636093 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffad13f5-fb20-46ac-b886-c7e5a29b6599-config\") pod \"kube-controller-manager-operator-78b949d7b-bpncg\" (UID: \"ffad13f5-fb20-46ac-b886-c7e5a29b6599\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bpncg" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.636126 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hprwd\" (UniqueName: \"kubernetes.io/projected/caaa7607-b47d-43ca-adff-f9135baf7262-kube-api-access-hprwd\") pod \"package-server-manager-789f6589d5-wq7bg\" (UID: \"caaa7607-b47d-43ca-adff-f9135baf7262\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-wq7bg" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.636207 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w46qw\" (UniqueName: \"kubernetes.io/projected/a3302e69-0f73-4974-a8ac-af1992933147-kube-api-access-w46qw\") pod \"machine-config-controller-84d6567774-llpv8\" (UID: \"a3302e69-0f73-4974-a8ac-af1992933147\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-llpv8" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.636237 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e27a7ff4-a3a6-4654-99f7-6ec9f49c7c54-config\") pod \"kube-apiserver-operator-766d6c64bb-4qmgq\" (UID: \"e27a7ff4-a3a6-4654-99f7-6ec9f49c7c54\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4qmgq" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.636290 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwh5d\" (UniqueName: \"kubernetes.io/projected/12b99fdc-6d61-46e1-b093-b1b92efce54c-kube-api-access-xwh5d\") pod \"openshift-controller-manager-operator-756b6f6bc6-8xgh7\" (UID: \"12b99fdc-6d61-46e1-b093-b1b92efce54c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8xgh7" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.636336 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4ac602fa-14af-4ae0-a538-d73e938db036-ca-trust-extracted\") pod \"image-registry-697d97f7c8-99dtb\" (UID: \"4ac602fa-14af-4ae0-a538-d73e938db036\") " pod="openshift-image-registry/image-registry-697d97f7c8-99dtb" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.636381 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a3302e69-0f73-4974-a8ac-af1992933147-proxy-tls\") pod \"machine-config-controller-84d6567774-llpv8\" (UID: \"a3302e69-0f73-4974-a8ac-af1992933147\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-llpv8" Jan 31 09:03:25 crc 
kubenswrapper[4732]: I0131 09:03:25.636406 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jzrl\" (UniqueName: \"kubernetes.io/projected/4149606a-3fbc-4da9-ba05-dc473b492a89-kube-api-access-2jzrl\") pod \"service-ca-9c57cc56f-wqf9f\" (UID: \"4149606a-3fbc-4da9-ba05-dc473b492a89\") " pod="openshift-service-ca/service-ca-9c57cc56f-wqf9f" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.636450 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bb2p9\" (UniqueName: \"kubernetes.io/projected/4ac602fa-14af-4ae0-a538-d73e938db036-kube-api-access-bb2p9\") pod \"image-registry-697d97f7c8-99dtb\" (UID: \"4ac602fa-14af-4ae0-a538-d73e938db036\") " pod="openshift-image-registry/image-registry-697d97f7c8-99dtb" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.636497 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ffad13f5-fb20-46ac-b886-c7e5a29b6599-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-bpncg\" (UID: \"ffad13f5-fb20-46ac-b886-c7e5a29b6599\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bpncg" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.636546 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67nvg\" (UniqueName: \"kubernetes.io/projected/f4d0ed50-aa9b-4a62-b340-882ddf73f008-kube-api-access-67nvg\") pod \"marketplace-operator-79b997595-ljds4\" (UID: \"f4d0ed50-aa9b-4a62-b340-882ddf73f008\") " pod="openshift-marketplace/marketplace-operator-79b997595-ljds4" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.636703 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/4149606a-3fbc-4da9-ba05-dc473b492a89-signing-key\") pod \"service-ca-9c57cc56f-wqf9f\" (UID: \"4149606a-3fbc-4da9-ba05-dc473b492a89\") " pod="openshift-service-ca/service-ca-9c57cc56f-wqf9f" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.636767 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33169e52-3fee-462c-b341-46563ddbf5aa-config\") pod \"service-ca-operator-777779d784-bzk95\" (UID: \"33169e52-3fee-462c-b341-46563ddbf5aa\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-bzk95" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.636879 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1e6408b2-d73d-4fc3-86f7-d29ea59ad32e-cert\") pod \"ingress-canary-hn5wx\" (UID: \"1e6408b2-d73d-4fc3-86f7-d29ea59ad32e\") " pod="openshift-ingress-canary/ingress-canary-hn5wx" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.636908 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4ac602fa-14af-4ae0-a538-d73e938db036-ca-trust-extracted\") pod \"image-registry-697d97f7c8-99dtb\" (UID: \"4ac602fa-14af-4ae0-a538-d73e938db036\") " pod="openshift-image-registry/image-registry-697d97f7c8-99dtb" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.636916 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-volume\" (UniqueName: \"kubernetes.io/configmap/0558933a-c8d6-45dc-aeaf-af86190b15a0-config-volume\") pod \"collect-profiles-29497500-p2gn7\" (UID: \"0558933a-c8d6-45dc-aeaf-af86190b15a0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497500-p2gn7" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.637085 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6a680afa-dc56-4bf8-808c-b1c947c8fbf0-webhook-cert\") pod \"packageserver-d55dfcdfc-6fwnh\" (UID: \"6a680afa-dc56-4bf8-808c-b1c947c8fbf0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6fwnh" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.637182 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/058f5386-f340-4a52-bfc8-9b1a60515c9b-profile-collector-cert\") pod \"olm-operator-6b444d44fb-7fqsl\" (UID: \"058f5386-f340-4a52-bfc8-9b1a60515c9b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7fqsl" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.637244 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-99dtb\" (UID: \"4ac602fa-14af-4ae0-a538-d73e938db036\") " pod="openshift-image-registry/image-registry-697d97f7c8-99dtb" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.637289 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8814e7c8-5104-40f7-9761-4feedc15697b-metrics-certs\") pod \"router-default-5444994796-h5q9f\" (UID: \"8814e7c8-5104-40f7-9761-4feedc15697b\") " pod="openshift-ingress/router-default-5444994796-h5q9f" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.637343 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llg68\" (UniqueName: \"kubernetes.io/projected/1d02584d-db7d-4bc0-8cd9-33081993309b-kube-api-access-llg68\") pod \"catalog-operator-68c6474976-wb8wq\" (UID: \"1d02584d-db7d-4bc0-8cd9-33081993309b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wb8wq" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.637365 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzgzf\" (UniqueName: \"kubernetes.io/projected/eb80e0fc-6378-4d0d-a8b0-6c662073ed4d-kube-api-access-wzgzf\") pod \"dns-default-f78bs\" (UID: \"eb80e0fc-6378-4d0d-a8b0-6c662073ed4d\") " pod="openshift-dns/dns-default-f78bs" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.637434 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4ac602fa-14af-4ae0-a538-d73e938db036-registry-certificates\") pod \"image-registry-697d97f7c8-99dtb\" (UID: \"4ac602fa-14af-4ae0-a538-d73e938db036\") " pod="openshift-image-registry/image-registry-697d97f7c8-99dtb" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.637466 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: 
\"kubernetes.io/host-path/89cec875-cd1f-4867-8b4b-ca72c57c974b-csi-data-dir\") pod \"csi-hostpathplugin-c7j22\" (UID: \"89cec875-cd1f-4867-8b4b-ca72c57c974b\") " pod="hostpath-provisioner/csi-hostpathplugin-c7j22" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.637484 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12b99fdc-6d61-46e1-b093-b1b92efce54c-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-8xgh7\" (UID: \"12b99fdc-6d61-46e1-b093-b1b92efce54c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8xgh7" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.637508 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e27a7ff4-a3a6-4654-99f7-6ec9f49c7c54-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-4qmgq\" (UID: \"e27a7ff4-a3a6-4654-99f7-6ec9f49c7c54\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4qmgq" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.637525 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rb72h\" (UniqueName: \"kubernetes.io/projected/33169e52-3fee-462c-b341-46563ddbf5aa-kube-api-access-rb72h\") pod \"service-ca-operator-777779d784-bzk95\" (UID: \"33169e52-3fee-462c-b341-46563ddbf5aa\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-bzk95" Jan 31 09:03:25 crc kubenswrapper[4732]: E0131 09:03:25.637687 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:03:26.137634938 +0000 UTC m=+144.443511142 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-99dtb" (UID: "4ac602fa-14af-4ae0-a538-d73e938db036") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.637871 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4ac602fa-14af-4ae0-a538-d73e938db036-installation-pull-secrets\") pod \"image-registry-697d97f7c8-99dtb\" (UID: \"4ac602fa-14af-4ae0-a538-d73e938db036\") " pod="openshift-image-registry/image-registry-697d97f7c8-99dtb" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.638080 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knh8t\" (UniqueName: \"kubernetes.io/projected/8814e7c8-5104-40f7-9761-4feedc15697b-kube-api-access-knh8t\") pod \"router-default-5444994796-h5q9f\" (UID: \"8814e7c8-5104-40f7-9761-4feedc15697b\") " pod="openshift-ingress/router-default-5444994796-h5q9f" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.638185 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtvhl\" (UniqueName: \"kubernetes.io/projected/d64c27a7-b418-450e-9067-dde0cd145597-kube-api-access-xtvhl\") pod \"machine-config-server-dmhxf\" (UID: \"d64c27a7-b418-450e-9067-dde0cd145597\") " pod="openshift-machine-config-operator/machine-config-server-dmhxf" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.638405 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4ac602fa-14af-4ae0-a538-d73e938db036-registry-tls\") pod \"image-registry-697d97f7c8-99dtb\" (UID: \"4ac602fa-14af-4ae0-a538-d73e938db036\") " pod="openshift-image-registry/image-registry-697d97f7c8-99dtb" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.638541 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/caaa7607-b47d-43ca-adff-f9135baf7262-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-wq7bg\" (UID: \"caaa7607-b47d-43ca-adff-f9135baf7262\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-wq7bg" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.638591 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1d02584d-db7d-4bc0-8cd9-33081993309b-srv-cert\") pod \"catalog-operator-68c6474976-wb8wq\" (UID: \"1d02584d-db7d-4bc0-8cd9-33081993309b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wb8wq"
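
The MountVolume.MountDevice failure above is not fatal: the operation is re-queued with exponential backoff, and the "durationBeforeRetry 500ms" in the error is the initial delay. Below is a minimal Go sketch of that backoff bookkeeping; it assumes the doubling-with-a-cap behavior of kubelet's nested pending operations, and the type names and the cap value are illustrative, not quoted from the kubelet source.

// Sketch of the exponential backoff implied by "durationBeforeRetry 500ms"
// in the error entry above. Assumption: each failure doubles the wait up
// to a cap; expBackoff and maxDurationBeforeRetry are illustrative, not
// the actual kubelet implementation.
package main

import (
	"fmt"
	"time"
)

const (
	initialDurationBeforeRetry = 500 * time.Millisecond        // matches the log line
	maxDurationBeforeRetry     = 2*time.Minute + 2*time.Second // assumed cap
)

// expBackoff remembers when an operation last failed and how long to wait.
type expBackoff struct {
	lastErrorTime time.Time
	duration      time.Duration
}

// update records a failure and grows the retry delay.
func (b *expBackoff) update(now time.Time) {
	if b.duration == 0 {
		b.duration = initialDurationBeforeRetry
	} else if d := b.duration * 2; d < maxDurationBeforeRetry {
		b.duration = d
	} else {
		b.duration = maxDurationBeforeRetry
	}
	b.lastErrorTime = now
}

// retryAllowedAt yields the "No retries permitted until ..." timestamp.
func (b *expBackoff) retryAllowedAt() time.Time {
	return b.lastErrorTime.Add(b.duration)
}

func main() {
	var b expBackoff
	now := time.Now()
	for i := 1; i <= 4; i++ {
		b.update(now)
		fmt.Printf("failure %d: no retries permitted until %s (durationBeforeRetry %s)\n",
			i, b.retryAllowedAt().Format(time.RFC3339Nano), b.duration)
		now = b.retryAllowedAt() // assume the retried attempt fails again immediately
	}
}

This is consistent with the entry above: the failure logged at 09:03:25.637687 schedules the retry for 09:03:26.137634, about 500ms later.
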
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zcd2h" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.638753 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7c0ada8b-e2dc-418c-a43e-33789285388f-images\") pod \"machine-config-operator-74547568cd-4sk2n\" (UID: \"7c0ada8b-e2dc-418c-a43e-33789285388f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4sk2n" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.638781 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/4149606a-3fbc-4da9-ba05-dc473b492a89-signing-cabundle\") pod \"service-ca-9c57cc56f-wqf9f\" (UID: \"4149606a-3fbc-4da9-ba05-dc473b492a89\") " pod="openshift-service-ca/service-ca-9c57cc56f-wqf9f" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.638849 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/058f5386-f340-4a52-bfc8-9b1a60515c9b-srv-cert\") pod \"olm-operator-6b444d44fb-7fqsl\" (UID: \"058f5386-f340-4a52-bfc8-9b1a60515c9b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7fqsl" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.638878 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dm54m\" (UniqueName: \"kubernetes.io/projected/058f5386-f340-4a52-bfc8-9b1a60515c9b-kube-api-access-dm54m\") pod \"olm-operator-6b444d44fb-7fqsl\" (UID: \"058f5386-f340-4a52-bfc8-9b1a60515c9b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7fqsl" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.638898 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8chpb\" (UniqueName: \"kubernetes.io/projected/7c0ada8b-e2dc-418c-a43e-33789285388f-kube-api-access-8chpb\") pod \"machine-config-operator-74547568cd-4sk2n\" (UID: \"7c0ada8b-e2dc-418c-a43e-33789285388f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4sk2n" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.638937 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/498d64fc-0d0f-43c6-aaae-bd3c5f0d7873-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-v69mc\" (UID: \"498d64fc-0d0f-43c6-aaae-bd3c5f0d7873\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v69mc" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.638958 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a3302e69-0f73-4974-a8ac-af1992933147-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-llpv8\" (UID: \"a3302e69-0f73-4974-a8ac-af1992933147\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-llpv8" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.639022 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4ac602fa-14af-4ae0-a538-d73e938db036-trusted-ca\") pod 
\"image-registry-697d97f7c8-99dtb\" (UID: \"4ac602fa-14af-4ae0-a538-d73e938db036\") " pod="openshift-image-registry/image-registry-697d97f7c8-99dtb" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.639125 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ffad13f5-fb20-46ac-b886-c7e5a29b6599-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-bpncg\" (UID: \"ffad13f5-fb20-46ac-b886-c7e5a29b6599\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bpncg" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.639240 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0558933a-c8d6-45dc-aeaf-af86190b15a0-secret-volume\") pod \"collect-profiles-29497500-p2gn7\" (UID: \"0558933a-c8d6-45dc-aeaf-af86190b15a0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497500-p2gn7" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.639266 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eb80e0fc-6378-4d0d-a8b0-6c662073ed4d-config-volume\") pod \"dns-default-f78bs\" (UID: \"eb80e0fc-6378-4d0d-a8b0-6c662073ed4d\") " pod="openshift-dns/dns-default-f78bs" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.639317 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/89cec875-cd1f-4867-8b4b-ca72c57c974b-mountpoint-dir\") pod \"csi-hostpathplugin-c7j22\" (UID: \"89cec875-cd1f-4867-8b4b-ca72c57c974b\") " pod="hostpath-provisioner/csi-hostpathplugin-c7j22" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.639352 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8814e7c8-5104-40f7-9761-4feedc15697b-service-ca-bundle\") pod \"router-default-5444994796-h5q9f\" (UID: \"8814e7c8-5104-40f7-9761-4feedc15697b\") " pod="openshift-ingress/router-default-5444994796-h5q9f" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.639374 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/d64c27a7-b418-450e-9067-dde0cd145597-certs\") pod \"machine-config-server-dmhxf\" (UID: \"d64c27a7-b418-450e-9067-dde0cd145597\") " pod="openshift-machine-config-operator/machine-config-server-dmhxf" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.639501 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/d64c27a7-b418-450e-9067-dde0cd145597-node-bootstrap-token\") pod \"machine-config-server-dmhxf\" (UID: \"d64c27a7-b418-450e-9067-dde0cd145597\") " pod="openshift-machine-config-operator/machine-config-server-dmhxf" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.639522 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/1d02584d-db7d-4bc0-8cd9-33081993309b-profile-collector-cert\") pod \"catalog-operator-68c6474976-wb8wq\" (UID: \"1d02584d-db7d-4bc0-8cd9-33081993309b\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wb8wq" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.639539 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e27a7ff4-a3a6-4654-99f7-6ec9f49c7c54-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-4qmgq\" (UID: \"e27a7ff4-a3a6-4654-99f7-6ec9f49c7c54\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4qmgq" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.639604 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f4d0ed50-aa9b-4a62-b340-882ddf73f008-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-ljds4\" (UID: \"f4d0ed50-aa9b-4a62-b340-882ddf73f008\") " pod="openshift-marketplace/marketplace-operator-79b997595-ljds4" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.640623 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4ac602fa-14af-4ae0-a538-d73e938db036-registry-certificates\") pod \"image-registry-697d97f7c8-99dtb\" (UID: \"4ac602fa-14af-4ae0-a538-d73e938db036\") " pod="openshift-image-registry/image-registry-697d97f7c8-99dtb" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.640673 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/8814e7c8-5104-40f7-9761-4feedc15697b-default-certificate\") pod \"router-default-5444994796-h5q9f\" (UID: \"8814e7c8-5104-40f7-9761-4feedc15697b\") " pod="openshift-ingress/router-default-5444994796-h5q9f" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.640714 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/89cec875-cd1f-4867-8b4b-ca72c57c974b-socket-dir\") pod \"csi-hostpathplugin-c7j22\" (UID: \"89cec875-cd1f-4867-8b4b-ca72c57c974b\") " pod="hostpath-provisioner/csi-hostpathplugin-c7j22" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.641117 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6h7wn\" (UniqueName: \"kubernetes.io/projected/2325b276-ee4d-438d-b9d6-d7de3024ba96-kube-api-access-6h7wn\") pod \"multus-admission-controller-857f4d67dd-kgqfp\" (UID: \"2325b276-ee4d-438d-b9d6-d7de3024ba96\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-kgqfp" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.641358 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4ac602fa-14af-4ae0-a538-d73e938db036-trusted-ca\") pod \"image-registry-697d97f7c8-99dtb\" (UID: \"4ac602fa-14af-4ae0-a538-d73e938db036\") " pod="openshift-image-registry/image-registry-697d97f7c8-99dtb" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.641526 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwqdf\" (UniqueName: \"kubernetes.io/projected/1e6408b2-d73d-4fc3-86f7-d29ea59ad32e-kube-api-access-rwqdf\") pod \"ingress-canary-hn5wx\" (UID: \"1e6408b2-d73d-4fc3-86f7-d29ea59ad32e\") " pod="openshift-ingress-canary/ingress-canary-hn5wx" Jan 31 09:03:25 crc kubenswrapper[4732]: 
I0131 09:03:25.643368 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/eb80e0fc-6378-4d0d-a8b0-6c662073ed4d-metrics-tls\") pod \"dns-default-f78bs\" (UID: \"eb80e0fc-6378-4d0d-a8b0-6c662073ed4d\") " pod="openshift-dns/dns-default-f78bs" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.643409 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/89cec875-cd1f-4867-8b4b-ca72c57c974b-registration-dir\") pod \"csi-hostpathplugin-c7j22\" (UID: \"89cec875-cd1f-4867-8b4b-ca72c57c974b\") " pod="hostpath-provisioner/csi-hostpathplugin-c7j22" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.643443 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7c0ada8b-e2dc-418c-a43e-33789285388f-proxy-tls\") pod \"machine-config-operator-74547568cd-4sk2n\" (UID: \"7c0ada8b-e2dc-418c-a43e-33789285388f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4sk2n" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.643926 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/075f442e-a691-4856-a6ea-e21f1dcbcb20-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-zcd2h\" (UID: \"075f442e-a691-4856-a6ea-e21f1dcbcb20\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zcd2h" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.643958 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dh5t6\" (UniqueName: \"kubernetes.io/projected/89cec875-cd1f-4867-8b4b-ca72c57c974b-kube-api-access-dh5t6\") pod \"csi-hostpathplugin-c7j22\" (UID: \"89cec875-cd1f-4867-8b4b-ca72c57c974b\") " pod="hostpath-provisioner/csi-hostpathplugin-c7j22" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.643983 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f4d0ed50-aa9b-4a62-b340-882ddf73f008-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-ljds4\" (UID: \"f4d0ed50-aa9b-4a62-b340-882ddf73f008\") " pod="openshift-marketplace/marketplace-operator-79b997595-ljds4" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.644052 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2325b276-ee4d-438d-b9d6-d7de3024ba96-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-kgqfp\" (UID: \"2325b276-ee4d-438d-b9d6-d7de3024ba96\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-kgqfp" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.644161 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hn2dm\" (UniqueName: \"kubernetes.io/projected/6a680afa-dc56-4bf8-808c-b1c947c8fbf0-kube-api-access-hn2dm\") pod \"packageserver-d55dfcdfc-6fwnh\" (UID: \"6a680afa-dc56-4bf8-808c-b1c947c8fbf0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6fwnh" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.644340 4732 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txg4c\" (UniqueName: \"kubernetes.io/projected/0558933a-c8d6-45dc-aeaf-af86190b15a0-kube-api-access-txg4c\") pod \"collect-profiles-29497500-p2gn7\" (UID: \"0558933a-c8d6-45dc-aeaf-af86190b15a0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497500-p2gn7" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.646172 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4ac602fa-14af-4ae0-a538-d73e938db036-installation-pull-secrets\") pod \"image-registry-697d97f7c8-99dtb\" (UID: \"4ac602fa-14af-4ae0-a538-d73e938db036\") " pod="openshift-image-registry/image-registry-697d97f7c8-99dtb" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.646235 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4ac602fa-14af-4ae0-a538-d73e938db036-registry-tls\") pod \"image-registry-697d97f7c8-99dtb\" (UID: \"4ac602fa-14af-4ae0-a538-d73e938db036\") " pod="openshift-image-registry/image-registry-697d97f7c8-99dtb" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.647366 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/075f442e-a691-4856-a6ea-e21f1dcbcb20-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-zcd2h\" (UID: \"075f442e-a691-4856-a6ea-e21f1dcbcb20\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zcd2h" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.681367 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4ac602fa-14af-4ae0-a538-d73e938db036-bound-sa-token\") pod \"image-registry-697d97f7c8-99dtb\" (UID: \"4ac602fa-14af-4ae0-a538-d73e938db036\") " pod="openshift-image-registry/image-registry-697d97f7c8-99dtb" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.714974 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bb2p9\" (UniqueName: \"kubernetes.io/projected/4ac602fa-14af-4ae0-a538-d73e938db036-kube-api-access-bb2p9\") pod \"image-registry-697d97f7c8-99dtb\" (UID: \"4ac602fa-14af-4ae0-a538-d73e938db036\") " pod="openshift-image-registry/image-registry-697d97f7c8-99dtb" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.721362 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnpjp\" (UniqueName: \"kubernetes.io/projected/075f442e-a691-4856-a6ea-e21f1dcbcb20-kube-api-access-hnpjp\") pod \"kube-storage-version-migrator-operator-b67b599dd-zcd2h\" (UID: \"075f442e-a691-4856-a6ea-e21f1dcbcb20\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zcd2h" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.730843 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dv6hv"] Jan 31 09:03:25 crc kubenswrapper[4732]: W0131 09:03:25.742946 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcba3ef6a_8439_4317_bae9_01618d78512a.slice/crio-47909dae9b68f113cd9b76b94a15367d6632b514c137e8042997f03d94d0f645 WatchSource:0}: Error finding container 
47909dae9b68f113cd9b76b94a15367d6632b514c137e8042997f03d94d0f645: Status 404 returned error can't find the container with id 47909dae9b68f113cd9b76b94a15367d6632b514c137e8042997f03d94d0f645 Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.748247 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:03:25 crc kubenswrapper[4732]: E0131 09:03:25.748451 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:03:26.248412816 +0000 UTC m=+144.554289020 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.749289 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6a680afa-dc56-4bf8-808c-b1c947c8fbf0-webhook-cert\") pod \"packageserver-d55dfcdfc-6fwnh\" (UID: \"6a680afa-dc56-4bf8-808c-b1c947c8fbf0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6fwnh" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.749351 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/058f5386-f340-4a52-bfc8-9b1a60515c9b-profile-collector-cert\") pod \"olm-operator-6b444d44fb-7fqsl\" (UID: \"058f5386-f340-4a52-bfc8-9b1a60515c9b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7fqsl" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.749407 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-99dtb\" (UID: \"4ac602fa-14af-4ae0-a538-d73e938db036\") " pod="openshift-image-registry/image-registry-697d97f7c8-99dtb" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.749457 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8814e7c8-5104-40f7-9761-4feedc15697b-metrics-certs\") pod \"router-default-5444994796-h5q9f\" (UID: \"8814e7c8-5104-40f7-9761-4feedc15697b\") " pod="openshift-ingress/router-default-5444994796-h5q9f"
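
Both this UnmountVolume.TearDown failure and the MountVolume.MountDevice failure earlier report the same root cause: kubevirt.io.hostpath-provisioner is not yet in kubelet's list of registered CSI drivers. The driver is served by the hostpath-provisioner/csi-hostpathplugin-c7j22 pod, whose own volumes are still being mounted in these very entries, so every CSI call fails a name lookup until that plugin registers itself with kubelet. A minimal Go sketch of that lookup-then-register race follows; the registry type and the socket path are illustrative assumptions, not kubelet's actual plugin manager.

// Sketch of the failure mode behind "driver name kubevirt.io.hostpath-provisioner
// not found in the list of registered CSI drivers". Assumption: a simple
// name-to-endpoint map stands in for kubelet's plugin registry; the socket
// path in main is illustrative.
package main

import (
	"fmt"
	"sync"
)

// csiDriverRegistry maps a CSI driver name to its plugin endpoint.
type csiDriverRegistry struct {
	mu      sync.RWMutex
	drivers map[string]string // driver name -> unix socket path
}

func newRegistry() *csiDriverRegistry {
	return &csiDriverRegistry{drivers: make(map[string]string)}
}

// register is called once the driver pod is up and announces itself.
func (r *csiDriverRegistry) register(name, endpoint string) {
	r.mu.Lock()
	defer r.mu.Unlock()
	r.drivers[name] = endpoint
}

// clientFor fails exactly like the log entries while the driver is absent.
func (r *csiDriverRegistry) clientFor(name string) (string, error) {
	r.mu.RLock()
	defer r.mu.RUnlock()
	ep, ok := r.drivers[name]
	if !ok {
		return "", fmt.Errorf("driver name %s not found in the list of registered CSI drivers", name)
	}
	return ep, nil
}

func main() {
	reg := newRegistry()
	// Before the csi-hostpathplugin pod is running: lookup fails, operation is retried.
	if _, err := reg.clientFor("kubevirt.io.hostpath-provisioner"); err != nil {
		fmt.Println("mount attempt:", err)
	}
	// After registration the same lookup succeeds and the retry can proceed.
	reg.register("kubevirt.io.hostpath-provisioner", "/var/lib/kubelet/plugins/csi-hostpath/csi.sock")
	ep, _ := reg.clientFor("kubevirt.io.hostpath-provisioner")
	fmt.Println("driver endpoint:", ep)
}

Until that registration happens, the reconciler keeps re-queuing the same operations, which is why pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 appears again below with the identical error.
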
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wb8wq" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.749572 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzgzf\" (UniqueName: \"kubernetes.io/projected/eb80e0fc-6378-4d0d-a8b0-6c662073ed4d-kube-api-access-wzgzf\") pod \"dns-default-f78bs\" (UID: \"eb80e0fc-6378-4d0d-a8b0-6c662073ed4d\") " pod="openshift-dns/dns-default-f78bs" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.749605 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/89cec875-cd1f-4867-8b4b-ca72c57c974b-csi-data-dir\") pod \"csi-hostpathplugin-c7j22\" (UID: \"89cec875-cd1f-4867-8b4b-ca72c57c974b\") " pod="hostpath-provisioner/csi-hostpathplugin-c7j22" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.749646 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12b99fdc-6d61-46e1-b093-b1b92efce54c-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-8xgh7\" (UID: \"12b99fdc-6d61-46e1-b093-b1b92efce54c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8xgh7" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.749691 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e27a7ff4-a3a6-4654-99f7-6ec9f49c7c54-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-4qmgq\" (UID: \"e27a7ff4-a3a6-4654-99f7-6ec9f49c7c54\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4qmgq" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.749712 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rb72h\" (UniqueName: \"kubernetes.io/projected/33169e52-3fee-462c-b341-46563ddbf5aa-kube-api-access-rb72h\") pod \"service-ca-operator-777779d784-bzk95\" (UID: \"33169e52-3fee-462c-b341-46563ddbf5aa\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-bzk95" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.749753 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knh8t\" (UniqueName: \"kubernetes.io/projected/8814e7c8-5104-40f7-9761-4feedc15697b-kube-api-access-knh8t\") pod \"router-default-5444994796-h5q9f\" (UID: \"8814e7c8-5104-40f7-9761-4feedc15697b\") " pod="openshift-ingress/router-default-5444994796-h5q9f" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.749777 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/caaa7607-b47d-43ca-adff-f9135baf7262-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-wq7bg\" (UID: \"caaa7607-b47d-43ca-adff-f9135baf7262\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-wq7bg" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.749824 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtvhl\" (UniqueName: \"kubernetes.io/projected/d64c27a7-b418-450e-9067-dde0cd145597-kube-api-access-xtvhl\") pod \"machine-config-server-dmhxf\" (UID: \"d64c27a7-b418-450e-9067-dde0cd145597\") " pod="openshift-machine-config-operator/machine-config-server-dmhxf" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.749850 
4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1d02584d-db7d-4bc0-8cd9-33081993309b-srv-cert\") pod \"catalog-operator-68c6474976-wb8wq\" (UID: \"1d02584d-db7d-4bc0-8cd9-33081993309b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wb8wq" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.749912 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/4149606a-3fbc-4da9-ba05-dc473b492a89-signing-cabundle\") pod \"service-ca-9c57cc56f-wqf9f\" (UID: \"4149606a-3fbc-4da9-ba05-dc473b492a89\") " pod="openshift-service-ca/service-ca-9c57cc56f-wqf9f" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.749945 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7c0ada8b-e2dc-418c-a43e-33789285388f-images\") pod \"machine-config-operator-74547568cd-4sk2n\" (UID: \"7c0ada8b-e2dc-418c-a43e-33789285388f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4sk2n" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.749983 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8chpb\" (UniqueName: \"kubernetes.io/projected/7c0ada8b-e2dc-418c-a43e-33789285388f-kube-api-access-8chpb\") pod \"machine-config-operator-74547568cd-4sk2n\" (UID: \"7c0ada8b-e2dc-418c-a43e-33789285388f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4sk2n" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.750024 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/058f5386-f340-4a52-bfc8-9b1a60515c9b-srv-cert\") pod \"olm-operator-6b444d44fb-7fqsl\" (UID: \"058f5386-f340-4a52-bfc8-9b1a60515c9b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7fqsl" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.750061 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dm54m\" (UniqueName: \"kubernetes.io/projected/058f5386-f340-4a52-bfc8-9b1a60515c9b-kube-api-access-dm54m\") pod \"olm-operator-6b444d44fb-7fqsl\" (UID: \"058f5386-f340-4a52-bfc8-9b1a60515c9b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7fqsl" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.750103 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/498d64fc-0d0f-43c6-aaae-bd3c5f0d7873-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-v69mc\" (UID: \"498d64fc-0d0f-43c6-aaae-bd3c5f0d7873\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v69mc" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.750126 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a3302e69-0f73-4974-a8ac-af1992933147-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-llpv8\" (UID: \"a3302e69-0f73-4974-a8ac-af1992933147\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-llpv8" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.750150 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ffad13f5-fb20-46ac-b886-c7e5a29b6599-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-bpncg\" (UID: \"ffad13f5-fb20-46ac-b886-c7e5a29b6599\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bpncg" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.750175 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0558933a-c8d6-45dc-aeaf-af86190b15a0-secret-volume\") pod \"collect-profiles-29497500-p2gn7\" (UID: \"0558933a-c8d6-45dc-aeaf-af86190b15a0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497500-p2gn7" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.750193 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eb80e0fc-6378-4d0d-a8b0-6c662073ed4d-config-volume\") pod \"dns-default-f78bs\" (UID: \"eb80e0fc-6378-4d0d-a8b0-6c662073ed4d\") " pod="openshift-dns/dns-default-f78bs" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.750231 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/89cec875-cd1f-4867-8b4b-ca72c57c974b-mountpoint-dir\") pod \"csi-hostpathplugin-c7j22\" (UID: \"89cec875-cd1f-4867-8b4b-ca72c57c974b\") " pod="hostpath-provisioner/csi-hostpathplugin-c7j22" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.750286 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8814e7c8-5104-40f7-9761-4feedc15697b-service-ca-bundle\") pod \"router-default-5444994796-h5q9f\" (UID: \"8814e7c8-5104-40f7-9761-4feedc15697b\") " pod="openshift-ingress/router-default-5444994796-h5q9f" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.750314 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/d64c27a7-b418-450e-9067-dde0cd145597-certs\") pod \"machine-config-server-dmhxf\" (UID: \"d64c27a7-b418-450e-9067-dde0cd145597\") " pod="openshift-machine-config-operator/machine-config-server-dmhxf" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.750414 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/d64c27a7-b418-450e-9067-dde0cd145597-node-bootstrap-token\") pod \"machine-config-server-dmhxf\" (UID: \"d64c27a7-b418-450e-9067-dde0cd145597\") " pod="openshift-machine-config-operator/machine-config-server-dmhxf" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.750634 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/1d02584d-db7d-4bc0-8cd9-33081993309b-profile-collector-cert\") pod \"catalog-operator-68c6474976-wb8wq\" (UID: \"1d02584d-db7d-4bc0-8cd9-33081993309b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wb8wq" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.750690 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/89cec875-cd1f-4867-8b4b-ca72c57c974b-socket-dir\") pod \"csi-hostpathplugin-c7j22\" (UID: \"89cec875-cd1f-4867-8b4b-ca72c57c974b\") " pod="hostpath-provisioner/csi-hostpathplugin-c7j22" Jan 31 
09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.750717 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e27a7ff4-a3a6-4654-99f7-6ec9f49c7c54-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-4qmgq\" (UID: \"e27a7ff4-a3a6-4654-99f7-6ec9f49c7c54\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4qmgq" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.750781 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f4d0ed50-aa9b-4a62-b340-882ddf73f008-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-ljds4\" (UID: \"f4d0ed50-aa9b-4a62-b340-882ddf73f008\") " pod="openshift-marketplace/marketplace-operator-79b997595-ljds4" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.750834 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/8814e7c8-5104-40f7-9761-4feedc15697b-default-certificate\") pod \"router-default-5444994796-h5q9f\" (UID: \"8814e7c8-5104-40f7-9761-4feedc15697b\") " pod="openshift-ingress/router-default-5444994796-h5q9f" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.750964 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6h7wn\" (UniqueName: \"kubernetes.io/projected/2325b276-ee4d-438d-b9d6-d7de3024ba96-kube-api-access-6h7wn\") pod \"multus-admission-controller-857f4d67dd-kgqfp\" (UID: \"2325b276-ee4d-438d-b9d6-d7de3024ba96\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-kgqfp" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.751106 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/89cec875-cd1f-4867-8b4b-ca72c57c974b-csi-data-dir\") pod \"csi-hostpathplugin-c7j22\" (UID: \"89cec875-cd1f-4867-8b4b-ca72c57c974b\") " pod="hostpath-provisioner/csi-hostpathplugin-c7j22" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.751048 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwqdf\" (UniqueName: \"kubernetes.io/projected/1e6408b2-d73d-4fc3-86f7-d29ea59ad32e-kube-api-access-rwqdf\") pod \"ingress-canary-hn5wx\" (UID: \"1e6408b2-d73d-4fc3-86f7-d29ea59ad32e\") " pod="openshift-ingress-canary/ingress-canary-hn5wx" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.751197 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/eb80e0fc-6378-4d0d-a8b0-6c662073ed4d-metrics-tls\") pod \"dns-default-f78bs\" (UID: \"eb80e0fc-6378-4d0d-a8b0-6c662073ed4d\") " pod="openshift-dns/dns-default-f78bs" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.751280 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/89cec875-cd1f-4867-8b4b-ca72c57c974b-registration-dir\") pod \"csi-hostpathplugin-c7j22\" (UID: \"89cec875-cd1f-4867-8b4b-ca72c57c974b\") " pod="hostpath-provisioner/csi-hostpathplugin-c7j22" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.751304 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7c0ada8b-e2dc-418c-a43e-33789285388f-proxy-tls\") pod \"machine-config-operator-74547568cd-4sk2n\" 
(UID: \"7c0ada8b-e2dc-418c-a43e-33789285388f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4sk2n" Jan 31 09:03:25 crc kubenswrapper[4732]: E0131 09:03:25.751336 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:03:26.251319234 +0000 UTC m=+144.557195438 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-99dtb" (UID: "4ac602fa-14af-4ae0-a538-d73e938db036") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.751379 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dh5t6\" (UniqueName: \"kubernetes.io/projected/89cec875-cd1f-4867-8b4b-ca72c57c974b-kube-api-access-dh5t6\") pod \"csi-hostpathplugin-c7j22\" (UID: \"89cec875-cd1f-4867-8b4b-ca72c57c974b\") " pod="hostpath-provisioner/csi-hostpathplugin-c7j22" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.751418 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f4d0ed50-aa9b-4a62-b340-882ddf73f008-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-ljds4\" (UID: \"f4d0ed50-aa9b-4a62-b340-882ddf73f008\") " pod="openshift-marketplace/marketplace-operator-79b997595-ljds4" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.751469 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2325b276-ee4d-438d-b9d6-d7de3024ba96-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-kgqfp\" (UID: \"2325b276-ee4d-438d-b9d6-d7de3024ba96\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-kgqfp" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.751492 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hn2dm\" (UniqueName: \"kubernetes.io/projected/6a680afa-dc56-4bf8-808c-b1c947c8fbf0-kube-api-access-hn2dm\") pod \"packageserver-d55dfcdfc-6fwnh\" (UID: \"6a680afa-dc56-4bf8-808c-b1c947c8fbf0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6fwnh" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.751517 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txg4c\" (UniqueName: \"kubernetes.io/projected/0558933a-c8d6-45dc-aeaf-af86190b15a0-kube-api-access-txg4c\") pod \"collect-profiles-29497500-p2gn7\" (UID: \"0558933a-c8d6-45dc-aeaf-af86190b15a0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497500-p2gn7" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.751547 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7c0ada8b-e2dc-418c-a43e-33789285388f-auth-proxy-config\") pod \"machine-config-operator-74547568cd-4sk2n\" (UID: \"7c0ada8b-e2dc-418c-a43e-33789285388f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4sk2n" Jan 31 09:03:25 crc 
kubenswrapper[4732]: I0131 09:03:25.751614 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6a680afa-dc56-4bf8-808c-b1c947c8fbf0-apiservice-cert\") pod \"packageserver-d55dfcdfc-6fwnh\" (UID: \"6a680afa-dc56-4bf8-808c-b1c947c8fbf0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6fwnh" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.751643 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j96n4\" (UniqueName: \"kubernetes.io/projected/498d64fc-0d0f-43c6-aaae-bd3c5f0d7873-kube-api-access-j96n4\") pod \"control-plane-machine-set-operator-78cbb6b69f-v69mc\" (UID: \"498d64fc-0d0f-43c6-aaae-bd3c5f0d7873\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v69mc" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.751677 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/6a680afa-dc56-4bf8-808c-b1c947c8fbf0-tmpfs\") pod \"packageserver-d55dfcdfc-6fwnh\" (UID: \"6a680afa-dc56-4bf8-808c-b1c947c8fbf0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6fwnh" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.751781 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/89cec875-cd1f-4867-8b4b-ca72c57c974b-plugins-dir\") pod \"csi-hostpathplugin-c7j22\" (UID: \"89cec875-cd1f-4867-8b4b-ca72c57c974b\") " pod="hostpath-provisioner/csi-hostpathplugin-c7j22" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.751805 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12b99fdc-6d61-46e1-b093-b1b92efce54c-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-8xgh7\" (UID: \"12b99fdc-6d61-46e1-b093-b1b92efce54c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8xgh7" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.751829 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s48bl\" (UniqueName: \"kubernetes.io/projected/56a9a21e-28d2-4386-9fe0-947c3a39ab6a-kube-api-access-s48bl\") pod \"migrator-59844c95c7-cx8cr\" (UID: \"56a9a21e-28d2-4386-9fe0-947c3a39ab6a\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-cx8cr" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.751869 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/33169e52-3fee-462c-b341-46563ddbf5aa-serving-cert\") pod \"service-ca-operator-777779d784-bzk95\" (UID: \"33169e52-3fee-462c-b341-46563ddbf5aa\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-bzk95" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.752184 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/8814e7c8-5104-40f7-9761-4feedc15697b-stats-auth\") pod \"router-default-5444994796-h5q9f\" (UID: \"8814e7c8-5104-40f7-9761-4feedc15697b\") " pod="openshift-ingress/router-default-5444994796-h5q9f" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.752287 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/ffad13f5-fb20-46ac-b886-c7e5a29b6599-config\") pod \"kube-controller-manager-operator-78b949d7b-bpncg\" (UID: \"ffad13f5-fb20-46ac-b886-c7e5a29b6599\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bpncg" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.752314 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hprwd\" (UniqueName: \"kubernetes.io/projected/caaa7607-b47d-43ca-adff-f9135baf7262-kube-api-access-hprwd\") pod \"package-server-manager-789f6589d5-wq7bg\" (UID: \"caaa7607-b47d-43ca-adff-f9135baf7262\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-wq7bg" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.752344 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w46qw\" (UniqueName: \"kubernetes.io/projected/a3302e69-0f73-4974-a8ac-af1992933147-kube-api-access-w46qw\") pod \"machine-config-controller-84d6567774-llpv8\" (UID: \"a3302e69-0f73-4974-a8ac-af1992933147\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-llpv8" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.752495 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwh5d\" (UniqueName: \"kubernetes.io/projected/12b99fdc-6d61-46e1-b093-b1b92efce54c-kube-api-access-xwh5d\") pod \"openshift-controller-manager-operator-756b6f6bc6-8xgh7\" (UID: \"12b99fdc-6d61-46e1-b093-b1b92efce54c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8xgh7" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.752526 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e27a7ff4-a3a6-4654-99f7-6ec9f49c7c54-config\") pod \"kube-apiserver-operator-766d6c64bb-4qmgq\" (UID: \"e27a7ff4-a3a6-4654-99f7-6ec9f49c7c54\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4qmgq" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.752592 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jzrl\" (UniqueName: \"kubernetes.io/projected/4149606a-3fbc-4da9-ba05-dc473b492a89-kube-api-access-2jzrl\") pod \"service-ca-9c57cc56f-wqf9f\" (UID: \"4149606a-3fbc-4da9-ba05-dc473b492a89\") " pod="openshift-service-ca/service-ca-9c57cc56f-wqf9f" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.752623 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a3302e69-0f73-4974-a8ac-af1992933147-proxy-tls\") pod \"machine-config-controller-84d6567774-llpv8\" (UID: \"a3302e69-0f73-4974-a8ac-af1992933147\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-llpv8" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.752650 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ffad13f5-fb20-46ac-b886-c7e5a29b6599-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-bpncg\" (UID: \"ffad13f5-fb20-46ac-b886-c7e5a29b6599\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bpncg" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.752692 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67nvg\" 
(UniqueName: \"kubernetes.io/projected/f4d0ed50-aa9b-4a62-b340-882ddf73f008-kube-api-access-67nvg\") pod \"marketplace-operator-79b997595-ljds4\" (UID: \"f4d0ed50-aa9b-4a62-b340-882ddf73f008\") " pod="openshift-marketplace/marketplace-operator-79b997595-ljds4" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.752732 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/4149606a-3fbc-4da9-ba05-dc473b492a89-signing-key\") pod \"service-ca-9c57cc56f-wqf9f\" (UID: \"4149606a-3fbc-4da9-ba05-dc473b492a89\") " pod="openshift-service-ca/service-ca-9c57cc56f-wqf9f" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.752916 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33169e52-3fee-462c-b341-46563ddbf5aa-config\") pod \"service-ca-operator-777779d784-bzk95\" (UID: \"33169e52-3fee-462c-b341-46563ddbf5aa\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-bzk95" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.752969 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1e6408b2-d73d-4fc3-86f7-d29ea59ad32e-cert\") pod \"ingress-canary-hn5wx\" (UID: \"1e6408b2-d73d-4fc3-86f7-d29ea59ad32e\") " pod="openshift-ingress-canary/ingress-canary-hn5wx" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.753030 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0558933a-c8d6-45dc-aeaf-af86190b15a0-config-volume\") pod \"collect-profiles-29497500-p2gn7\" (UID: \"0558933a-c8d6-45dc-aeaf-af86190b15a0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497500-p2gn7" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.753557 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/89cec875-cd1f-4867-8b4b-ca72c57c974b-socket-dir\") pod \"csi-hostpathplugin-c7j22\" (UID: \"89cec875-cd1f-4867-8b4b-ca72c57c974b\") " pod="hostpath-provisioner/csi-hostpathplugin-c7j22" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.753919 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/89cec875-cd1f-4867-8b4b-ca72c57c974b-mountpoint-dir\") pod \"csi-hostpathplugin-c7j22\" (UID: \"89cec875-cd1f-4867-8b4b-ca72c57c974b\") " pod="hostpath-provisioner/csi-hostpathplugin-c7j22" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.754193 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/89cec875-cd1f-4867-8b4b-ca72c57c974b-registration-dir\") pod \"csi-hostpathplugin-c7j22\" (UID: \"89cec875-cd1f-4867-8b4b-ca72c57c974b\") " pod="hostpath-provisioner/csi-hostpathplugin-c7j22" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.754253 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12b99fdc-6d61-46e1-b093-b1b92efce54c-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-8xgh7\" (UID: \"12b99fdc-6d61-46e1-b093-b1b92efce54c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8xgh7" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.754675 4732 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eb80e0fc-6378-4d0d-a8b0-6c662073ed4d-config-volume\") pod \"dns-default-f78bs\" (UID: \"eb80e0fc-6378-4d0d-a8b0-6c662073ed4d\") " pod="openshift-dns/dns-default-f78bs" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.754751 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f4d0ed50-aa9b-4a62-b340-882ddf73f008-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-ljds4\" (UID: \"f4d0ed50-aa9b-4a62-b340-882ddf73f008\") " pod="openshift-marketplace/marketplace-operator-79b997595-ljds4" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.754920 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a3302e69-0f73-4974-a8ac-af1992933147-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-llpv8\" (UID: \"a3302e69-0f73-4974-a8ac-af1992933147\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-llpv8" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.755041 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffad13f5-fb20-46ac-b886-c7e5a29b6599-config\") pod \"kube-controller-manager-operator-78b949d7b-bpncg\" (UID: \"ffad13f5-fb20-46ac-b886-c7e5a29b6599\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bpncg" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.756191 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/058f5386-f340-4a52-bfc8-9b1a60515c9b-profile-collector-cert\") pod \"olm-operator-6b444d44fb-7fqsl\" (UID: \"058f5386-f340-4a52-bfc8-9b1a60515c9b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7fqsl" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.756341 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/89cec875-cd1f-4867-8b4b-ca72c57c974b-plugins-dir\") pod \"csi-hostpathplugin-c7j22\" (UID: \"89cec875-cd1f-4867-8b4b-ca72c57c974b\") " pod="hostpath-provisioner/csi-hostpathplugin-c7j22" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.757910 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/caaa7607-b47d-43ca-adff-f9135baf7262-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-wq7bg\" (UID: \"caaa7607-b47d-43ca-adff-f9135baf7262\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-wq7bg" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.757970 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8814e7c8-5104-40f7-9761-4feedc15697b-metrics-certs\") pod \"router-default-5444994796-h5q9f\" (UID: \"8814e7c8-5104-40f7-9761-4feedc15697b\") " pod="openshift-ingress/router-default-5444994796-h5q9f" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.758298 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/eb80e0fc-6378-4d0d-a8b0-6c662073ed4d-metrics-tls\") pod \"dns-default-f78bs\" (UID: \"eb80e0fc-6378-4d0d-a8b0-6c662073ed4d\") " 
pod="openshift-dns/dns-default-f78bs" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.758354 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/d64c27a7-b418-450e-9067-dde0cd145597-node-bootstrap-token\") pod \"machine-config-server-dmhxf\" (UID: \"d64c27a7-b418-450e-9067-dde0cd145597\") " pod="openshift-machine-config-operator/machine-config-server-dmhxf" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.759437 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a3302e69-0f73-4974-a8ac-af1992933147-proxy-tls\") pod \"machine-config-controller-84d6567774-llpv8\" (UID: \"a3302e69-0f73-4974-a8ac-af1992933147\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-llpv8" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.760250 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6a680afa-dc56-4bf8-808c-b1c947c8fbf0-webhook-cert\") pod \"packageserver-d55dfcdfc-6fwnh\" (UID: \"6a680afa-dc56-4bf8-808c-b1c947c8fbf0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6fwnh" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.760276 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f4d0ed50-aa9b-4a62-b340-882ddf73f008-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-ljds4\" (UID: \"f4d0ed50-aa9b-4a62-b340-882ddf73f008\") " pod="openshift-marketplace/marketplace-operator-79b997595-ljds4" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.761052 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/8814e7c8-5104-40f7-9761-4feedc15697b-default-certificate\") pod \"router-default-5444994796-h5q9f\" (UID: \"8814e7c8-5104-40f7-9761-4feedc15697b\") " pod="openshift-ingress/router-default-5444994796-h5q9f" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.761790 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7c0ada8b-e2dc-418c-a43e-33789285388f-auth-proxy-config\") pod \"machine-config-operator-74547568cd-4sk2n\" (UID: \"7c0ada8b-e2dc-418c-a43e-33789285388f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4sk2n" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.761814 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33169e52-3fee-462c-b341-46563ddbf5aa-config\") pod \"service-ca-operator-777779d784-bzk95\" (UID: \"33169e52-3fee-462c-b341-46563ddbf5aa\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-bzk95" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.761874 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e27a7ff4-a3a6-4654-99f7-6ec9f49c7c54-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-4qmgq\" (UID: \"e27a7ff4-a3a6-4654-99f7-6ec9f49c7c54\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4qmgq" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.762026 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: 
\"kubernetes.io/secret/058f5386-f340-4a52-bfc8-9b1a60515c9b-srv-cert\") pod \"olm-operator-6b444d44fb-7fqsl\" (UID: \"058f5386-f340-4a52-bfc8-9b1a60515c9b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7fqsl" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.762613 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8814e7c8-5104-40f7-9761-4feedc15697b-service-ca-bundle\") pod \"router-default-5444994796-h5q9f\" (UID: \"8814e7c8-5104-40f7-9761-4feedc15697b\") " pod="openshift-ingress/router-default-5444994796-h5q9f" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.762710 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/8814e7c8-5104-40f7-9761-4feedc15697b-stats-auth\") pod \"router-default-5444994796-h5q9f\" (UID: \"8814e7c8-5104-40f7-9761-4feedc15697b\") " pod="openshift-ingress/router-default-5444994796-h5q9f" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.762940 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0558933a-c8d6-45dc-aeaf-af86190b15a0-secret-volume\") pod \"collect-profiles-29497500-p2gn7\" (UID: \"0558933a-c8d6-45dc-aeaf-af86190b15a0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497500-p2gn7" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.762939 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/498d64fc-0d0f-43c6-aaae-bd3c5f0d7873-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-v69mc\" (UID: \"498d64fc-0d0f-43c6-aaae-bd3c5f0d7873\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v69mc" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.763014 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0558933a-c8d6-45dc-aeaf-af86190b15a0-config-volume\") pod \"collect-profiles-29497500-p2gn7\" (UID: \"0558933a-c8d6-45dc-aeaf-af86190b15a0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497500-p2gn7" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.763114 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/1d02584d-db7d-4bc0-8cd9-33081993309b-profile-collector-cert\") pod \"catalog-operator-68c6474976-wb8wq\" (UID: \"1d02584d-db7d-4bc0-8cd9-33081993309b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wb8wq" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.763136 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e27a7ff4-a3a6-4654-99f7-6ec9f49c7c54-config\") pod \"kube-apiserver-operator-766d6c64bb-4qmgq\" (UID: \"e27a7ff4-a3a6-4654-99f7-6ec9f49c7c54\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4qmgq" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.763456 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/6a680afa-dc56-4bf8-808c-b1c947c8fbf0-tmpfs\") pod \"packageserver-d55dfcdfc-6fwnh\" (UID: \"6a680afa-dc56-4bf8-808c-b1c947c8fbf0\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6fwnh" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.763561 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/d64c27a7-b418-450e-9067-dde0cd145597-certs\") pod \"machine-config-server-dmhxf\" (UID: \"d64c27a7-b418-450e-9067-dde0cd145597\") " pod="openshift-machine-config-operator/machine-config-server-dmhxf" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.763881 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ffad13f5-fb20-46ac-b886-c7e5a29b6599-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-bpncg\" (UID: \"ffad13f5-fb20-46ac-b886-c7e5a29b6599\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bpncg" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.764119 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1e6408b2-d73d-4fc3-86f7-d29ea59ad32e-cert\") pod \"ingress-canary-hn5wx\" (UID: \"1e6408b2-d73d-4fc3-86f7-d29ea59ad32e\") " pod="openshift-ingress-canary/ingress-canary-hn5wx" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.764171 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1d02584d-db7d-4bc0-8cd9-33081993309b-srv-cert\") pod \"catalog-operator-68c6474976-wb8wq\" (UID: \"1d02584d-db7d-4bc0-8cd9-33081993309b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wb8wq" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.764625 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7c0ada8b-e2dc-418c-a43e-33789285388f-images\") pod \"machine-config-operator-74547568cd-4sk2n\" (UID: \"7c0ada8b-e2dc-418c-a43e-33789285388f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4sk2n" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.764766 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/4149606a-3fbc-4da9-ba05-dc473b492a89-signing-cabundle\") pod \"service-ca-9c57cc56f-wqf9f\" (UID: \"4149606a-3fbc-4da9-ba05-dc473b492a89\") " pod="openshift-service-ca/service-ca-9c57cc56f-wqf9f" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.765150 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6a680afa-dc56-4bf8-808c-b1c947c8fbf0-apiservice-cert\") pod \"packageserver-d55dfcdfc-6fwnh\" (UID: \"6a680afa-dc56-4bf8-808c-b1c947c8fbf0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6fwnh" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.765285 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2325b276-ee4d-438d-b9d6-d7de3024ba96-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-kgqfp\" (UID: \"2325b276-ee4d-438d-b9d6-d7de3024ba96\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-kgqfp" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.765631 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12b99fdc-6d61-46e1-b093-b1b92efce54c-serving-cert\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-8xgh7\" (UID: \"12b99fdc-6d61-46e1-b093-b1b92efce54c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8xgh7" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.765729 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/33169e52-3fee-462c-b341-46563ddbf5aa-serving-cert\") pod \"service-ca-operator-777779d784-bzk95\" (UID: \"33169e52-3fee-462c-b341-46563ddbf5aa\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-bzk95" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.766615 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/4149606a-3fbc-4da9-ba05-dc473b492a89-signing-key\") pod \"service-ca-9c57cc56f-wqf9f\" (UID: \"4149606a-3fbc-4da9-ba05-dc473b492a89\") " pod="openshift-service-ca/service-ca-9c57cc56f-wqf9f" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.768399 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7c0ada8b-e2dc-418c-a43e-33789285388f-proxy-tls\") pod \"machine-config-operator-74547568cd-4sk2n\" (UID: \"7c0ada8b-e2dc-418c-a43e-33789285388f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4sk2n" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.795366 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rb72h\" (UniqueName: \"kubernetes.io/projected/33169e52-3fee-462c-b341-46563ddbf5aa-kube-api-access-rb72h\") pod \"service-ca-operator-777779d784-bzk95\" (UID: \"33169e52-3fee-462c-b341-46563ddbf5aa\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-bzk95" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.816764 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llg68\" (UniqueName: \"kubernetes.io/projected/1d02584d-db7d-4bc0-8cd9-33081993309b-kube-api-access-llg68\") pod \"catalog-operator-68c6474976-wb8wq\" (UID: \"1d02584d-db7d-4bc0-8cd9-33081993309b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wb8wq" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.838697 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzgzf\" (UniqueName: \"kubernetes.io/projected/eb80e0fc-6378-4d0d-a8b0-6c662073ed4d-kube-api-access-wzgzf\") pod \"dns-default-f78bs\" (UID: \"eb80e0fc-6378-4d0d-a8b0-6c662073ed4d\") " pod="openshift-dns/dns-default-f78bs" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.844432 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-8t8ks"] Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.852005 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-f78bs" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.853840 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:03:25 crc kubenswrapper[4732]: E0131 09:03:25.854183 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:03:26.353969758 +0000 UTC m=+144.659845962 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.854560 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-99dtb\" (UID: \"4ac602fa-14af-4ae0-a538-d73e938db036\") " pod="openshift-image-registry/image-registry-697d97f7c8-99dtb" Jan 31 09:03:25 crc kubenswrapper[4732]: E0131 09:03:25.854935 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:03:26.35491771 +0000 UTC m=+144.660793904 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-99dtb" (UID: "4ac602fa-14af-4ae0-a538-d73e938db036") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.860965 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dh5t6\" (UniqueName: \"kubernetes.io/projected/89cec875-cd1f-4867-8b4b-ca72c57c974b-kube-api-access-dh5t6\") pod \"csi-hostpathplugin-c7j22\" (UID: \"89cec875-cd1f-4867-8b4b-ca72c57c974b\") " pod="hostpath-provisioner/csi-hostpathplugin-c7j22" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.863744 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-c8t6l"] Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.874300 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-vpnm5"] Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.876315 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-fsss9"] Jan 31 09:03:25 crc kubenswrapper[4732]: W0131 09:03:25.886982 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8cc29c02_baeb_4f46_92d6_684343509ae1.slice/crio-028c79adb4156659acd73bab76af06e2c4665808280d16be70d725ea649b82b3 WatchSource:0}: Error finding container 028c79adb4156659acd73bab76af06e2c4665808280d16be70d725ea649b82b3: Status 404 returned error can't find the container with id 028c79adb4156659acd73bab76af06e2c4665808280d16be70d725ea649b82b3 Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.889096 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dm54m\" (UniqueName: \"kubernetes.io/projected/058f5386-f340-4a52-bfc8-9b1a60515c9b-kube-api-access-dm54m\") pod \"olm-operator-6b444d44fb-7fqsl\" (UID: \"058f5386-f340-4a52-bfc8-9b1a60515c9b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7fqsl" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.896274 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knh8t\" (UniqueName: \"kubernetes.io/projected/8814e7c8-5104-40f7-9761-4feedc15697b-kube-api-access-knh8t\") pod \"router-default-5444994796-h5q9f\" (UID: \"8814e7c8-5104-40f7-9761-4feedc15697b\") " pod="openshift-ingress/router-default-5444994796-h5q9f" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.927305 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6h7wn\" (UniqueName: \"kubernetes.io/projected/2325b276-ee4d-438d-b9d6-d7de3024ba96-kube-api-access-6h7wn\") pod \"multus-admission-controller-857f4d67dd-kgqfp\" (UID: \"2325b276-ee4d-438d-b9d6-d7de3024ba96\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-kgqfp" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.947687 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-44wvz"] Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.951274 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/e27a7ff4-a3a6-4654-99f7-6ec9f49c7c54-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-4qmgq\" (UID: \"e27a7ff4-a3a6-4654-99f7-6ec9f49c7c54\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4qmgq" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.956618 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:03:25 crc kubenswrapper[4732]: E0131 09:03:25.957053 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:03:26.457024786 +0000 UTC m=+144.762900990 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.963166 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ffad13f5-fb20-46ac-b886-c7e5a29b6599-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-bpncg\" (UID: \"ffad13f5-fb20-46ac-b886-c7e5a29b6599\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bpncg" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.982145 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zcd2h" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.985792 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jzrl\" (UniqueName: \"kubernetes.io/projected/4149606a-3fbc-4da9-ba05-dc473b492a89-kube-api-access-2jzrl\") pod \"service-ca-9c57cc56f-wqf9f\" (UID: \"4149606a-3fbc-4da9-ba05-dc473b492a89\") " pod="openshift-service-ca/service-ca-9c57cc56f-wqf9f" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.996399 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-h5q9f" Jan 31 09:03:25 crc kubenswrapper[4732]: I0131 09:03:25.998415 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67nvg\" (UniqueName: \"kubernetes.io/projected/f4d0ed50-aa9b-4a62-b340-882ddf73f008-kube-api-access-67nvg\") pod \"marketplace-operator-79b997595-ljds4\" (UID: \"f4d0ed50-aa9b-4a62-b340-882ddf73f008\") " pod="openshift-marketplace/marketplace-operator-79b997595-ljds4" Jan 31 09:03:26 crc kubenswrapper[4732]: I0131 09:03:26.011155 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7fqsl" Jan 31 09:03:26 crc kubenswrapper[4732]: I0131 09:03:26.020499 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtvhl\" (UniqueName: \"kubernetes.io/projected/d64c27a7-b418-450e-9067-dde0cd145597-kube-api-access-xtvhl\") pod \"machine-config-server-dmhxf\" (UID: \"d64c27a7-b418-450e-9067-dde0cd145597\") " pod="openshift-machine-config-operator/machine-config-server-dmhxf" Jan 31 09:03:26 crc kubenswrapper[4732]: W0131 09:03:26.029502 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8814e7c8_5104_40f7_9761_4feedc15697b.slice/crio-810a7219fbe0602d94b66784880ef250eec1b53ec5dbf8b9e30ef55ffb36b889 WatchSource:0}: Error finding container 810a7219fbe0602d94b66784880ef250eec1b53ec5dbf8b9e30ef55ffb36b889: Status 404 returned error can't find the container with id 810a7219fbe0602d94b66784880ef250eec1b53ec5dbf8b9e30ef55ffb36b889 Jan 31 09:03:26 crc kubenswrapper[4732]: I0131 09:03:26.035017 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-kgqfp" Jan 31 09:03:26 crc kubenswrapper[4732]: I0131 09:03:26.046568 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hprwd\" (UniqueName: \"kubernetes.io/projected/caaa7607-b47d-43ca-adff-f9135baf7262-kube-api-access-hprwd\") pod \"package-server-manager-789f6589d5-wq7bg\" (UID: \"caaa7607-b47d-43ca-adff-f9135baf7262\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-wq7bg" Jan 31 09:03:26 crc kubenswrapper[4732]: I0131 09:03:26.058548 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wb8wq" Jan 31 09:03:26 crc kubenswrapper[4732]: I0131 09:03:26.060102 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-99dtb\" (UID: \"4ac602fa-14af-4ae0-a538-d73e938db036\") " pod="openshift-image-registry/image-registry-697d97f7c8-99dtb" Jan 31 09:03:26 crc kubenswrapper[4732]: E0131 09:03:26.060699 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:03:26.560653933 +0000 UTC m=+144.866530137 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-99dtb" (UID: "4ac602fa-14af-4ae0-a538-d73e938db036") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:26 crc kubenswrapper[4732]: I0131 09:03:26.064351 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-wqf9f" Jan 31 09:03:26 crc kubenswrapper[4732]: I0131 09:03:26.067109 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w46qw\" (UniqueName: \"kubernetes.io/projected/a3302e69-0f73-4974-a8ac-af1992933147-kube-api-access-w46qw\") pod \"machine-config-controller-84d6567774-llpv8\" (UID: \"a3302e69-0f73-4974-a8ac-af1992933147\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-llpv8" Jan 31 09:03:26 crc kubenswrapper[4732]: I0131 09:03:26.069731 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-wq7bg" Jan 31 09:03:26 crc kubenswrapper[4732]: I0131 09:03:26.080397 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-bzk95" Jan 31 09:03:26 crc kubenswrapper[4732]: I0131 09:03:26.081255 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwh5d\" (UniqueName: \"kubernetes.io/projected/12b99fdc-6d61-46e1-b093-b1b92efce54c-kube-api-access-xwh5d\") pod \"openshift-controller-manager-operator-756b6f6bc6-8xgh7\" (UID: \"12b99fdc-6d61-46e1-b093-b1b92efce54c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8xgh7" Jan 31 09:03:26 crc kubenswrapper[4732]: I0131 09:03:26.082832 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-f78bs"] Jan 31 09:03:26 crc kubenswrapper[4732]: I0131 09:03:26.085062 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-ljds4" Jan 31 09:03:26 crc kubenswrapper[4732]: I0131 09:03:26.095482 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bpncg" Jan 31 09:03:26 crc kubenswrapper[4732]: I0131 09:03:26.105657 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j96n4\" (UniqueName: \"kubernetes.io/projected/498d64fc-0d0f-43c6-aaae-bd3c5f0d7873-kube-api-access-j96n4\") pod \"control-plane-machine-set-operator-78cbb6b69f-v69mc\" (UID: \"498d64fc-0d0f-43c6-aaae-bd3c5f0d7873\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v69mc" Jan 31 09:03:26 crc kubenswrapper[4732]: I0131 09:03:26.115191 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4qmgq" Jan 31 09:03:26 crc kubenswrapper[4732]: I0131 09:03:26.121931 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwqdf\" (UniqueName: \"kubernetes.io/projected/1e6408b2-d73d-4fc3-86f7-d29ea59ad32e-kube-api-access-rwqdf\") pod \"ingress-canary-hn5wx\" (UID: \"1e6408b2-d73d-4fc3-86f7-d29ea59ad32e\") " pod="openshift-ingress-canary/ingress-canary-hn5wx" Jan 31 09:03:26 crc kubenswrapper[4732]: I0131 09:03:26.140356 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-c7j22" Jan 31 09:03:26 crc kubenswrapper[4732]: I0131 09:03:26.157344 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hn2dm\" (UniqueName: \"kubernetes.io/projected/6a680afa-dc56-4bf8-808c-b1c947c8fbf0-kube-api-access-hn2dm\") pod \"packageserver-d55dfcdfc-6fwnh\" (UID: \"6a680afa-dc56-4bf8-808c-b1c947c8fbf0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6fwnh" Jan 31 09:03:26 crc kubenswrapper[4732]: I0131 09:03:26.159808 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-dmhxf" Jan 31 09:03:26 crc kubenswrapper[4732]: I0131 09:03:26.160721 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txg4c\" (UniqueName: \"kubernetes.io/projected/0558933a-c8d6-45dc-aeaf-af86190b15a0-kube-api-access-txg4c\") pod \"collect-profiles-29497500-p2gn7\" (UID: \"0558933a-c8d6-45dc-aeaf-af86190b15a0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497500-p2gn7" Jan 31 09:03:26 crc kubenswrapper[4732]: I0131 09:03:26.161081 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:03:26 crc kubenswrapper[4732]: E0131 09:03:26.161358 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:03:26.661329581 +0000 UTC m=+144.967205785 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:26 crc kubenswrapper[4732]: I0131 09:03:26.161893 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-99dtb\" (UID: \"4ac602fa-14af-4ae0-a538-d73e938db036\") " pod="openshift-image-registry/image-registry-697d97f7c8-99dtb" Jan 31 09:03:26 crc kubenswrapper[4732]: E0131 09:03:26.162554 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:03:26.662539742 +0000 UTC m=+144.968415946 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-99dtb" (UID: "4ac602fa-14af-4ae0-a538-d73e938db036") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:26 crc kubenswrapper[4732]: I0131 09:03:26.178708 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s48bl\" (UniqueName: \"kubernetes.io/projected/56a9a21e-28d2-4386-9fe0-947c3a39ab6a-kube-api-access-s48bl\") pod \"migrator-59844c95c7-cx8cr\" (UID: \"56a9a21e-28d2-4386-9fe0-947c3a39ab6a\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-cx8cr" Jan 31 09:03:26 crc kubenswrapper[4732]: I0131 09:03:26.196906 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8xgh7" Jan 31 09:03:26 crc kubenswrapper[4732]: I0131 09:03:26.217429 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8chpb\" (UniqueName: \"kubernetes.io/projected/7c0ada8b-e2dc-418c-a43e-33789285388f-kube-api-access-8chpb\") pod \"machine-config-operator-74547568cd-4sk2n\" (UID: \"7c0ada8b-e2dc-418c-a43e-33789285388f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4sk2n" Jan 31 09:03:26 crc kubenswrapper[4732]: I0131 09:03:26.265396 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:03:26 crc kubenswrapper[4732]: E0131 09:03:26.265823 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:03:26.765801097 +0000 UTC m=+145.071677301 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:26 crc kubenswrapper[4732]: I0131 09:03:26.303423 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v69mc" Jan 31 09:03:26 crc kubenswrapper[4732]: I0131 09:03:26.305519 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zcd2h"] Jan 31 09:03:26 crc kubenswrapper[4732]: I0131 09:03:26.318310 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-cx8cr" Jan 31 09:03:26 crc kubenswrapper[4732]: I0131 09:03:26.326203 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4sk2n" Jan 31 09:03:26 crc kubenswrapper[4732]: I0131 09:03:26.344863 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-llpv8" Jan 31 09:03:26 crc kubenswrapper[4732]: I0131 09:03:26.350537 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6fwnh" Jan 31 09:03:26 crc kubenswrapper[4732]: I0131 09:03:26.369806 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-99dtb\" (UID: \"4ac602fa-14af-4ae0-a538-d73e938db036\") " pod="openshift-image-registry/image-registry-697d97f7c8-99dtb" Jan 31 09:03:26 crc kubenswrapper[4732]: I0131 09:03:26.371632 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7fqsl"] Jan 31 09:03:26 crc kubenswrapper[4732]: I0131 09:03:26.380076 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dv6hv" event={"ID":"cba3ef6a-8439-4317-bae9-01618d78512a","Type":"ContainerStarted","Data":"9c7588f616c64cc2c59120e413176a76c8f1894b15043d1fcba7f8966cddaa81"} Jan 31 09:03:26 crc kubenswrapper[4732]: I0131 09:03:26.380147 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dv6hv" event={"ID":"cba3ef6a-8439-4317-bae9-01618d78512a","Type":"ContainerStarted","Data":"47909dae9b68f113cd9b76b94a15367d6632b514c137e8042997f03d94d0f645"} Jan 31 09:03:26 crc kubenswrapper[4732]: E0131 09:03:26.380904 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:03:26.88088602 +0000 UTC m=+145.186762224 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-99dtb" (UID: "4ac602fa-14af-4ae0-a538-d73e938db036") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:26 crc kubenswrapper[4732]: I0131 09:03:26.400516 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29497500-p2gn7" Jan 31 09:03:26 crc kubenswrapper[4732]: I0131 09:03:26.408370 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-hn5wx" Jan 31 09:03:26 crc kubenswrapper[4732]: I0131 09:03:26.428948 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-8t8ks" event={"ID":"b35d0df8-53f0-4787-b0b4-c93be28f0127","Type":"ContainerStarted","Data":"4a39f64f86fe6cb1367f5507d76711de9f59d5b40811435a28504bbb19c0815e"} Jan 31 09:03:26 crc kubenswrapper[4732]: I0131 09:03:26.428999 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-8t8ks" event={"ID":"b35d0df8-53f0-4787-b0b4-c93be28f0127","Type":"ContainerStarted","Data":"cc5cb0ee220b6ce5469a8d59a01d7b091efed469d2881653bd6be64789fc28e5"} Jan 31 09:03:26 crc kubenswrapper[4732]: I0131 09:03:26.468872 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-r9cz8" event={"ID":"16c9233d-0b27-4994-bc3d-62d4ec86a4ec","Type":"ContainerStarted","Data":"2cfe53f46628fdc8be7a5a78124aacf0f46a3c4f4093665f812038778db568cc"} Jan 31 09:03:26 crc kubenswrapper[4732]: I0131 09:03:26.470572 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:03:26 crc kubenswrapper[4732]: E0131 09:03:26.471517 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:03:26.971493588 +0000 UTC m=+145.277369792 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:26 crc kubenswrapper[4732]: I0131 09:03:26.490124 4732 generic.go:334] "Generic (PLEG): container finished" podID="576b5a44-3c4c-4905-8d89-caed3b1eb43f" containerID="5eb2173f2f321ecef00d9ddbca74e917d0284c4b6d0c5c9a6a07b3f57416a7ef" exitCode=0 Jan 31 09:03:26 crc kubenswrapper[4732]: I0131 09:03:26.490232 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hzs92" event={"ID":"576b5a44-3c4c-4905-8d89-caed3b1eb43f","Type":"ContainerDied","Data":"5eb2173f2f321ecef00d9ddbca74e917d0284c4b6d0c5c9a6a07b3f57416a7ef"} Jan 31 09:03:26 crc kubenswrapper[4732]: I0131 09:03:26.496919 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-44wvz" event={"ID":"64e4fed2-f31d-4d3f-ad77-d55fdddacb4d","Type":"ContainerStarted","Data":"2c61b8b5174b73d10a499a6df7d2119e9ef1c1363920d5bf989b781c1adeee7f"} Jan 31 09:03:26 crc kubenswrapper[4732]: I0131 09:03:26.504053 4732 generic.go:334] "Generic (PLEG): container finished" podID="81b523ca-b564-45d4-bad5-f7e236f2e6d0" containerID="859dd7d18198cae3e59b1016af2e0383557984484498ef61411a009cd59679f4" exitCode=0 Jan 31 09:03:26 crc kubenswrapper[4732]: I0131 09:03:26.504173 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-xprfh" event={"ID":"81b523ca-b564-45d4-bad5-f7e236f2e6d0","Type":"ContainerDied","Data":"859dd7d18198cae3e59b1016af2e0383557984484498ef61411a009cd59679f4"} Jan 31 09:03:26 crc kubenswrapper[4732]: I0131 09:03:26.515547 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-bzk95"] Jan 31 09:03:26 crc kubenswrapper[4732]: I0131 09:03:26.522919 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sf8wd" event={"ID":"62bbd0d6-eba8-4737-9608-3f7a3dd6a157","Type":"ContainerStarted","Data":"874ed60b00db00813c1358ba0365c34806d02993922ccb884b227b0615d530da"} Jan 31 09:03:26 crc kubenswrapper[4732]: W0131 09:03:26.534020 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod058f5386_f340_4a52_bfc8_9b1a60515c9b.slice/crio-6fd5aabbe2ed81a7a78020aa8ca21834975dae45d93b7e0645ab08ab12f1815f WatchSource:0}: Error finding container 6fd5aabbe2ed81a7a78020aa8ca21834975dae45d93b7e0645ab08ab12f1815f: Status 404 returned error can't find the container with id 6fd5aabbe2ed81a7a78020aa8ca21834975dae45d93b7e0645ab08ab12f1815f Jan 31 09:03:26 crc kubenswrapper[4732]: I0131 09:03:26.534875 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ncqs4" podStartSLOduration=119.534852131 podStartE2EDuration="1m59.534852131s" podCreationTimestamp="2026-01-31 09:01:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:03:26.534406446 +0000 UTC 
m=+144.840282660" watchObservedRunningTime="2026-01-31 09:03:26.534852131 +0000 UTC m=+144.840728335" Jan 31 09:03:26 crc kubenswrapper[4732]: I0131 09:03:26.576604 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-99dtb\" (UID: \"4ac602fa-14af-4ae0-a538-d73e938db036\") " pod="openshift-image-registry/image-registry-697d97f7c8-99dtb" Jan 31 09:03:26 crc kubenswrapper[4732]: E0131 09:03:26.577521 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:03:27.077503427 +0000 UTC m=+145.383379631 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-99dtb" (UID: "4ac602fa-14af-4ae0-a538-d73e938db036") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:26 crc kubenswrapper[4732]: I0131 09:03:26.590280 4732 generic.go:334] "Generic (PLEG): container finished" podID="6377a401-b10b-455a-8906-f6706302b91f" containerID="75d4dfe8017617c8bc39777fe7f724ffac557d97a7569ca193cf719ee8c8ddf5" exitCode=0 Jan 31 09:03:26 crc kubenswrapper[4732]: I0131 09:03:26.595468 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-c8t6l" event={"ID":"c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20","Type":"ContainerStarted","Data":"bf0aacb740607afdcd33e43432dcaec43c8aa3d7707aec7cab5cbf845309020a"} Jan 31 09:03:26 crc kubenswrapper[4732]: I0131 09:03:26.595535 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85mvk" event={"ID":"6377a401-b10b-455a-8906-f6706302b91f","Type":"ContainerDied","Data":"75d4dfe8017617c8bc39777fe7f724ffac557d97a7569ca193cf719ee8c8ddf5"} Jan 31 09:03:26 crc kubenswrapper[4732]: I0131 09:03:26.595557 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-kgqfp"] Jan 31 09:03:26 crc kubenswrapper[4732]: I0131 09:03:26.639012 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dv6hv" podStartSLOduration=119.638991616 podStartE2EDuration="1m59.638991616s" podCreationTimestamp="2026-01-31 09:01:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:03:26.638151477 +0000 UTC m=+144.944027691" watchObservedRunningTime="2026-01-31 09:03:26.638991616 +0000 UTC m=+144.944867820" Jan 31 09:03:26 crc kubenswrapper[4732]: I0131 09:03:26.680216 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vkrgj" event={"ID":"edb14eaf-7738-4139-9b1b-9557e7e37ffc","Type":"ContainerStarted","Data":"e184ef399f504c65a18a1317d6c06296c3275341759143fd1ecc5c11e6dab7ea"} Jan 31 09:03:26 crc kubenswrapper[4732]: I0131 09:03:26.680289 4732 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vkrgj" event={"ID":"edb14eaf-7738-4139-9b1b-9557e7e37ffc","Type":"ContainerStarted","Data":"6b2deccfb5f751a56925281b353e90624d7859e638863238b0271bd0f7231b87"} Jan 31 09:03:26 crc kubenswrapper[4732]: I0131 09:03:26.694748 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:03:26 crc kubenswrapper[4732]: E0131 09:03:26.694967 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:03:27.194926178 +0000 UTC m=+145.500802392 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:26 crc kubenswrapper[4732]: I0131 09:03:26.702086 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-99dtb\" (UID: \"4ac602fa-14af-4ae0-a538-d73e938db036\") " pod="openshift-image-registry/image-registry-697d97f7c8-99dtb" Jan 31 09:03:26 crc kubenswrapper[4732]: I0131 09:03:26.706399 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-f78bs" event={"ID":"eb80e0fc-6378-4d0d-a8b0-6c662073ed4d","Type":"ContainerStarted","Data":"ecab75dc9cd7bd70c1ab9df8ab84ea76f66b4003ed17c9e90bec058923db5a0b"} Jan 31 09:03:26 crc kubenswrapper[4732]: E0131 09:03:26.707998 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:03:27.207976327 +0000 UTC m=+145.513852591 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-99dtb" (UID: "4ac602fa-14af-4ae0-a538-d73e938db036") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:26 crc kubenswrapper[4732]: I0131 09:03:26.711019 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5s6dp" event={"ID":"541ea3c2-891c-4c3e-81fd-9d340112c62b","Type":"ContainerStarted","Data":"e821c09a3ccb6a40c23e0f74bb9209a294da276e26d710a28decd86bc0d3c274"} Jan 31 09:03:26 crc kubenswrapper[4732]: I0131 09:03:26.711987 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5s6dp" Jan 31 09:03:26 crc kubenswrapper[4732]: I0131 09:03:26.755070 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-vpnm5" event={"ID":"8cc29c02-baeb-4f46-92d6-684343509ae1","Type":"ContainerStarted","Data":"028c79adb4156659acd73bab76af06e2c4665808280d16be70d725ea649b82b3"} Jan 31 09:03:26 crc kubenswrapper[4732]: I0131 09:03:26.758118 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-fsss9" event={"ID":"639dacb9-2ea3-49d2-b5c4-996992c8e16a","Type":"ContainerStarted","Data":"fb874cab7660d02215feda5562a2f58f952b7ed4ab3d6f1a75632fb183acbe30"} Jan 31 09:03:26 crc kubenswrapper[4732]: I0131 09:03:26.774165 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-54nxd" event={"ID":"e9ce5ab1-4da9-47a4-83b9-3c7d8af6d55a","Type":"ContainerStarted","Data":"6eca15ab7357874e4edb05c74f358f8f33b6cebbec102a9396211388e37d5da1"} Jan 31 09:03:26 crc kubenswrapper[4732]: I0131 09:03:26.782780 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-h5q9f" event={"ID":"8814e7c8-5104-40f7-9761-4feedc15697b","Type":"ContainerStarted","Data":"810a7219fbe0602d94b66784880ef250eec1b53ec5dbf8b9e30ef55ffb36b889"} Jan 31 09:03:26 crc kubenswrapper[4732]: I0131 09:03:26.787704 4732 patch_prober.go:28] interesting pod/downloads-7954f5f757-rt2jr container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.28:8080/\": dial tcp 10.217.0.28:8080: connect: connection refused" start-of-body= Jan 31 09:03:26 crc kubenswrapper[4732]: I0131 09:03:26.788036 4732 patch_prober.go:28] interesting pod/console-operator-58897d9998-76d6v container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Jan 31 09:03:26 crc kubenswrapper[4732]: I0131 09:03:26.788079 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-76d6v" podUID="499830ff-8add-4caf-b469-d1cbde569fb7" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" Jan 31 09:03:26 crc kubenswrapper[4732]: I0131 09:03:26.788166 4732 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-console/downloads-7954f5f757-rt2jr" podUID="81e1781e-a935-4f3f-b2aa-9a0807f43c73" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.28:8080/\": dial tcp 10.217.0.28:8080: connect: connection refused" Jan 31 09:03:26 crc kubenswrapper[4732]: I0131 09:03:26.787702 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-tg4xc" event={"ID":"219a04b6-e7bd-4138-bcc7-4f650537aa24","Type":"ContainerStarted","Data":"45d9723ae1f9063bad99b0cd5b3c8cb9365e80583327ef29b13ddfa585f2a481"} Jan 31 09:03:26 crc kubenswrapper[4732]: I0131 09:03:26.788216 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-tg4xc" Jan 31 09:03:26 crc kubenswrapper[4732]: I0131 09:03:26.792645 4732 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-tg4xc container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Jan 31 09:03:26 crc kubenswrapper[4732]: I0131 09:03:26.792704 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-tg4xc" podUID="219a04b6-e7bd-4138-bcc7-4f650537aa24" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Jan 31 09:03:26 crc kubenswrapper[4732]: I0131 09:03:26.810199 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:03:26 crc kubenswrapper[4732]: E0131 09:03:26.811790 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:03:27.31176417 +0000 UTC m=+145.617640374 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:26 crc kubenswrapper[4732]: I0131 09:03:26.898766 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-rt2jr" podStartSLOduration=119.898742207 podStartE2EDuration="1m59.898742207s" podCreationTimestamp="2026-01-31 09:01:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:03:26.898681465 +0000 UTC m=+145.204557669" watchObservedRunningTime="2026-01-31 09:03:26.898742207 +0000 UTC m=+145.204618411" Jan 31 09:03:26 crc kubenswrapper[4732]: I0131 09:03:26.912643 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-99dtb\" (UID: \"4ac602fa-14af-4ae0-a538-d73e938db036\") " pod="openshift-image-registry/image-registry-697d97f7c8-99dtb" Jan 31 09:03:26 crc kubenswrapper[4732]: E0131 09:03:26.916388 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:03:27.41636648 +0000 UTC m=+145.722242734 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-99dtb" (UID: "4ac602fa-14af-4ae0-a538-d73e938db036") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:26 crc kubenswrapper[4732]: I0131 09:03:26.936732 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wb8wq"] Jan 31 09:03:27 crc kubenswrapper[4732]: I0131 09:03:27.001879 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-h5q9f" Jan 31 09:03:27 crc kubenswrapper[4732]: I0131 09:03:27.003828 4732 patch_prober.go:28] interesting pod/router-default-5444994796-h5q9f container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Jan 31 09:03:27 crc kubenswrapper[4732]: I0131 09:03:27.003884 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h5q9f" podUID="8814e7c8-5104-40f7-9761-4feedc15697b" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Jan 31 09:03:27 crc kubenswrapper[4732]: I0131 09:03:27.015453 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:03:27 crc kubenswrapper[4732]: E0131 09:03:27.016223 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:03:27.516199429 +0000 UTC m=+145.822075623 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:27 crc kubenswrapper[4732]: I0131 09:03:27.123747 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-99dtb\" (UID: \"4ac602fa-14af-4ae0-a538-d73e938db036\") " pod="openshift-image-registry/image-registry-697d97f7c8-99dtb" Jan 31 09:03:27 crc kubenswrapper[4732]: E0131 09:03:27.124205 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-01-31 09:03:27.624191643 +0000 UTC m=+145.930067847 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-99dtb" (UID: "4ac602fa-14af-4ae0-a538-d73e938db036") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:27 crc kubenswrapper[4732]: I0131 09:03:27.125235 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ljds4"] Jan 31 09:03:27 crc kubenswrapper[4732]: I0131 09:03:27.128765 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-wqf9f"] Jan 31 09:03:27 crc kubenswrapper[4732]: I0131 09:03:27.165111 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-76d6v" podStartSLOduration=120.16508708 podStartE2EDuration="2m0.16508708s" podCreationTimestamp="2026-01-31 09:01:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:03:27.164962386 +0000 UTC m=+145.470838600" watchObservedRunningTime="2026-01-31 09:03:27.16508708 +0000 UTC m=+145.470963284" Jan 31 09:03:27 crc kubenswrapper[4732]: W0131 09:03:27.178406 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d02584d_db7d_4bc0_8cd9_33081993309b.slice/crio-26a7947af87c367b8b5fd8967811355f2031ef1361fc3f3dc4ec465f3ec37d25 WatchSource:0}: Error finding container 26a7947af87c367b8b5fd8967811355f2031ef1361fc3f3dc4ec465f3ec37d25: Status 404 returned error can't find the container with id 26a7947af87c367b8b5fd8967811355f2031ef1361fc3f3dc4ec465f3ec37d25 Jan 31 09:03:27 crc kubenswrapper[4732]: I0131 09:03:27.187720 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5s6dp" Jan 31 09:03:27 crc kubenswrapper[4732]: I0131 09:03:27.213355 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-r9cz8" podStartSLOduration=120.213334353 podStartE2EDuration="2m0.213334353s" podCreationTimestamp="2026-01-31 09:01:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:03:27.213205849 +0000 UTC m=+145.519082043" watchObservedRunningTime="2026-01-31 09:03:27.213334353 +0000 UTC m=+145.519210557" Jan 31 09:03:27 crc kubenswrapper[4732]: I0131 09:03:27.228518 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:03:27 crc kubenswrapper[4732]: E0131 09:03:27.228876 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" 
failed. No retries permitted until 2026-01-31 09:03:27.728856195 +0000 UTC m=+146.034732399 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:27 crc kubenswrapper[4732]: I0131 09:03:27.278992 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8xgh7"] Jan 31 09:03:27 crc kubenswrapper[4732]: I0131 09:03:27.300597 4732 csr.go:261] certificate signing request csr-jpx96 is approved, waiting to be issued Jan 31 09:03:27 crc kubenswrapper[4732]: I0131 09:03:27.336300 4732 csr.go:257] certificate signing request csr-jpx96 is issued Jan 31 09:03:27 crc kubenswrapper[4732]: I0131 09:03:27.338331 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-99dtb\" (UID: \"4ac602fa-14af-4ae0-a538-d73e938db036\") " pod="openshift-image-registry/image-registry-697d97f7c8-99dtb" Jan 31 09:03:27 crc kubenswrapper[4732]: E0131 09:03:27.338626 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:03:27.838612819 +0000 UTC m=+146.144489023 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-99dtb" (UID: "4ac602fa-14af-4ae0-a538-d73e938db036") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:27 crc kubenswrapper[4732]: I0131 09:03:27.438952 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:03:27 crc kubenswrapper[4732]: E0131 09:03:27.439166 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:03:27.939127302 +0000 UTC m=+146.245003506 (durationBeforeRetry 500ms). 
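
Each failed volume operation is parked by kubelet's nested pending operations with a delay before the next attempt; the 500ms printed as durationBeforeRetry is the initial delay, and the backoff machinery can grow the delay on repeated failures of the same operation (every attempt in this log happens to print the initial 500ms). A stdlib sketch of such a schedule, with the doubling factor and cap as illustrative assumptions rather than values read from this log:

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	delay := 500 * time.Millisecond // the logged durationBeforeRetry
	const factor = 2                // assumed growth factor
	maxDelay := 2 * time.Minute     // assumed cap
	for attempt := 1; attempt <= 8; attempt++ {
		fmt.Printf("attempt %d: no retries permitted for %v\n", attempt, delay)
		delay *= factor
		if delay > maxDelay {
			delay = maxDelay
		}
	}
}
```
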
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:27 crc kubenswrapper[4732]: I0131 09:03:27.439317 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-99dtb\" (UID: \"4ac602fa-14af-4ae0-a538-d73e938db036\") " pod="openshift-image-registry/image-registry-697d97f7c8-99dtb" Jan 31 09:03:27 crc kubenswrapper[4732]: E0131 09:03:27.439850 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:03:27.939841966 +0000 UTC m=+146.245718170 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-99dtb" (UID: "4ac602fa-14af-4ae0-a538-d73e938db036") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:27 crc kubenswrapper[4732]: W0131 09:03:27.481719 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4d0ed50_aa9b_4a62_b340_882ddf73f008.slice/crio-94900506e78a6c467e226df2c51bd0deeed7b70de0ce789b914b8672f3e90a32 WatchSource:0}: Error finding container 94900506e78a6c467e226df2c51bd0deeed7b70de0ce789b914b8672f3e90a32: Status 404 returned error can't find the container with id 94900506e78a6c467e226df2c51bd0deeed7b70de0ce789b914b8672f3e90a32 Jan 31 09:03:27 crc kubenswrapper[4732]: W0131 09:03:27.516748 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12b99fdc_6d61_46e1_b093_b1b92efce54c.slice/crio-d823b2f2259fe7403df9d95229c6812fa829c22821d540565c0441a8e495dd3f WatchSource:0}: Error finding container d823b2f2259fe7403df9d95229c6812fa829c22821d540565c0441a8e495dd3f: Status 404 returned error can't find the container with id d823b2f2259fe7403df9d95229c6812fa829c22821d540565c0441a8e495dd3f Jan 31 09:03:27 crc kubenswrapper[4732]: I0131 09:03:27.541177 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:03:27 crc kubenswrapper[4732]: E0131 09:03:27.541623 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-31 09:03:28.041590649 +0000 UTC m=+146.347466853 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:27 crc kubenswrapper[4732]: I0131 09:03:27.560217 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-8t8ks" podStartSLOduration=120.560195806 podStartE2EDuration="2m0.560195806s" podCreationTimestamp="2026-01-31 09:01:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:03:27.558039693 +0000 UTC m=+145.863915907" watchObservedRunningTime="2026-01-31 09:03:27.560195806 +0000 UTC m=+145.866072010" Jan 31 09:03:27 crc kubenswrapper[4732]: I0131 09:03:27.647128 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-99dtb\" (UID: \"4ac602fa-14af-4ae0-a538-d73e938db036\") " pod="openshift-image-registry/image-registry-697d97f7c8-99dtb" Jan 31 09:03:27 crc kubenswrapper[4732]: E0131 09:03:27.657273 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:03:28.157250432 +0000 UTC m=+146.463126626 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-99dtb" (UID: "4ac602fa-14af-4ae0-a538-d73e938db036") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:27 crc kubenswrapper[4732]: I0131 09:03:27.758175 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v69mc"] Jan 31 09:03:27 crc kubenswrapper[4732]: I0131 09:03:27.769044 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:03:27 crc kubenswrapper[4732]: E0131 09:03:27.770335 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:03:28.269643874 +0000 UTC m=+146.575520078 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:27 crc kubenswrapper[4732]: I0131 09:03:27.776680 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-wq7bg"] Jan 31 09:03:27 crc kubenswrapper[4732]: I0131 09:03:27.871416 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-99dtb\" (UID: \"4ac602fa-14af-4ae0-a538-d73e938db036\") " pod="openshift-image-registry/image-registry-697d97f7c8-99dtb" Jan 31 09:03:27 crc kubenswrapper[4732]: E0131 09:03:27.871850 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:03:28.371835473 +0000 UTC m=+146.677711677 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-99dtb" (UID: "4ac602fa-14af-4ae0-a538-d73e938db036") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:27 crc kubenswrapper[4732]: I0131 09:03:27.878710 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-vpnm5" event={"ID":"8cc29c02-baeb-4f46-92d6-684343509ae1","Type":"ContainerStarted","Data":"994f37a34067d24060075614a4201810c77b264e2851f8972bb66f0402c0b01b"} Jan 31 09:03:27 crc kubenswrapper[4732]: I0131 09:03:27.897769 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-pxn6w" podStartSLOduration=120.897751345 podStartE2EDuration="2m0.897751345s" podCreationTimestamp="2026-01-31 09:01:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:03:27.896052648 +0000 UTC m=+146.201928852" watchObservedRunningTime="2026-01-31 09:03:27.897751345 +0000 UTC m=+146.203627539" Jan 31 09:03:27 crc kubenswrapper[4732]: I0131 09:03:27.917722 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bpncg"] Jan 31 09:03:27 crc kubenswrapper[4732]: I0131 09:03:27.917999 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4qmgq"] Jan 31 09:03:27 crc kubenswrapper[4732]: I0131 09:03:27.931524 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8xgh7" 
event={"ID":"12b99fdc-6d61-46e1-b093-b1b92efce54c","Type":"ContainerStarted","Data":"d823b2f2259fe7403df9d95229c6812fa829c22821d540565c0441a8e495dd3f"} Jan 31 09:03:27 crc kubenswrapper[4732]: I0131 09:03:27.958712 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-c8t6l" event={"ID":"c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20","Type":"ContainerStarted","Data":"9615c331134f8617d35092f6d2eb0dd4c5eead219bbc1b139774acc1bdb42b9b"} Jan 31 09:03:27 crc kubenswrapper[4732]: I0131 09:03:27.959223 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-c8t6l" Jan 31 09:03:27 crc kubenswrapper[4732]: I0131 09:03:27.978409 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:03:27 crc kubenswrapper[4732]: I0131 09:03:27.978968 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6fwnh"] Jan 31 09:03:27 crc kubenswrapper[4732]: E0131 09:03:27.980520 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:03:28.480483479 +0000 UTC m=+146.786359693 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:27 crc kubenswrapper[4732]: I0131 09:03:27.989322 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-c7j22"] Jan 31 09:03:28 crc kubenswrapper[4732]: I0131 09:03:28.002420 4732 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-c8t6l container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.15:6443/healthz\": dial tcp 10.217.0.15:6443: connect: connection refused" start-of-body= Jan 31 09:03:28 crc kubenswrapper[4732]: I0131 09:03:28.002489 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-c8t6l" podUID="c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.15:6443/healthz\": dial tcp 10.217.0.15:6443: connect: connection refused" Jan 31 09:03:28 crc kubenswrapper[4732]: I0131 09:03:28.002637 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-dmhxf" event={"ID":"d64c27a7-b418-450e-9067-dde0cd145597","Type":"ContainerStarted","Data":"1ff9dd2a96d576f89512e1b8be3651e306b1ff8b52addc0f38048e9e4c6976ed"} Jan 31 09:03:28 crc kubenswrapper[4732]: I0131 09:03:28.013086 4732 patch_prober.go:28] interesting pod/router-default-5444994796-h5q9f container/router namespace/openshift-ingress: Startup 
probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 31 09:03:28 crc kubenswrapper[4732]: [-]has-synced failed: reason withheld Jan 31 09:03:28 crc kubenswrapper[4732]: [+]process-running ok Jan 31 09:03:28 crc kubenswrapper[4732]: healthz check failed Jan 31 09:03:28 crc kubenswrapper[4732]: I0131 09:03:28.013149 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h5q9f" podUID="8814e7c8-5104-40f7-9761-4feedc15697b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 31 09:03:28 crc kubenswrapper[4732]: I0131 09:03:28.024801 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-4sk2n"] Jan 31 09:03:28 crc kubenswrapper[4732]: I0131 09:03:28.040839 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-54nxd" podStartSLOduration=121.04082189 podStartE2EDuration="2m1.04082189s" podCreationTimestamp="2026-01-31 09:01:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:03:28.03935069 +0000 UTC m=+146.345226894" watchObservedRunningTime="2026-01-31 09:03:28.04082189 +0000 UTC m=+146.346698094" Jan 31 09:03:28 crc kubenswrapper[4732]: I0131 09:03:28.069513 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-h5q9f" event={"ID":"8814e7c8-5104-40f7-9761-4feedc15697b","Type":"ContainerStarted","Data":"8e2e47330a0cff60d343fdebcddbd68f81d61614c46ac5de056954f059801465"} Jan 31 09:03:28 crc kubenswrapper[4732]: E0131 09:03:28.091258 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:03:28.591240176 +0000 UTC m=+146.897116380 (durationBeforeRetry 500ms). 
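
The router's startup-probe body above is the aggregated healthz format: one "[+]name ok" or "[-]name failed: reason withheld" line per named check, a trailing "healthz check failed", and HTTP 500 whenever any check fails, which is exactly what the prober records as "HTTP probe failed with statuscode: 500". A minimal stdlib handler producing the same shape, with the check names taken from the log and the failing implementations as stand-ins:

```go
package main

import (
	"fmt"
	"log"
	"net/http"
)

type check struct {
	name string
	fn   func() error
}

// healthz aggregates named checks into one endpoint: a [+]/[-] line per
// check, "healthz check failed" and HTTP 500 if any of them fail.
func healthz(checks []check) http.HandlerFunc {
	return func(w http.ResponseWriter, r *http.Request) {
		failed := false
		body := ""
		for _, c := range checks {
			if err := c.fn(); err != nil {
				failed = true
				body += fmt.Sprintf("[-]%s failed: reason withheld\n", c.name)
			} else {
				body += fmt.Sprintf("[+]%s ok\n", c.name)
			}
		}
		if failed {
			body += "healthz check failed\n"
			w.WriteHeader(http.StatusInternalServerError)
		}
		fmt.Fprint(w, body)
	}
}

func main() {
	checks := []check{
		{"backend-http", func() error { return fmt.Errorf("not ready") }}, // stand-in
		{"has-synced", func() error { return fmt.Errorf("not synced") }},  // stand-in
		{"process-running", func() error { return nil }},
	}
	http.HandleFunc("/healthz/ready", healthz(checks))
	log.Fatal(http.ListenAndServe("localhost:1936", nil)) // port from the probe URL above
}
```
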
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-99dtb" (UID: "4ac602fa-14af-4ae0-a538-d73e938db036") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:28 crc kubenswrapper[4732]: I0131 09:03:28.083525 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-99dtb\" (UID: \"4ac602fa-14af-4ae0-a538-d73e938db036\") " pod="openshift-image-registry/image-registry-697d97f7c8-99dtb" Jan 31 09:03:28 crc kubenswrapper[4732]: I0131 09:03:28.113149 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wb8wq" event={"ID":"1d02584d-db7d-4bc0-8cd9-33081993309b","Type":"ContainerStarted","Data":"26a7947af87c367b8b5fd8967811355f2031ef1361fc3f3dc4ec465f3ec37d25"} Jan 31 09:03:28 crc kubenswrapper[4732]: I0131 09:03:28.131740 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-ljds4" event={"ID":"f4d0ed50-aa9b-4a62-b340-882ddf73f008","Type":"ContainerStarted","Data":"94900506e78a6c467e226df2c51bd0deeed7b70de0ce789b914b8672f3e90a32"} Jan 31 09:03:28 crc kubenswrapper[4732]: I0131 09:03:28.155724 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-hn5wx"] Jan 31 09:03:28 crc kubenswrapper[4732]: I0131 09:03:28.162902 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-c8t6l" podStartSLOduration=121.162876857 podStartE2EDuration="2m1.162876857s" podCreationTimestamp="2026-01-31 09:01:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:03:28.149633111 +0000 UTC m=+146.455509315" watchObservedRunningTime="2026-01-31 09:03:28.162876857 +0000 UTC m=+146.468753071" Jan 31 09:03:28 crc kubenswrapper[4732]: I0131 09:03:28.165339 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-cx8cr"] Jan 31 09:03:28 crc kubenswrapper[4732]: I0131 09:03:28.175924 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-fsss9" event={"ID":"639dacb9-2ea3-49d2-b5c4-996992c8e16a","Type":"ContainerStarted","Data":"7d3480f4669b746b7e3f02ebc7e7cdd9677524e312fd46c6cd599eb58e036199"} Jan 31 09:03:28 crc kubenswrapper[4732]: I0131 09:03:28.179305 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zcd2h" event={"ID":"075f442e-a691-4856-a6ea-e21f1dcbcb20","Type":"ContainerStarted","Data":"93f93c2ee3a222337228cf7e2b94fe95283c70b8a31b8ac3338cb37ac5ae12b2"} Jan 31 09:03:28 crc kubenswrapper[4732]: I0131 09:03:28.186095 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-f78bs" 
event={"ID":"eb80e0fc-6378-4d0d-a8b0-6c662073ed4d","Type":"ContainerStarted","Data":"1c48a0fa383a2c663b8e2a2ce2e4a20e47f5f248bbe6f7cd0bb0f7fbbd874e52"} Jan 31 09:03:28 crc kubenswrapper[4732]: I0131 09:03:28.194345 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:03:28 crc kubenswrapper[4732]: E0131 09:03:28.195564 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:03:28.695534206 +0000 UTC m=+147.001410470 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:28 crc kubenswrapper[4732]: I0131 09:03:28.203825 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-44wvz" event={"ID":"64e4fed2-f31d-4d3f-ad77-d55fdddacb4d","Type":"ContainerStarted","Data":"454505ef33a0f64d3a5a9bf57d498ac32d53fa4b6eeead1b0816b17db84943d1"} Jan 31 09:03:28 crc kubenswrapper[4732]: I0131 09:03:28.234017 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-fsss9" podStartSLOduration=121.23399667 podStartE2EDuration="2m1.23399667s" podCreationTimestamp="2026-01-31 09:01:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:03:28.194638686 +0000 UTC m=+146.500514890" watchObservedRunningTime="2026-01-31 09:03:28.23399667 +0000 UTC m=+146.539872874" Jan 31 09:03:28 crc kubenswrapper[4732]: I0131 09:03:28.239905 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29497500-p2gn7"] Jan 31 09:03:28 crc kubenswrapper[4732]: I0131 09:03:28.246147 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-kgqfp" event={"ID":"2325b276-ee4d-438d-b9d6-d7de3024ba96","Type":"ContainerStarted","Data":"c4a0c81d27e168dda742e6b89de4c9bf7097f998145340484ac2a86979fbc824"} Jan 31 09:03:28 crc kubenswrapper[4732]: I0131 09:03:28.265033 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7fqsl" event={"ID":"058f5386-f340-4a52-bfc8-9b1a60515c9b","Type":"ContainerStarted","Data":"6fd5aabbe2ed81a7a78020aa8ca21834975dae45d93b7e0645ab08ab12f1815f"} Jan 31 09:03:28 crc kubenswrapper[4732]: I0131 09:03:28.266101 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7fqsl" Jan 31 09:03:28 crc kubenswrapper[4732]: I0131 09:03:28.267110 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vkrgj" podStartSLOduration=121.267092014 podStartE2EDuration="2m1.267092014s" podCreationTimestamp="2026-01-31 09:01:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:03:28.250839267 +0000 UTC m=+146.556715471" watchObservedRunningTime="2026-01-31 09:03:28.267092014 +0000 UTC m=+146.572968218" Jan 31 09:03:28 crc kubenswrapper[4732]: I0131 09:03:28.268504 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-llpv8"] Jan 31 09:03:28 crc kubenswrapper[4732]: I0131 09:03:28.275009 4732 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-7fqsl container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused" start-of-body= Jan 31 09:03:28 crc kubenswrapper[4732]: I0131 09:03:28.275051 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7fqsl" podUID="058f5386-f340-4a52-bfc8-9b1a60515c9b" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused" Jan 31 09:03:28 crc kubenswrapper[4732]: W0131 09:03:28.283696 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0558933a_c8d6_45dc_aeaf_af86190b15a0.slice/crio-489c0c3b171c6d96e855fb1f2464c3d1486c21baf174589add9247f92fff3bcc WatchSource:0}: Error finding container 489c0c3b171c6d96e855fb1f2464c3d1486c21baf174589add9247f92fff3bcc: Status 404 returned error can't find the container with id 489c0c3b171c6d96e855fb1f2464c3d1486c21baf174589add9247f92fff3bcc Jan 31 09:03:28 crc kubenswrapper[4732]: I0131 09:03:28.283963 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-wqf9f" event={"ID":"4149606a-3fbc-4da9-ba05-dc473b492a89","Type":"ContainerStarted","Data":"0d603fb2b88bc584de7719a4094d2d7f44ba3f412750957a8c4a44d9d9f4e109"} Jan 31 09:03:28 crc kubenswrapper[4732]: I0131 09:03:28.294578 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5s6dp" podStartSLOduration=121.294550048 podStartE2EDuration="2m1.294550048s" podCreationTimestamp="2026-01-31 09:01:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:03:28.291179694 +0000 UTC m=+146.597055888" watchObservedRunningTime="2026-01-31 09:03:28.294550048 +0000 UTC m=+146.600426252" Jan 31 09:03:28 crc kubenswrapper[4732]: I0131 09:03:28.296585 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-99dtb\" (UID: \"4ac602fa-14af-4ae0-a538-d73e938db036\") " pod="openshift-image-registry/image-registry-697d97f7c8-99dtb" Jan 31 09:03:28 crc kubenswrapper[4732]: E0131 09:03:28.300376 4732 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:03:28.800357783 +0000 UTC m=+147.106234057 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-99dtb" (UID: "4ac602fa-14af-4ae0-a538-d73e938db036") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:28 crc kubenswrapper[4732]: I0131 09:03:28.307128 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-bzk95" event={"ID":"33169e52-3fee-462c-b341-46563ddbf5aa","Type":"ContainerStarted","Data":"03c6183f62903179b889f746896931dc6c474b57a0bc50364cf6b9de2e1aa3a8"} Jan 31 09:03:28 crc kubenswrapper[4732]: I0131 09:03:28.318891 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-tg4xc" Jan 31 09:03:28 crc kubenswrapper[4732]: I0131 09:03:28.347355 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-01-31 08:58:27 +0000 UTC, rotation deadline is 2026-11-30 21:28:02.735812286 +0000 UTC Jan 31 09:03:28 crc kubenswrapper[4732]: I0131 09:03:28.347402 4732 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7284h24m34.38841404s for next certificate rotation Jan 31 09:03:28 crc kubenswrapper[4732]: I0131 09:03:28.370182 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-h5q9f" podStartSLOduration=121.370157272 podStartE2EDuration="2m1.370157272s" podCreationTimestamp="2026-01-31 09:01:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:03:28.32789324 +0000 UTC m=+146.633769444" watchObservedRunningTime="2026-01-31 09:03:28.370157272 +0000 UTC m=+146.676033486" Jan 31 09:03:28 crc kubenswrapper[4732]: I0131 09:03:28.400073 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:03:28 crc kubenswrapper[4732]: E0131 09:03:28.409768 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:03:28.909726994 +0000 UTC m=+147.215603238 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:28 crc kubenswrapper[4732]: W0131 09:03:28.418372 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3302e69_0f73_4974_a8ac_af1992933147.slice/crio-f53f4c1e20390b55703a4f25af2363cf2ce342ac193b7fd1994de4cad5ba828e WatchSource:0}: Error finding container f53f4c1e20390b55703a4f25af2363cf2ce342ac193b7fd1994de4cad5ba828e: Status 404 returned error can't find the container with id f53f4c1e20390b55703a4f25af2363cf2ce342ac193b7fd1994de4cad5ba828e Jan 31 09:03:28 crc kubenswrapper[4732]: I0131 09:03:28.422318 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-tg4xc" podStartSLOduration=121.422296007 podStartE2EDuration="2m1.422296007s" podCreationTimestamp="2026-01-31 09:01:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:03:28.420985483 +0000 UTC m=+146.726861697" watchObservedRunningTime="2026-01-31 09:03:28.422296007 +0000 UTC m=+146.728172211" Jan 31 09:03:28 crc kubenswrapper[4732]: I0131 09:03:28.422823 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sf8wd" podStartSLOduration=121.422816694 podStartE2EDuration="2m1.422816694s" podCreationTimestamp="2026-01-31 09:01:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:03:28.371073153 +0000 UTC m=+146.676949357" watchObservedRunningTime="2026-01-31 09:03:28.422816694 +0000 UTC m=+146.728692908" Jan 31 09:03:28 crc kubenswrapper[4732]: I0131 09:03:28.490483 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zcd2h" podStartSLOduration=121.49045215 podStartE2EDuration="2m1.49045215s" podCreationTimestamp="2026-01-31 09:01:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:03:28.482442531 +0000 UTC m=+146.788318735" watchObservedRunningTime="2026-01-31 09:03:28.49045215 +0000 UTC m=+146.796328354" Jan 31 09:03:28 crc kubenswrapper[4732]: I0131 09:03:28.502918 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-99dtb\" (UID: \"4ac602fa-14af-4ae0-a538-d73e938db036\") " pod="openshift-image-registry/image-registry-697d97f7c8-99dtb" Jan 31 09:03:28 crc kubenswrapper[4732]: E0131 09:03:28.514118 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-01-31 09:03:29.014093356 +0000 UTC m=+147.319969560 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-99dtb" (UID: "4ac602fa-14af-4ae0-a538-d73e938db036") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:28 crc kubenswrapper[4732]: I0131 09:03:28.544905 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-44wvz" podStartSLOduration=121.544880632 podStartE2EDuration="2m1.544880632s" podCreationTimestamp="2026-01-31 09:01:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:03:28.517127408 +0000 UTC m=+146.823003612" watchObservedRunningTime="2026-01-31 09:03:28.544880632 +0000 UTC m=+146.850756836" Jan 31 09:03:28 crc kubenswrapper[4732]: I0131 09:03:28.611319 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:03:28 crc kubenswrapper[4732]: E0131 09:03:28.611925 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:03:29.111900548 +0000 UTC m=+147.417776752 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:28 crc kubenswrapper[4732]: I0131 09:03:28.612470 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" Jan 31 09:03:28 crc kubenswrapper[4732]: I0131 09:03:28.618051 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7fqsl" podStartSLOduration=121.618025114 podStartE2EDuration="2m1.618025114s" podCreationTimestamp="2026-01-31 09:01:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:03:28.571479517 +0000 UTC m=+146.877355721" watchObservedRunningTime="2026-01-31 09:03:28.618025114 +0000 UTC m=+146.923901328" Jan 31 09:03:28 crc kubenswrapper[4732]: I0131 09:03:28.628177 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-wqf9f" podStartSLOduration=121.628157804 podStartE2EDuration="2m1.628157804s" podCreationTimestamp="2026-01-31 09:01:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:03:28.627536303 +0000 UTC m=+146.933412527" watchObservedRunningTime="2026-01-31 09:03:28.628157804 +0000 UTC m=+146.934034008" Jan 31 09:03:28 crc kubenswrapper[4732]: I0131 09:03:28.716011 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-99dtb\" (UID: \"4ac602fa-14af-4ae0-a538-d73e938db036\") " pod="openshift-image-registry/image-registry-697d97f7c8-99dtb" Jan 31 09:03:28 crc kubenswrapper[4732]: E0131 09:03:28.716618 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:03:29.216601791 +0000 UTC m=+147.522478005 (durationBeforeRetry 500ms). 
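
The pod_startup_latency_tracker lines relate their fields in a fixed way: the E2E duration is observed running time minus podCreationTimestamp, and the SLO duration subtracts time spent pulling images. With zero-value firstStartedPulling/lastFinishedPulling (images already present on this host), SLO and E2E match, as they do in every entry in this log. A worked sketch with the olm-operator numbers above; the subtraction rule is an inference from the logged fields, not a statement of kubelet's exact bookkeeping:

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	created := time.Date(2026, 1, 31, 9, 1, 27, 0, time.UTC)            // podCreationTimestamp
	running := time.Date(2026, 1, 31, 9, 3, 28, 618025114, time.UTC)    // observed running time
	var firstPull, lastPull time.Time                                   // zero: no pull observed

	e2e := running.Sub(created)
	slo := e2e
	if !firstPull.IsZero() && !lastPull.IsZero() {
		slo -= lastPull.Sub(firstPull) // discount image-pull time when present
	}
	fmt.Println("podStartE2EDuration:", e2e) // 2m1.618025114s, matching the log
	fmt.Println("podStartSLOduration:", slo.Seconds())
}
```
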
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-99dtb" (UID: "4ac602fa-14af-4ae0-a538-d73e938db036") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:28 crc kubenswrapper[4732]: I0131 09:03:28.744492 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-bzk95" podStartSLOduration=121.744471029 podStartE2EDuration="2m1.744471029s" podCreationTimestamp="2026-01-31 09:01:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:03:28.695294894 +0000 UTC m=+147.001171108" watchObservedRunningTime="2026-01-31 09:03:28.744471029 +0000 UTC m=+147.050347243" Jan 31 09:03:28 crc kubenswrapper[4732]: I0131 09:03:28.819217 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:03:28 crc kubenswrapper[4732]: E0131 09:03:28.819701 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:03:29.319648659 +0000 UTC m=+147.625524863 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:28 crc kubenswrapper[4732]: I0131 09:03:28.920517 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-99dtb\" (UID: \"4ac602fa-14af-4ae0-a538-d73e938db036\") " pod="openshift-image-registry/image-registry-697d97f7c8-99dtb" Jan 31 09:03:28 crc kubenswrapper[4732]: E0131 09:03:28.923616 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:03:29.423593447 +0000 UTC m=+147.729469651 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-99dtb" (UID: "4ac602fa-14af-4ae0-a538-d73e938db036") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:29 crc kubenswrapper[4732]: I0131 09:03:29.000370 4732 patch_prober.go:28] interesting pod/router-default-5444994796-h5q9f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 31 09:03:29 crc kubenswrapper[4732]: [-]has-synced failed: reason withheld Jan 31 09:03:29 crc kubenswrapper[4732]: [+]process-running ok Jan 31 09:03:29 crc kubenswrapper[4732]: healthz check failed Jan 31 09:03:29 crc kubenswrapper[4732]: I0131 09:03:29.000451 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h5q9f" podUID="8814e7c8-5104-40f7-9761-4feedc15697b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 31 09:03:29 crc kubenswrapper[4732]: I0131 09:03:29.024294 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:03:29 crc kubenswrapper[4732]: E0131 09:03:29.024781 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:03:29.52475605 +0000 UTC m=+147.830632264 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:29 crc kubenswrapper[4732]: I0131 09:03:29.126506 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-99dtb\" (UID: \"4ac602fa-14af-4ae0-a538-d73e938db036\") " pod="openshift-image-registry/image-registry-697d97f7c8-99dtb" Jan 31 09:03:29 crc kubenswrapper[4732]: E0131 09:03:29.126980 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:03:29.62696257 +0000 UTC m=+147.932838764 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-99dtb" (UID: "4ac602fa-14af-4ae0-a538-d73e938db036") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:29 crc kubenswrapper[4732]: I0131 09:03:29.227384 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:03:29 crc kubenswrapper[4732]: E0131 09:03:29.227651 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:03:29.727609037 +0000 UTC m=+148.033485241 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:29 crc kubenswrapper[4732]: I0131 09:03:29.227769 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-99dtb\" (UID: \"4ac602fa-14af-4ae0-a538-d73e938db036\") " pod="openshift-image-registry/image-registry-697d97f7c8-99dtb" Jan 31 09:03:29 crc kubenswrapper[4732]: E0131 09:03:29.228139 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:03:29.728132085 +0000 UTC m=+148.034008279 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-99dtb" (UID: "4ac602fa-14af-4ae0-a538-d73e938db036") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:29 crc kubenswrapper[4732]: I0131 09:03:29.311912 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85mvk" event={"ID":"6377a401-b10b-455a-8906-f6706302b91f","Type":"ContainerStarted","Data":"d2181f4f5d1e2146e81d69f051730329e27e7743031a7694aa2c8da14c5cc18e"} Jan 31 09:03:29 crc kubenswrapper[4732]: I0131 09:03:29.312789 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v69mc" event={"ID":"498d64fc-0d0f-43c6-aaae-bd3c5f0d7873","Type":"ContainerStarted","Data":"5a17f505d8d7fe1e77ceea12d0e357ef7e3a6e284431fc8a4213a4bb74cfe86b"} Jan 31 09:03:29 crc kubenswrapper[4732]: I0131 09:03:29.313610 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6fwnh" event={"ID":"6a680afa-dc56-4bf8-808c-b1c947c8fbf0","Type":"ContainerStarted","Data":"a77c8864885f592214013bdaabbf1546a4bf3b9d551e666ca93aa4b7dc47c77e"} Jan 31 09:03:29 crc kubenswrapper[4732]: I0131 09:03:29.315072 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8xgh7" event={"ID":"12b99fdc-6d61-46e1-b093-b1b92efce54c","Type":"ContainerStarted","Data":"70129bb72d9d428fb9c6b6a67de54e5e851c89d46ebba9123286e0fb055d052f"} Jan 31 09:03:29 crc kubenswrapper[4732]: I0131 09:03:29.316358 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wb8wq" event={"ID":"1d02584d-db7d-4bc0-8cd9-33081993309b","Type":"ContainerStarted","Data":"cb75563427d906c4558da5925381465530a10f0cf7883273950e99d8bbdb381e"} Jan 31 09:03:29 crc kubenswrapper[4732]: I0131 09:03:29.316596 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wb8wq" Jan 31 09:03:29 crc kubenswrapper[4732]: I0131 09:03:29.317809 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4qmgq" event={"ID":"e27a7ff4-a3a6-4654-99f7-6ec9f49c7c54","Type":"ContainerStarted","Data":"6e978d92928544f375cf8cf096f96403f6dc278ebb6db452f3cb460900306e85"} Jan 31 09:03:29 crc kubenswrapper[4732]: I0131 09:03:29.318951 4732 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-wb8wq container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.39:8443/healthz\": dial tcp 10.217.0.39:8443: connect: connection refused" start-of-body= Jan 31 09:03:29 crc kubenswrapper[4732]: I0131 09:03:29.319049 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wb8wq" podUID="1d02584d-db7d-4bc0-8cd9-33081993309b" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.39:8443/healthz\": dial tcp 10.217.0.39:8443: connect: connection refused" Jan 31 09:03:29 crc 
kubenswrapper[4732]: I0131 09:03:29.319636 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hzs92" event={"ID":"576b5a44-3c4c-4905-8d89-caed3b1eb43f","Type":"ContainerStarted","Data":"1204df07b0546c41d29b3218ad1ab3082ea47963a61ee256e5d242fba34c34cd"} Jan 31 09:03:29 crc kubenswrapper[4732]: I0131 09:03:29.319851 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hzs92" Jan 31 09:03:29 crc kubenswrapper[4732]: I0131 09:03:29.321158 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-dmhxf" event={"ID":"d64c27a7-b418-450e-9067-dde0cd145597","Type":"ContainerStarted","Data":"b505ca3ae676829a4263a0294c61e3ce507d8d59a73dcb7b8123dc58c64a5387"} Jan 31 09:03:29 crc kubenswrapper[4732]: I0131 09:03:29.322405 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4sk2n" event={"ID":"7c0ada8b-e2dc-418c-a43e-33789285388f","Type":"ContainerStarted","Data":"7cddae12a36276f7679a3410a2b60becc3d653fa6d431d86451ada4cde11b528"} Jan 31 09:03:29 crc kubenswrapper[4732]: I0131 09:03:29.323177 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-llpv8" event={"ID":"a3302e69-0f73-4974-a8ac-af1992933147","Type":"ContainerStarted","Data":"f53f4c1e20390b55703a4f25af2363cf2ce342ac193b7fd1994de4cad5ba828e"} Jan 31 09:03:29 crc kubenswrapper[4732]: I0131 09:03:29.323938 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bpncg" event={"ID":"ffad13f5-fb20-46ac-b886-c7e5a29b6599","Type":"ContainerStarted","Data":"507a7d36ca763a7df7bff24d356848728d6fe563b1f52cd5ce5ddd06de33b76f"} Jan 31 09:03:29 crc kubenswrapper[4732]: I0131 09:03:29.325052 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-wqf9f" event={"ID":"4149606a-3fbc-4da9-ba05-dc473b492a89","Type":"ContainerStarted","Data":"cf5b5693469538faad28bf091a05f257185982cfc29214364d426b1d986730fa"} Jan 31 09:03:29 crc kubenswrapper[4732]: I0131 09:03:29.327332 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-xprfh" event={"ID":"81b523ca-b564-45d4-bad5-f7e236f2e6d0","Type":"ContainerStarted","Data":"b7972d03a35b0da756ebcf27d3b3d54065a50eb2f5fcb04291c30cf3451e300a"} Jan 31 09:03:29 crc kubenswrapper[4732]: I0131 09:03:29.328521 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:03:29 crc kubenswrapper[4732]: E0131 09:03:29.328647 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:03:29.828623587 +0000 UTC m=+148.134499791 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:29 crc kubenswrapper[4732]: I0131 09:03:29.328788 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-99dtb\" (UID: \"4ac602fa-14af-4ae0-a538-d73e938db036\") " pod="openshift-image-registry/image-registry-697d97f7c8-99dtb" Jan 31 09:03:29 crc kubenswrapper[4732]: E0131 09:03:29.329113 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:03:29.829104383 +0000 UTC m=+148.134980587 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-99dtb" (UID: "4ac602fa-14af-4ae0-a538-d73e938db036") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:29 crc kubenswrapper[4732]: I0131 09:03:29.330156 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-vpnm5" event={"ID":"8cc29c02-baeb-4f46-92d6-684343509ae1","Type":"ContainerStarted","Data":"9a3719522a375ad4ac073e52e792f10112cb06992301dca226a8ef16512b0dc2"} Jan 31 09:03:29 crc kubenswrapper[4732]: I0131 09:03:29.333812 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-bzk95" event={"ID":"33169e52-3fee-462c-b341-46563ddbf5aa","Type":"ContainerStarted","Data":"e1e4c6691134cf8196fe8b794283b617486b1dc7ccf0e5b6545094033c51a1b1"} Jan 31 09:03:29 crc kubenswrapper[4732]: I0131 09:03:29.335272 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-ljds4" event={"ID":"f4d0ed50-aa9b-4a62-b340-882ddf73f008","Type":"ContainerStarted","Data":"393baf9619f9cf20095d6f558e1acd40e97479a63b2c71e28acd340a9b8c0183"} Jan 31 09:03:29 crc kubenswrapper[4732]: I0131 09:03:29.335559 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-ljds4" Jan 31 09:03:29 crc kubenswrapper[4732]: I0131 09:03:29.337308 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-44wvz" event={"ID":"64e4fed2-f31d-4d3f-ad77-d55fdddacb4d","Type":"ContainerStarted","Data":"21db909e84f6fe0c354f0ddda90c23f8e0f3355d8c0cb038af1a5dfc0d26905f"} Jan 31 09:03:29 crc kubenswrapper[4732]: I0131 09:03:29.337776 4732 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-ljds4 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.26:8080/healthz\": dial tcp 
10.217.0.26:8080: connect: connection refused" start-of-body= Jan 31 09:03:29 crc kubenswrapper[4732]: I0131 09:03:29.337819 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-ljds4" podUID="f4d0ed50-aa9b-4a62-b340-882ddf73f008" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.26:8080/healthz\": dial tcp 10.217.0.26:8080: connect: connection refused" Jan 31 09:03:29 crc kubenswrapper[4732]: I0131 09:03:29.338471 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-hn5wx" event={"ID":"1e6408b2-d73d-4fc3-86f7-d29ea59ad32e","Type":"ContainerStarted","Data":"6ebf52ace6b30eaa61e0b5dbea6b27dd704ea4b0e58a9dcb3b1bc45908bfc7e8"} Jan 31 09:03:29 crc kubenswrapper[4732]: I0131 09:03:29.339457 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29497500-p2gn7" event={"ID":"0558933a-c8d6-45dc-aeaf-af86190b15a0","Type":"ContainerStarted","Data":"489c0c3b171c6d96e855fb1f2464c3d1486c21baf174589add9247f92fff3bcc"} Jan 31 09:03:29 crc kubenswrapper[4732]: I0131 09:03:29.340639 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-kgqfp" event={"ID":"2325b276-ee4d-438d-b9d6-d7de3024ba96","Type":"ContainerStarted","Data":"2d86d760940b52671a07e286595f0df253a255586fad28ce45ad4e3a40ab2321"} Jan 31 09:03:29 crc kubenswrapper[4732]: I0131 09:03:29.342488 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zcd2h" event={"ID":"075f442e-a691-4856-a6ea-e21f1dcbcb20","Type":"ContainerStarted","Data":"78a1039648041eacf8bd25e47005ea59bc47f6eabe7164523054ff8e894db943"} Jan 31 09:03:29 crc kubenswrapper[4732]: I0131 09:03:29.345513 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-wq7bg" event={"ID":"caaa7607-b47d-43ca-adff-f9135baf7262","Type":"ContainerStarted","Data":"bcb59bdab6a65b50e8af721cc78a2babde93820c9d46e0d1ce242835778998fe"} Jan 31 09:03:29 crc kubenswrapper[4732]: I0131 09:03:29.346371 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-cx8cr" event={"ID":"56a9a21e-28d2-4386-9fe0-947c3a39ab6a","Type":"ContainerStarted","Data":"141c720196fa8b0005af28800d8defeba6bcedcff8b7e295be2ec049640f0f68"} Jan 31 09:03:29 crc kubenswrapper[4732]: I0131 09:03:29.347743 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7fqsl" event={"ID":"058f5386-f340-4a52-bfc8-9b1a60515c9b","Type":"ContainerStarted","Data":"24bbf9f597251423574db6ae22dfec4ef72223af8e75901547c2b1b66b0d7aa1"} Jan 31 09:03:29 crc kubenswrapper[4732]: I0131 09:03:29.348316 4732 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-7fqsl container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused" start-of-body= Jan 31 09:03:29 crc kubenswrapper[4732]: I0131 09:03:29.348370 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7fqsl" podUID="058f5386-f340-4a52-bfc8-9b1a60515c9b" containerName="olm-operator" probeResult="failure" output="Get 
\"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused" Jan 31 09:03:29 crc kubenswrapper[4732]: I0131 09:03:29.348794 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-c7j22" event={"ID":"89cec875-cd1f-4867-8b4b-ca72c57c974b","Type":"ContainerStarted","Data":"dd02a6b4499666b4dd59125116e69274dc420104128afb9d750cd6a10e206e56"} Jan 31 09:03:29 crc kubenswrapper[4732]: I0131 09:03:29.355803 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8xgh7" podStartSLOduration=122.35578203 podStartE2EDuration="2m2.35578203s" podCreationTimestamp="2026-01-31 09:01:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:03:29.352696226 +0000 UTC m=+147.658572430" watchObservedRunningTime="2026-01-31 09:03:29.35578203 +0000 UTC m=+147.661658234" Jan 31 09:03:29 crc kubenswrapper[4732]: I0131 09:03:29.373435 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wb8wq" podStartSLOduration=122.373413694 podStartE2EDuration="2m2.373413694s" podCreationTimestamp="2026-01-31 09:01:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:03:29.371031114 +0000 UTC m=+147.676907318" watchObservedRunningTime="2026-01-31 09:03:29.373413694 +0000 UTC m=+147.679289898" Jan 31 09:03:29 crc kubenswrapper[4732]: I0131 09:03:29.385078 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-dmhxf" podStartSLOduration=6.385057776 podStartE2EDuration="6.385057776s" podCreationTimestamp="2026-01-31 09:03:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:03:29.383533244 +0000 UTC m=+147.689409448" watchObservedRunningTime="2026-01-31 09:03:29.385057776 +0000 UTC m=+147.690933980" Jan 31 09:03:29 crc kubenswrapper[4732]: I0131 09:03:29.400448 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hzs92" podStartSLOduration=122.400422342 podStartE2EDuration="2m2.400422342s" podCreationTimestamp="2026-01-31 09:01:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:03:29.400371471 +0000 UTC m=+147.706247675" watchObservedRunningTime="2026-01-31 09:03:29.400422342 +0000 UTC m=+147.706298566" Jan 31 09:03:29 crc kubenswrapper[4732]: I0131 09:03:29.420906 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-vpnm5" podStartSLOduration=122.420887391 podStartE2EDuration="2m2.420887391s" podCreationTimestamp="2026-01-31 09:01:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:03:29.418306025 +0000 UTC m=+147.724182229" watchObservedRunningTime="2026-01-31 09:03:29.420887391 +0000 UTC m=+147.726763595" Jan 31 09:03:29 crc kubenswrapper[4732]: I0131 09:03:29.429438 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:03:29 crc kubenswrapper[4732]: E0131 09:03:29.431048 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:03:29.931014082 +0000 UTC m=+148.236890286 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:29 crc kubenswrapper[4732]: I0131 09:03:29.447547 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-ljds4" podStartSLOduration=122.447519547 podStartE2EDuration="2m2.447519547s" podCreationTimestamp="2026-01-31 09:01:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:03:29.437703837 +0000 UTC m=+147.743580041" watchObservedRunningTime="2026-01-31 09:03:29.447519547 +0000 UTC m=+147.753395751" Jan 31 09:03:29 crc kubenswrapper[4732]: I0131 09:03:29.532248 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-99dtb\" (UID: \"4ac602fa-14af-4ae0-a538-d73e938db036\") " pod="openshift-image-registry/image-registry-697d97f7c8-99dtb" Jan 31 09:03:29 crc kubenswrapper[4732]: E0131 09:03:29.532763 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:03:30.032737855 +0000 UTC m=+148.338614099 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-99dtb" (UID: "4ac602fa-14af-4ae0-a538-d73e938db036") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:29 crc kubenswrapper[4732]: I0131 09:03:29.634311 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:03:29 crc kubenswrapper[4732]: E0131 09:03:29.634438 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:03:30.134406637 +0000 UTC m=+148.440282841 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:29 crc kubenswrapper[4732]: I0131 09:03:29.634592 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-99dtb\" (UID: \"4ac602fa-14af-4ae0-a538-d73e938db036\") " pod="openshift-image-registry/image-registry-697d97f7c8-99dtb" Jan 31 09:03:29 crc kubenswrapper[4732]: E0131 09:03:29.634997 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:03:30.134987236 +0000 UTC m=+148.440863440 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-99dtb" (UID: "4ac602fa-14af-4ae0-a538-d73e938db036") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:29 crc kubenswrapper[4732]: I0131 09:03:29.736030 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:03:29 crc kubenswrapper[4732]: E0131 09:03:29.736179 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:03:30.23615187 +0000 UTC m=+148.542028084 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:29 crc kubenswrapper[4732]: I0131 09:03:29.736378 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-99dtb\" (UID: \"4ac602fa-14af-4ae0-a538-d73e938db036\") " pod="openshift-image-registry/image-registry-697d97f7c8-99dtb" Jan 31 09:03:29 crc kubenswrapper[4732]: E0131 09:03:29.736732 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:03:30.23672321 +0000 UTC m=+148.542599414 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-99dtb" (UID: "4ac602fa-14af-4ae0-a538-d73e938db036") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:29 crc kubenswrapper[4732]: I0131 09:03:29.837732 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:03:29 crc kubenswrapper[4732]: E0131 09:03:29.837976 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:03:30.337924505 +0000 UTC m=+148.643800729 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:29 crc kubenswrapper[4732]: I0131 09:03:29.838045 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-99dtb\" (UID: \"4ac602fa-14af-4ae0-a538-d73e938db036\") " pod="openshift-image-registry/image-registry-697d97f7c8-99dtb" Jan 31 09:03:29 crc kubenswrapper[4732]: E0131 09:03:29.838449 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:03:30.338432213 +0000 UTC m=+148.644308477 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-99dtb" (UID: "4ac602fa-14af-4ae0-a538-d73e938db036") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:29 crc kubenswrapper[4732]: I0131 09:03:29.939840 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:03:29 crc kubenswrapper[4732]: E0131 09:03:29.940294 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:03:30.440267349 +0000 UTC m=+148.746143553 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:30 crc kubenswrapper[4732]: I0131 09:03:30.001707 4732 patch_prober.go:28] interesting pod/router-default-5444994796-h5q9f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 31 09:03:30 crc kubenswrapper[4732]: [-]has-synced failed: reason withheld Jan 31 09:03:30 crc kubenswrapper[4732]: [+]process-running ok Jan 31 09:03:30 crc kubenswrapper[4732]: healthz check failed Jan 31 09:03:30 crc kubenswrapper[4732]: I0131 09:03:30.002137 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h5q9f" podUID="8814e7c8-5104-40f7-9761-4feedc15697b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 31 09:03:30 crc kubenswrapper[4732]: I0131 09:03:30.041460 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-99dtb\" (UID: \"4ac602fa-14af-4ae0-a538-d73e938db036\") " pod="openshift-image-registry/image-registry-697d97f7c8-99dtb" Jan 31 09:03:30 crc kubenswrapper[4732]: E0131 09:03:30.042042 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:03:30.542023974 +0000 UTC m=+148.847900178 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-99dtb" (UID: "4ac602fa-14af-4ae0-a538-d73e938db036") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:30 crc kubenswrapper[4732]: I0131 09:03:30.045436 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-c8t6l" Jan 31 09:03:30 crc kubenswrapper[4732]: I0131 09:03:30.143294 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:03:30 crc kubenswrapper[4732]: E0131 09:03:30.143567 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:03:30.64354427 +0000 UTC m=+148.949420474 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:30 crc kubenswrapper[4732]: I0131 09:03:30.143914 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-99dtb\" (UID: \"4ac602fa-14af-4ae0-a538-d73e938db036\") " pod="openshift-image-registry/image-registry-697d97f7c8-99dtb" Jan 31 09:03:30 crc kubenswrapper[4732]: E0131 09:03:30.144483 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:03:30.644466402 +0000 UTC m=+148.950342616 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-99dtb" (UID: "4ac602fa-14af-4ae0-a538-d73e938db036") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:30 crc kubenswrapper[4732]: I0131 09:03:30.245408 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:03:30 crc kubenswrapper[4732]: E0131 09:03:30.245693 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:03:30.745655336 +0000 UTC m=+149.051531540 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:30 crc kubenswrapper[4732]: I0131 09:03:30.246080 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-99dtb\" (UID: \"4ac602fa-14af-4ae0-a538-d73e938db036\") " pod="openshift-image-registry/image-registry-697d97f7c8-99dtb" Jan 31 09:03:30 crc kubenswrapper[4732]: E0131 09:03:30.246620 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:03:30.746605408 +0000 UTC m=+149.052481612 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-99dtb" (UID: "4ac602fa-14af-4ae0-a538-d73e938db036") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:30 crc kubenswrapper[4732]: I0131 09:03:30.273060 4732 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-hzs92 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Jan 31 09:03:30 crc kubenswrapper[4732]: I0131 09:03:30.273133 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hzs92" podUID="576b5a44-3c4c-4905-8d89-caed3b1eb43f" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" Jan 31 09:03:30 crc kubenswrapper[4732]: I0131 09:03:30.273459 4732 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-hzs92 container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Jan 31 09:03:30 crc kubenswrapper[4732]: I0131 09:03:30.273486 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hzs92" podUID="576b5a44-3c4c-4905-8d89-caed3b1eb43f" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" Jan 31 09:03:30 crc kubenswrapper[4732]: I0131 09:03:30.347677 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:03:30 crc kubenswrapper[4732]: E0131 09:03:30.347921 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:03:30.847880117 +0000 UTC m=+149.153756321 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:30 crc kubenswrapper[4732]: I0131 09:03:30.348018 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-99dtb\" (UID: \"4ac602fa-14af-4ae0-a538-d73e938db036\") " pod="openshift-image-registry/image-registry-697d97f7c8-99dtb" Jan 31 09:03:30 crc kubenswrapper[4732]: E0131 09:03:30.348456 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:03:30.848448486 +0000 UTC m=+149.154324690 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-99dtb" (UID: "4ac602fa-14af-4ae0-a538-d73e938db036") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:30 crc kubenswrapper[4732]: I0131 09:03:30.355349 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29497500-p2gn7" event={"ID":"0558933a-c8d6-45dc-aeaf-af86190b15a0","Type":"ContainerStarted","Data":"940d163c1a1a494d9850589a935618158e47c219b1ef2186264ecbca1a2bfccc"} Jan 31 09:03:30 crc kubenswrapper[4732]: I0131 09:03:30.356713 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-llpv8" event={"ID":"a3302e69-0f73-4974-a8ac-af1992933147","Type":"ContainerStarted","Data":"ea4607fbdadbd1827d030b75908c08b9ba649762448baddf7ff8d2ecb248d499"} Jan 31 09:03:30 crc kubenswrapper[4732]: I0131 09:03:30.358262 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-f78bs" event={"ID":"eb80e0fc-6378-4d0d-a8b0-6c662073ed4d","Type":"ContainerStarted","Data":"bdbc34129773068cda5d67136cc28839379cfeca6c7947eca1f9de790cf604f0"} Jan 31 09:03:30 crc kubenswrapper[4732]: I0131 09:03:30.359859 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v69mc" event={"ID":"498d64fc-0d0f-43c6-aaae-bd3c5f0d7873","Type":"ContainerStarted","Data":"23414bc19e006fd97e8b4fd01854cd2b5641c26df1de7e8f19251db919576ffa"} Jan 31 09:03:30 crc kubenswrapper[4732]: I0131 09:03:30.361505 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6fwnh" event={"ID":"6a680afa-dc56-4bf8-808c-b1c947c8fbf0","Type":"ContainerStarted","Data":"a3862c80161167518eda88d0fd349597505cb098b09e89ba3305d1bc84ea0ed4"} Jan 31 09:03:30 crc kubenswrapper[4732]: I0131 09:03:30.365330 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-wq7bg" event={"ID":"caaa7607-b47d-43ca-adff-f9135baf7262","Type":"ContainerStarted","Data":"aa489e5b1d7de97707c97d8c45409de9509330430d456899f2d0fe83a1a6bb34"} Jan 31 09:03:30 crc kubenswrapper[4732]: I0131 09:03:30.368508 4732 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-wb8wq container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.39:8443/healthz\": dial tcp 10.217.0.39:8443: connect: connection refused" start-of-body= Jan 31 09:03:30 crc kubenswrapper[4732]: I0131 09:03:30.368548 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wb8wq" podUID="1d02584d-db7d-4bc0-8cd9-33081993309b" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.39:8443/healthz\": dial tcp 10.217.0.39:8443: connect: connection refused" Jan 31 09:03:30 crc kubenswrapper[4732]: I0131 09:03:30.368619 4732 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-7fqsl container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused" start-of-body= Jan 31 09:03:30 crc kubenswrapper[4732]: I0131 09:03:30.368727 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7fqsl" podUID="058f5386-f340-4a52-bfc8-9b1a60515c9b" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused" Jan 31 09:03:30 crc kubenswrapper[4732]: I0131 09:03:30.369035 4732 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-ljds4 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.26:8080/healthz\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body= Jan 31 09:03:30 crc kubenswrapper[4732]: I0131 09:03:30.369160 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-ljds4" podUID="f4d0ed50-aa9b-4a62-b340-882ddf73f008" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.26:8080/healthz\": dial tcp 10.217.0.26:8080: connect: connection refused" Jan 31 09:03:30 crc kubenswrapper[4732]: I0131 09:03:30.369189 4732 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-hzs92 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Jan 31 09:03:30 crc kubenswrapper[4732]: I0131 09:03:30.369307 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hzs92" podUID="576b5a44-3c4c-4905-8d89-caed3b1eb43f" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": dial tcp 10.217.0.9:8443: connect: connection refused" Jan 31 09:03:30 crc kubenswrapper[4732]: I0131 09:03:30.390275 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85mvk" podStartSLOduration=123.390248292 podStartE2EDuration="2m3.390248292s" 
podCreationTimestamp="2026-01-31 09:01:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:03:30.387978626 +0000 UTC m=+148.693854840" watchObservedRunningTime="2026-01-31 09:03:30.390248292 +0000 UTC m=+148.696124496" Jan 31 09:03:30 crc kubenswrapper[4732]: I0131 09:03:30.449455 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:03:30 crc kubenswrapper[4732]: E0131 09:03:30.449882 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:03:30.949834527 +0000 UTC m=+149.255710731 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:30 crc kubenswrapper[4732]: I0131 09:03:30.450560 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-99dtb\" (UID: \"4ac602fa-14af-4ae0-a538-d73e938db036\") " pod="openshift-image-registry/image-registry-697d97f7c8-99dtb" Jan 31 09:03:30 crc kubenswrapper[4732]: E0131 09:03:30.453360 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:03:30.953340965 +0000 UTC m=+149.259217169 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-99dtb" (UID: "4ac602fa-14af-4ae0-a538-d73e938db036") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:30 crc kubenswrapper[4732]: I0131 09:03:30.551476 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:03:30 crc kubenswrapper[4732]: E0131 09:03:30.551833 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-31 09:03:31.051801629 +0000 UTC m=+149.357677833 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:30 crc kubenswrapper[4732]: I0131 09:03:30.552031 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-99dtb\" (UID: \"4ac602fa-14af-4ae0-a538-d73e938db036\") " pod="openshift-image-registry/image-registry-697d97f7c8-99dtb" Jan 31 09:03:30 crc kubenswrapper[4732]: E0131 09:03:30.552381 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:03:31.052369498 +0000 UTC m=+149.358245752 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-99dtb" (UID: "4ac602fa-14af-4ae0-a538-d73e938db036") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:30 crc kubenswrapper[4732]: I0131 09:03:30.653012 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:03:30 crc kubenswrapper[4732]: E0131 09:03:30.653186 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:03:31.15315893 +0000 UTC m=+149.459035144 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:30 crc kubenswrapper[4732]: I0131 09:03:30.653392 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-99dtb\" (UID: \"4ac602fa-14af-4ae0-a538-d73e938db036\") " pod="openshift-image-registry/image-registry-697d97f7c8-99dtb" Jan 31 09:03:30 crc kubenswrapper[4732]: E0131 09:03:30.653779 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:03:31.15376843 +0000 UTC m=+149.459644634 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-99dtb" (UID: "4ac602fa-14af-4ae0-a538-d73e938db036") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:30 crc kubenswrapper[4732]: I0131 09:03:30.754227 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:03:30 crc kubenswrapper[4732]: E0131 09:03:30.754586 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:03:31.254545062 +0000 UTC m=+149.560421266 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:30 crc kubenswrapper[4732]: I0131 09:03:30.755925 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-99dtb\" (UID: \"4ac602fa-14af-4ae0-a538-d73e938db036\") " pod="openshift-image-registry/image-registry-697d97f7c8-99dtb" Jan 31 09:03:30 crc kubenswrapper[4732]: E0131 09:03:30.756323 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:03:31.256304991 +0000 UTC m=+149.562181195 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-99dtb" (UID: "4ac602fa-14af-4ae0-a538-d73e938db036") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:30 crc kubenswrapper[4732]: I0131 09:03:30.856949 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:03:30 crc kubenswrapper[4732]: E0131 09:03:30.857257 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:03:31.357218687 +0000 UTC m=+149.663094891 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:30 crc kubenswrapper[4732]: I0131 09:03:30.857615 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-99dtb\" (UID: \"4ac602fa-14af-4ae0-a538-d73e938db036\") " pod="openshift-image-registry/image-registry-697d97f7c8-99dtb" Jan 31 09:03:30 crc kubenswrapper[4732]: E0131 09:03:30.858145 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:03:31.358134098 +0000 UTC m=+149.664010382 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-99dtb" (UID: "4ac602fa-14af-4ae0-a538-d73e938db036") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:30 crc kubenswrapper[4732]: I0131 09:03:30.959107 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:03:30 crc kubenswrapper[4732]: E0131 09:03:30.959501 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:03:31.459473277 +0000 UTC m=+149.765349481 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:31 crc kubenswrapper[4732]: I0131 09:03:31.008086 4732 patch_prober.go:28] interesting pod/router-default-5444994796-h5q9f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 31 09:03:31 crc kubenswrapper[4732]: [-]has-synced failed: reason withheld Jan 31 09:03:31 crc kubenswrapper[4732]: [+]process-running ok Jan 31 09:03:31 crc kubenswrapper[4732]: healthz check failed Jan 31 09:03:31 crc kubenswrapper[4732]: I0131 09:03:31.008141 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h5q9f" podUID="8814e7c8-5104-40f7-9761-4feedc15697b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 31 09:03:31 crc kubenswrapper[4732]: I0131 09:03:31.060833 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-99dtb\" (UID: \"4ac602fa-14af-4ae0-a538-d73e938db036\") " pod="openshift-image-registry/image-registry-697d97f7c8-99dtb" Jan 31 09:03:31 crc kubenswrapper[4732]: E0131 09:03:31.061315 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:03:31.561289614 +0000 UTC m=+149.867165878 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-99dtb" (UID: "4ac602fa-14af-4ae0-a538-d73e938db036") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:31 crc kubenswrapper[4732]: I0131 09:03:31.162500 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:03:31 crc kubenswrapper[4732]: E0131 09:03:31.162709 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:03:31.662678186 +0000 UTC m=+149.968554390 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:31 crc kubenswrapper[4732]: I0131 09:03:31.163113 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-99dtb\" (UID: \"4ac602fa-14af-4ae0-a538-d73e938db036\") " pod="openshift-image-registry/image-registry-697d97f7c8-99dtb" Jan 31 09:03:31 crc kubenswrapper[4732]: E0131 09:03:31.163491 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:03:31.663474263 +0000 UTC m=+149.969350467 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-99dtb" (UID: "4ac602fa-14af-4ae0-a538-d73e938db036") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:31 crc kubenswrapper[4732]: I0131 09:03:31.264492 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:03:31 crc kubenswrapper[4732]: E0131 09:03:31.264973 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:03:31.764940737 +0000 UTC m=+150.070816941 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:31 crc kubenswrapper[4732]: I0131 09:03:31.365707 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-99dtb\" (UID: \"4ac602fa-14af-4ae0-a538-d73e938db036\") " pod="openshift-image-registry/image-registry-697d97f7c8-99dtb" Jan 31 09:03:31 crc kubenswrapper[4732]: E0131 09:03:31.366066 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:03:31.86604972 +0000 UTC m=+150.171925924 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-99dtb" (UID: "4ac602fa-14af-4ae0-a538-d73e938db036") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:31 crc kubenswrapper[4732]: I0131 09:03:31.371443 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-llpv8" event={"ID":"a3302e69-0f73-4974-a8ac-af1992933147","Type":"ContainerStarted","Data":"cc667e0b4ea4ba9099055746ce1ef40fa9168b20ed76dbc45ac033af2550051e"} Jan 31 09:03:31 crc kubenswrapper[4732]: I0131 09:03:31.373321 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-cx8cr" event={"ID":"56a9a21e-28d2-4386-9fe0-947c3a39ab6a","Type":"ContainerStarted","Data":"0506ed446713bba7d913d7d72f36bde9335138ec24ae6b3717bdf39d20acf732"} Jan 31 09:03:31 crc kubenswrapper[4732]: I0131 09:03:31.373357 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-cx8cr" event={"ID":"56a9a21e-28d2-4386-9fe0-947c3a39ab6a","Type":"ContainerStarted","Data":"f00ffe2f6acf44cc4cbf8681079007ae91a44bb594caae37cfb5570d3245b593"} Jan 31 09:03:31 crc kubenswrapper[4732]: I0131 09:03:31.387057 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4qmgq" event={"ID":"e27a7ff4-a3a6-4654-99f7-6ec9f49c7c54","Type":"ContainerStarted","Data":"581a575d235456c51c60b0db035de8da48e1e100b41c01fa2984b0556d283d08"} Jan 31 09:03:31 crc kubenswrapper[4732]: I0131 09:03:31.389158 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bpncg" event={"ID":"ffad13f5-fb20-46ac-b886-c7e5a29b6599","Type":"ContainerStarted","Data":"eb562913e37b3353a62315a1c03b58ee0a0a272fd3c3c7d9ebef244cd90fec26"} Jan 31 09:03:31 crc kubenswrapper[4732]: I0131 09:03:31.396404 4732 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-xprfh" event={"ID":"81b523ca-b564-45d4-bad5-f7e236f2e6d0","Type":"ContainerStarted","Data":"dcf51ba99369929b60b832aeb0040a500a30af59a5aa23dcaf0a87e6cdbdcdfe"} Jan 31 09:03:31 crc kubenswrapper[4732]: I0131 09:03:31.407365 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-wq7bg" event={"ID":"caaa7607-b47d-43ca-adff-f9135baf7262","Type":"ContainerStarted","Data":"6ce469881e2193c514b734acd51a7186ea1f7ba0f20a682ec4474843899665da"} Jan 31 09:03:31 crc kubenswrapper[4732]: I0131 09:03:31.408281 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-wq7bg" Jan 31 09:03:31 crc kubenswrapper[4732]: I0131 09:03:31.412653 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4sk2n" event={"ID":"7c0ada8b-e2dc-418c-a43e-33789285388f","Type":"ContainerStarted","Data":"65fa56cb474e6e8c9ba944f7c61b1f51bc1fb2db8f8aee8f15e97de794c94fe3"} Jan 31 09:03:31 crc kubenswrapper[4732]: I0131 09:03:31.412712 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4sk2n" event={"ID":"7c0ada8b-e2dc-418c-a43e-33789285388f","Type":"ContainerStarted","Data":"c9d1fc71e4bc47fd6bad285937064829789d26f89fdac6334266068e64e95ab7"} Jan 31 09:03:31 crc kubenswrapper[4732]: I0131 09:03:31.416191 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-hn5wx" event={"ID":"1e6408b2-d73d-4fc3-86f7-d29ea59ad32e","Type":"ContainerStarted","Data":"c831ebe45cc7b88a8df1d753121060ac99849b8dbf30699862e6b9585af6a694"} Jan 31 09:03:31 crc kubenswrapper[4732]: I0131 09:03:31.423369 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-llpv8" podStartSLOduration=124.423345018 podStartE2EDuration="2m4.423345018s" podCreationTimestamp="2026-01-31 09:01:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:03:31.402848738 +0000 UTC m=+149.708724972" watchObservedRunningTime="2026-01-31 09:03:31.423345018 +0000 UTC m=+149.729221222" Jan 31 09:03:31 crc kubenswrapper[4732]: I0131 09:03:31.429451 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-kgqfp" event={"ID":"2325b276-ee4d-438d-b9d6-d7de3024ba96","Type":"ContainerStarted","Data":"8092decd876b3c15cc891d582d26dbb6da04516ad5cea486a1dfa3f8d178a80a"} Jan 31 09:03:31 crc kubenswrapper[4732]: I0131 09:03:31.429500 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6fwnh" Jan 31 09:03:31 crc kubenswrapper[4732]: I0131 09:03:31.430594 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-f78bs" Jan 31 09:03:31 crc kubenswrapper[4732]: I0131 09:03:31.438971 4732 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-6fwnh container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.25:5443/healthz\": dial tcp 10.217.0.25:5443: connect: connection refused" start-of-body= Jan 31 09:03:31 crc 
kubenswrapper[4732]: I0131 09:03:31.439080 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6fwnh" podUID="6a680afa-dc56-4bf8-808c-b1c947c8fbf0" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.25:5443/healthz\": dial tcp 10.217.0.25:5443: connect: connection refused" Jan 31 09:03:31 crc kubenswrapper[4732]: I0131 09:03:31.443103 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4qmgq" podStartSLOduration=124.443083712 podStartE2EDuration="2m4.443083712s" podCreationTimestamp="2026-01-31 09:01:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:03:31.420935667 +0000 UTC m=+149.726811881" watchObservedRunningTime="2026-01-31 09:03:31.443083712 +0000 UTC m=+149.748959916" Jan 31 09:03:31 crc kubenswrapper[4732]: I0131 09:03:31.443952 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bpncg" podStartSLOduration=124.443947091 podStartE2EDuration="2m4.443947091s" podCreationTimestamp="2026-01-31 09:01:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:03:31.441631443 +0000 UTC m=+149.747507647" watchObservedRunningTime="2026-01-31 09:03:31.443947091 +0000 UTC m=+149.749823295" Jan 31 09:03:31 crc kubenswrapper[4732]: I0131 09:03:31.467526 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:03:31 crc kubenswrapper[4732]: I0131 09:03:31.467740 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:03:31 crc kubenswrapper[4732]: I0131 09:03:31.467984 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:03:31 crc kubenswrapper[4732]: E0131 09:03:31.468902 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:03:31.96887904 +0000 UTC m=+150.274755244 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:31 crc kubenswrapper[4732]: I0131 09:03:31.476817 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:03:31 crc kubenswrapper[4732]: I0131 09:03:31.483301 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-xprfh" podStartSLOduration=124.483278474 podStartE2EDuration="2m4.483278474s" podCreationTimestamp="2026-01-31 09:01:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:03:31.482089335 +0000 UTC m=+149.787965549" watchObservedRunningTime="2026-01-31 09:03:31.483278474 +0000 UTC m=+149.789154678" Jan 31 09:03:31 crc kubenswrapper[4732]: I0131 09:03:31.484309 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:03:31 crc kubenswrapper[4732]: I0131 09:03:31.503622 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6fwnh" podStartSLOduration=124.503600488 podStartE2EDuration="2m4.503600488s" podCreationTimestamp="2026-01-31 09:01:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:03:31.500195994 +0000 UTC m=+149.806072198" watchObservedRunningTime="2026-01-31 09:03:31.503600488 +0000 UTC m=+149.809476693" Jan 31 09:03:31 crc kubenswrapper[4732]: I0131 09:03:31.519966 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-wq7bg" podStartSLOduration=124.519904567 podStartE2EDuration="2m4.519904567s" podCreationTimestamp="2026-01-31 09:01:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:03:31.519011108 +0000 UTC m=+149.824887312" watchObservedRunningTime="2026-01-31 09:03:31.519904567 +0000 UTC m=+149.825780771" Jan 31 09:03:31 crc kubenswrapper[4732]: I0131 09:03:31.536077 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-f78bs" podStartSLOduration=9.53603972 podStartE2EDuration="9.53603972s" podCreationTimestamp="2026-01-31 09:03:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 
09:03:31.534807519 +0000 UTC m=+149.840683733" watchObservedRunningTime="2026-01-31 09:03:31.53603972 +0000 UTC m=+149.841915924" Jan 31 09:03:31 crc kubenswrapper[4732]: I0131 09:03:31.553924 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29497500-p2gn7" podStartSLOduration=124.553901591 podStartE2EDuration="2m4.553901591s" podCreationTimestamp="2026-01-31 09:01:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:03:31.550761885 +0000 UTC m=+149.856638089" watchObservedRunningTime="2026-01-31 09:03:31.553901591 +0000 UTC m=+149.859777795" Jan 31 09:03:31 crc kubenswrapper[4732]: I0131 09:03:31.570532 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-99dtb\" (UID: \"4ac602fa-14af-4ae0-a538-d73e938db036\") " pod="openshift-image-registry/image-registry-697d97f7c8-99dtb" Jan 31 09:03:31 crc kubenswrapper[4732]: I0131 09:03:31.570940 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:03:31 crc kubenswrapper[4732]: E0131 09:03:31.570979 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:03:32.070957745 +0000 UTC m=+150.376834039 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-99dtb" (UID: "4ac602fa-14af-4ae0-a538-d73e938db036") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:31 crc kubenswrapper[4732]: I0131 09:03:31.571090 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:03:31 crc kubenswrapper[4732]: I0131 09:03:31.579222 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:03:31 crc kubenswrapper[4732]: I0131 09:03:31.573635 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v69mc" podStartSLOduration=124.573616475 podStartE2EDuration="2m4.573616475s" podCreationTimestamp="2026-01-31 09:01:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:03:31.569043791 +0000 UTC m=+149.874919995" watchObservedRunningTime="2026-01-31 09:03:31.573616475 +0000 UTC m=+149.879492689" Jan 31 09:03:31 crc kubenswrapper[4732]: I0131 09:03:31.581491 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:03:31 crc kubenswrapper[4732]: I0131 09:03:31.584705 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4sk2n" podStartSLOduration=124.584681377 podStartE2EDuration="2m4.584681377s" podCreationTimestamp="2026-01-31 09:01:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:03:31.58358842 +0000 UTC m=+149.889464624" watchObservedRunningTime="2026-01-31 09:03:31.584681377 +0000 UTC m=+149.890557581" Jan 31 09:03:31 crc kubenswrapper[4732]: I0131 09:03:31.614496 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-hn5wx" podStartSLOduration=9.614472219 podStartE2EDuration="9.614472219s" podCreationTimestamp="2026-01-31 09:03:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:03:31.610138694 +0000 UTC m=+149.916014908" watchObservedRunningTime="2026-01-31 09:03:31.614472219 +0000 UTC m=+149.920348423" Jan 31 09:03:31 crc 
kubenswrapper[4732]: I0131 09:03:31.626974 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-kgqfp" podStartSLOduration=124.62695322 podStartE2EDuration="2m4.62695322s" podCreationTimestamp="2026-01-31 09:01:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:03:31.625114968 +0000 UTC m=+149.930991182" watchObservedRunningTime="2026-01-31 09:03:31.62695322 +0000 UTC m=+149.932829424" Jan 31 09:03:31 crc kubenswrapper[4732]: I0131 09:03:31.657635 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 09:03:31 crc kubenswrapper[4732]: I0131 09:03:31.666646 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:03:31 crc kubenswrapper[4732]: I0131 09:03:31.672339 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:03:31 crc kubenswrapper[4732]: E0131 09:03:31.672492 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:03:32.172465861 +0000 UTC m=+150.478342075 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:31 crc kubenswrapper[4732]: I0131 09:03:31.672741 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-99dtb\" (UID: \"4ac602fa-14af-4ae0-a538-d73e938db036\") " pod="openshift-image-registry/image-registry-697d97f7c8-99dtb" Jan 31 09:03:31 crc kubenswrapper[4732]: E0131 09:03:31.673124 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:03:32.173112542 +0000 UTC m=+150.478988756 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-99dtb" (UID: "4ac602fa-14af-4ae0-a538-d73e938db036") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:31 crc kubenswrapper[4732]: I0131 09:03:31.674153 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 09:03:31 crc kubenswrapper[4732]: I0131 09:03:31.773633 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:03:31 crc kubenswrapper[4732]: E0131 09:03:31.773879 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:03:32.273841432 +0000 UTC m=+150.579717636 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:31 crc kubenswrapper[4732]: I0131 09:03:31.774074 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-99dtb\" (UID: \"4ac602fa-14af-4ae0-a538-d73e938db036\") " pod="openshift-image-registry/image-registry-697d97f7c8-99dtb" Jan 31 09:03:31 crc kubenswrapper[4732]: E0131 09:03:31.774539 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:03:32.274519065 +0000 UTC m=+150.580395269 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-99dtb" (UID: "4ac602fa-14af-4ae0-a538-d73e938db036") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:31 crc kubenswrapper[4732]: I0131 09:03:31.877301 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:03:31 crc kubenswrapper[4732]: E0131 09:03:31.878089 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:03:32.37806379 +0000 UTC m=+150.683939994 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:31 crc kubenswrapper[4732]: I0131 09:03:31.979822 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-99dtb\" (UID: \"4ac602fa-14af-4ae0-a538-d73e938db036\") " pod="openshift-image-registry/image-registry-697d97f7c8-99dtb" Jan 31 09:03:31 crc kubenswrapper[4732]: E0131 09:03:31.980466 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:03:32.480449706 +0000 UTC m=+150.786325910 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-99dtb" (UID: "4ac602fa-14af-4ae0-a538-d73e938db036") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:32 crc kubenswrapper[4732]: I0131 09:03:32.014309 4732 patch_prober.go:28] interesting pod/router-default-5444994796-h5q9f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 31 09:03:32 crc kubenswrapper[4732]: [-]has-synced failed: reason withheld Jan 31 09:03:32 crc kubenswrapper[4732]: [+]process-running ok Jan 31 09:03:32 crc kubenswrapper[4732]: healthz check failed Jan 31 09:03:32 crc kubenswrapper[4732]: I0131 09:03:32.014380 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h5q9f" podUID="8814e7c8-5104-40f7-9761-4feedc15697b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 31 09:03:32 crc kubenswrapper[4732]: I0131 09:03:32.088540 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:03:32 crc kubenswrapper[4732]: E0131 09:03:32.088945 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:03:32.588920416 +0000 UTC m=+150.894796620 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:32 crc kubenswrapper[4732]: I0131 09:03:32.197074 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-99dtb\" (UID: \"4ac602fa-14af-4ae0-a538-d73e938db036\") " pod="openshift-image-registry/image-registry-697d97f7c8-99dtb" Jan 31 09:03:32 crc kubenswrapper[4732]: E0131 09:03:32.197873 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:03:32.697855371 +0000 UTC m=+151.003731585 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-99dtb" (UID: "4ac602fa-14af-4ae0-a538-d73e938db036") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:32 crc kubenswrapper[4732]: I0131 09:03:32.298334 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:03:32 crc kubenswrapper[4732]: E0131 09:03:32.298686 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:03:32.798638293 +0000 UTC m=+151.104514507 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:32 crc kubenswrapper[4732]: I0131 09:03:32.298779 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-99dtb\" (UID: \"4ac602fa-14af-4ae0-a538-d73e938db036\") " pod="openshift-image-registry/image-registry-697d97f7c8-99dtb" Jan 31 09:03:32 crc kubenswrapper[4732]: E0131 09:03:32.299204 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:03:32.799169441 +0000 UTC m=+151.105045645 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-99dtb" (UID: "4ac602fa-14af-4ae0-a538-d73e938db036") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:32 crc kubenswrapper[4732]: I0131 09:03:32.404220 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:03:32 crc kubenswrapper[4732]: E0131 09:03:32.404583 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:03:32.904561717 +0000 UTC m=+151.210437921 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:32 crc kubenswrapper[4732]: I0131 09:03:32.452795 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"d949875459e61dc3bb5371015bd0663088273aba212961ecb5389c7352f3c11e"} Jan 31 09:03:32 crc kubenswrapper[4732]: I0131 09:03:32.466775 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"c41d31339cf675ad3aff0cfdf91e7b7ab5f11341b830ded1e17edefbb99bb282"} Jan 31 09:03:32 crc kubenswrapper[4732]: I0131 09:03:32.478471 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"8c471013c02a116da2b4102b61c988c12b805b889021a45c3315cdb960278066"} Jan 31 09:03:32 crc kubenswrapper[4732]: I0131 09:03:32.507479 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-99dtb\" (UID: \"4ac602fa-14af-4ae0-a538-d73e938db036\") " pod="openshift-image-registry/image-registry-697d97f7c8-99dtb" Jan 31 09:03:32 crc kubenswrapper[4732]: E0131 09:03:32.507931 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:03:33.007916585 +0000 UTC m=+151.313792789 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-99dtb" (UID: "4ac602fa-14af-4ae0-a538-d73e938db036") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:32 crc kubenswrapper[4732]: I0131 09:03:32.508477 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-cx8cr" podStartSLOduration=125.508457314 podStartE2EDuration="2m5.508457314s" podCreationTimestamp="2026-01-31 09:01:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:03:32.508399182 +0000 UTC m=+150.814275376" watchObservedRunningTime="2026-01-31 09:03:32.508457314 +0000 UTC m=+150.814333518" Jan 31 09:03:32 crc kubenswrapper[4732]: I0131 09:03:32.610641 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:03:32 crc kubenswrapper[4732]: E0131 09:03:32.612645 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:03:33.112615529 +0000 UTC m=+151.418491733 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:32 crc kubenswrapper[4732]: I0131 09:03:32.714489 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-99dtb\" (UID: \"4ac602fa-14af-4ae0-a538-d73e938db036\") " pod="openshift-image-registry/image-registry-697d97f7c8-99dtb" Jan 31 09:03:32 crc kubenswrapper[4732]: E0131 09:03:32.714897 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:03:33.21487584 +0000 UTC m=+151.520752044 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-99dtb" (UID: "4ac602fa-14af-4ae0-a538-d73e938db036") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:32 crc kubenswrapper[4732]: I0131 09:03:32.816247 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:03:32 crc kubenswrapper[4732]: E0131 09:03:32.816834 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:03:33.316812201 +0000 UTC m=+151.622688405 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:32 crc kubenswrapper[4732]: I0131 09:03:32.917913 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-99dtb\" (UID: \"4ac602fa-14af-4ae0-a538-d73e938db036\") " pod="openshift-image-registry/image-registry-697d97f7c8-99dtb" Jan 31 09:03:32 crc kubenswrapper[4732]: E0131 09:03:32.918279 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:03:33.418264365 +0000 UTC m=+151.724140569 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-99dtb" (UID: "4ac602fa-14af-4ae0-a538-d73e938db036") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:33 crc kubenswrapper[4732]: I0131 09:03:33.005088 4732 patch_prober.go:28] interesting pod/router-default-5444994796-h5q9f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 31 09:03:33 crc kubenswrapper[4732]: [-]has-synced failed: reason withheld Jan 31 09:03:33 crc kubenswrapper[4732]: [+]process-running ok Jan 31 09:03:33 crc kubenswrapper[4732]: healthz check failed Jan 31 09:03:33 crc kubenswrapper[4732]: I0131 09:03:33.005162 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h5q9f" podUID="8814e7c8-5104-40f7-9761-4feedc15697b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 31 09:03:33 crc kubenswrapper[4732]: I0131 09:03:33.018867 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:03:33 crc kubenswrapper[4732]: E0131 09:03:33.019089 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:03:33.519056717 +0000 UTC m=+151.824932921 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:33 crc kubenswrapper[4732]: I0131 09:03:33.019249 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-99dtb\" (UID: \"4ac602fa-14af-4ae0-a538-d73e938db036\") " pod="openshift-image-registry/image-registry-697d97f7c8-99dtb" Jan 31 09:03:33 crc kubenswrapper[4732]: E0131 09:03:33.019648 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:03:33.519641386 +0000 UTC m=+151.825517590 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-99dtb" (UID: "4ac602fa-14af-4ae0-a538-d73e938db036") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:33 crc kubenswrapper[4732]: I0131 09:03:33.120484 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:03:33 crc kubenswrapper[4732]: E0131 09:03:33.120701 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:03:33.620655185 +0000 UTC m=+151.926531389 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:33 crc kubenswrapper[4732]: I0131 09:03:33.120761 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-99dtb\" (UID: \"4ac602fa-14af-4ae0-a538-d73e938db036\") " pod="openshift-image-registry/image-registry-697d97f7c8-99dtb" Jan 31 09:03:33 crc kubenswrapper[4732]: E0131 09:03:33.121180 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:03:33.621165392 +0000 UTC m=+151.927041596 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-99dtb" (UID: "4ac602fa-14af-4ae0-a538-d73e938db036") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:33 crc kubenswrapper[4732]: I0131 09:03:33.222106 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:03:33 crc kubenswrapper[4732]: E0131 09:03:33.222448 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:03:33.72242457 +0000 UTC m=+152.028300774 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:33 crc kubenswrapper[4732]: I0131 09:03:33.278191 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-hzs92" Jan 31 09:03:33 crc kubenswrapper[4732]: I0131 09:03:33.323877 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-99dtb\" (UID: \"4ac602fa-14af-4ae0-a538-d73e938db036\") " pod="openshift-image-registry/image-registry-697d97f7c8-99dtb" Jan 31 09:03:33 crc kubenswrapper[4732]: E0131 09:03:33.324409 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:03:33.824389262 +0000 UTC m=+152.130265466 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-99dtb" (UID: "4ac602fa-14af-4ae0-a538-d73e938db036") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:33 crc kubenswrapper[4732]: I0131 09:03:33.425242 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 09:03:33 crc kubenswrapper[4732]: E0131 09:03:33.425848 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:03:33.925809124 +0000 UTC m=+152.231685318 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 09:03:33 crc kubenswrapper[4732]: I0131 09:03:33.478512 4732 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-6fwnh container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.25:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 31 09:03:33 crc kubenswrapper[4732]: I0131 09:03:33.478570 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6fwnh" podUID="6a680afa-dc56-4bf8-808c-b1c947c8fbf0" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.25:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 31 09:03:33 crc kubenswrapper[4732]: I0131 09:03:33.481377 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"d320b4056ba1563d75c6d5c0e04bdd39555e92aeb617f54de39bbdd5d6bc1ef8"} Jan 31 09:03:33 crc kubenswrapper[4732]: I0131 09:03:33.495125 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-c7j22" event={"ID":"89cec875-cd1f-4867-8b4b-ca72c57c974b","Type":"ContainerStarted","Data":"ca100dc4a73e3df4d5093c77bd983ccb01bb25a5eaedf9c9761afdf3262b532b"} Jan 31 09:03:33 crc kubenswrapper[4732]: I0131 09:03:33.497041 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"6206971b4a2adbef5050e4400a61cd57951cc177f5bac49cdaadc5bec2d4686c"} 
Jan 31 09:03:33 crc kubenswrapper[4732]: I0131 09:03:33.497687 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 31 09:03:33 crc kubenswrapper[4732]: I0131 09:03:33.499811 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"42e8dee280c1630c0745f649fd90440f36f7653aa5fb40ffe4868220bf561421"}
Jan 31 09:03:33 crc kubenswrapper[4732]: I0131 09:03:33.529699 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rtg8l"]
Jan 31 09:03:33 crc kubenswrapper[4732]: I0131 09:03:33.531312 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rtg8l"
Jan 31 09:03:33 crc kubenswrapper[4732]: I0131 09:03:33.532830 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-99dtb\" (UID: \"4ac602fa-14af-4ae0-a538-d73e938db036\") " pod="openshift-image-registry/image-registry-697d97f7c8-99dtb"
Jan 31 09:03:33 crc kubenswrapper[4732]: E0131 09:03:33.533314 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:03:34.033296101 +0000 UTC m=+152.339172305 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-99dtb" (UID: "4ac602fa-14af-4ae0-a538-d73e938db036") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 31 09:03:33 crc kubenswrapper[4732]: I0131 09:03:33.537022 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Jan 31 09:03:33 crc kubenswrapper[4732]: I0131 09:03:33.552445 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rtg8l"]
Jan 31 09:03:33 crc kubenswrapper[4732]: I0131 09:03:33.634250 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 31 09:03:33 crc kubenswrapper[4732]: I0131 09:03:33.634649 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-654wm\" (UniqueName: \"kubernetes.io/projected/320c2656-6f30-4922-835e-8c27a82800b1-kube-api-access-654wm\") pod \"community-operators-rtg8l\" (UID: \"320c2656-6f30-4922-835e-8c27a82800b1\") " pod="openshift-marketplace/community-operators-rtg8l"
Jan 31 09:03:33 crc kubenswrapper[4732]: I0131 09:03:33.634711 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/320c2656-6f30-4922-835e-8c27a82800b1-catalog-content\") pod \"community-operators-rtg8l\" (UID: \"320c2656-6f30-4922-835e-8c27a82800b1\") " pod="openshift-marketplace/community-operators-rtg8l"
Jan 31 09:03:33 crc kubenswrapper[4732]: I0131 09:03:33.634752 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/320c2656-6f30-4922-835e-8c27a82800b1-utilities\") pod \"community-operators-rtg8l\" (UID: \"320c2656-6f30-4922-835e-8c27a82800b1\") " pod="openshift-marketplace/community-operators-rtg8l"
Jan 31 09:03:33 crc kubenswrapper[4732]: E0131 09:03:33.635714 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:03:34.135689537 +0000 UTC m=+152.441565741 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 31 09:03:33 crc kubenswrapper[4732]: I0131 09:03:33.705783 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gb54f"]
Jan 31 09:03:33 crc kubenswrapper[4732]: I0131 09:03:33.707046 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gb54f"
Jan 31 09:03:33 crc kubenswrapper[4732]: I0131 09:03:33.711327 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Jan 31 09:03:33 crc kubenswrapper[4732]: I0131 09:03:33.719167 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gb54f"]
Jan 31 09:03:33 crc kubenswrapper[4732]: I0131 09:03:33.736430 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-99dtb\" (UID: \"4ac602fa-14af-4ae0-a538-d73e938db036\") " pod="openshift-image-registry/image-registry-697d97f7c8-99dtb"
Jan 31 09:03:33 crc kubenswrapper[4732]: I0131 09:03:33.736508 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-654wm\" (UniqueName: \"kubernetes.io/projected/320c2656-6f30-4922-835e-8c27a82800b1-kube-api-access-654wm\") pod \"community-operators-rtg8l\" (UID: \"320c2656-6f30-4922-835e-8c27a82800b1\") " pod="openshift-marketplace/community-operators-rtg8l"
Jan 31 09:03:33 crc kubenswrapper[4732]: I0131 09:03:33.736542 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/320c2656-6f30-4922-835e-8c27a82800b1-catalog-content\") pod \"community-operators-rtg8l\" (UID: \"320c2656-6f30-4922-835e-8c27a82800b1\") " pod="openshift-marketplace/community-operators-rtg8l"
Jan 31 09:03:33 crc kubenswrapper[4732]: I0131 09:03:33.736576 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/320c2656-6f30-4922-835e-8c27a82800b1-utilities\") pod \"community-operators-rtg8l\" (UID: \"320c2656-6f30-4922-835e-8c27a82800b1\") " pod="openshift-marketplace/community-operators-rtg8l"
Jan 31 09:03:33 crc kubenswrapper[4732]: I0131 09:03:33.737086 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/320c2656-6f30-4922-835e-8c27a82800b1-utilities\") pod \"community-operators-rtg8l\" (UID: \"320c2656-6f30-4922-835e-8c27a82800b1\") " pod="openshift-marketplace/community-operators-rtg8l"
Jan 31 09:03:33 crc kubenswrapper[4732]: E0131 09:03:33.737420 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:03:34.23740426 +0000 UTC m=+152.543280464 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-99dtb" (UID: "4ac602fa-14af-4ae0-a538-d73e938db036") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 31 09:03:33 crc kubenswrapper[4732]: I0131 09:03:33.738188 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/320c2656-6f30-4922-835e-8c27a82800b1-catalog-content\") pod \"community-operators-rtg8l\" (UID: \"320c2656-6f30-4922-835e-8c27a82800b1\") " pod="openshift-marketplace/community-operators-rtg8l"
Jan 31 09:03:33 crc kubenswrapper[4732]: I0131 09:03:33.777750 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-654wm\" (UniqueName: \"kubernetes.io/projected/320c2656-6f30-4922-835e-8c27a82800b1-kube-api-access-654wm\") pod \"community-operators-rtg8l\" (UID: \"320c2656-6f30-4922-835e-8c27a82800b1\") " pod="openshift-marketplace/community-operators-rtg8l"
Jan 31 09:03:33 crc kubenswrapper[4732]: I0131 09:03:33.837565 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 31 09:03:33 crc kubenswrapper[4732]: E0131 09:03:33.837848 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:03:34.337799098 +0000 UTC m=+152.643675302 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 31 09:03:33 crc kubenswrapper[4732]: I0131 09:03:33.838108 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27rqc\" (UniqueName: \"kubernetes.io/projected/111ca852-fddd-4fb1-8d5d-331fd5921a71-kube-api-access-27rqc\") pod \"certified-operators-gb54f\" (UID: \"111ca852-fddd-4fb1-8d5d-331fd5921a71\") " pod="openshift-marketplace/certified-operators-gb54f"
Jan 31 09:03:33 crc kubenswrapper[4732]: I0131 09:03:33.838201 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/111ca852-fddd-4fb1-8d5d-331fd5921a71-utilities\") pod \"certified-operators-gb54f\" (UID: \"111ca852-fddd-4fb1-8d5d-331fd5921a71\") " pod="openshift-marketplace/certified-operators-gb54f"
Jan 31 09:03:33 crc kubenswrapper[4732]: I0131 09:03:33.838278 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-99dtb\" (UID: \"4ac602fa-14af-4ae0-a538-d73e938db036\") " pod="openshift-image-registry/image-registry-697d97f7c8-99dtb"
Jan 31 09:03:33 crc kubenswrapper[4732]: I0131 09:03:33.838310 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/111ca852-fddd-4fb1-8d5d-331fd5921a71-catalog-content\") pod \"certified-operators-gb54f\" (UID: \"111ca852-fddd-4fb1-8d5d-331fd5921a71\") " pod="openshift-marketplace/certified-operators-gb54f"
Jan 31 09:03:33 crc kubenswrapper[4732]: E0131 09:03:33.838647 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:03:34.338639867 +0000 UTC m=+152.644516071 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-99dtb" (UID: "4ac602fa-14af-4ae0-a538-d73e938db036") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 31 09:03:33 crc kubenswrapper[4732]: I0131 09:03:33.870923 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rtg8l"
Jan 31 09:03:33 crc kubenswrapper[4732]: I0131 09:03:33.920974 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zflvq"]
Jan 31 09:03:33 crc kubenswrapper[4732]: I0131 09:03:33.921964 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zflvq"
Jan 31 09:03:33 crc kubenswrapper[4732]: I0131 09:03:33.933034 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zflvq"]
Jan 31 09:03:33 crc kubenswrapper[4732]: I0131 09:03:33.938761 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 31 09:03:33 crc kubenswrapper[4732]: I0131 09:03:33.939036 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/111ca852-fddd-4fb1-8d5d-331fd5921a71-utilities\") pod \"certified-operators-gb54f\" (UID: \"111ca852-fddd-4fb1-8d5d-331fd5921a71\") " pod="openshift-marketplace/certified-operators-gb54f"
Jan 31 09:03:33 crc kubenswrapper[4732]: I0131 09:03:33.939104 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/111ca852-fddd-4fb1-8d5d-331fd5921a71-catalog-content\") pod \"certified-operators-gb54f\" (UID: \"111ca852-fddd-4fb1-8d5d-331fd5921a71\") " pod="openshift-marketplace/certified-operators-gb54f"
Jan 31 09:03:33 crc kubenswrapper[4732]: I0131 09:03:33.939144 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27rqc\" (UniqueName: \"kubernetes.io/projected/111ca852-fddd-4fb1-8d5d-331fd5921a71-kube-api-access-27rqc\") pod \"certified-operators-gb54f\" (UID: \"111ca852-fddd-4fb1-8d5d-331fd5921a71\") " pod="openshift-marketplace/certified-operators-gb54f"
Jan 31 09:03:33 crc kubenswrapper[4732]: E0131 09:03:33.939555 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:03:34.439535442 +0000 UTC m=+152.745411646 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 31 09:03:33 crc kubenswrapper[4732]: I0131 09:03:33.940208 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/111ca852-fddd-4fb1-8d5d-331fd5921a71-utilities\") pod \"certified-operators-gb54f\" (UID: \"111ca852-fddd-4fb1-8d5d-331fd5921a71\") " pod="openshift-marketplace/certified-operators-gb54f"
Jan 31 09:03:33 crc kubenswrapper[4732]: I0131 09:03:33.940520 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/111ca852-fddd-4fb1-8d5d-331fd5921a71-catalog-content\") pod \"certified-operators-gb54f\" (UID: \"111ca852-fddd-4fb1-8d5d-331fd5921a71\") " pod="openshift-marketplace/certified-operators-gb54f"
Jan 31 09:03:33 crc kubenswrapper[4732]: I0131 09:03:33.958429 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27rqc\" (UniqueName: \"kubernetes.io/projected/111ca852-fddd-4fb1-8d5d-331fd5921a71-kube-api-access-27rqc\") pod \"certified-operators-gb54f\" (UID: \"111ca852-fddd-4fb1-8d5d-331fd5921a71\") " pod="openshift-marketplace/certified-operators-gb54f"
Jan 31 09:03:33 crc kubenswrapper[4732]: I0131 09:03:33.990504 4732 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock"
Jan 31 09:03:34 crc kubenswrapper[4732]: I0131 09:03:34.001163 4732 patch_prober.go:28] interesting pod/router-default-5444994796-h5q9f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Jan 31 09:03:34 crc kubenswrapper[4732]: [-]has-synced failed: reason withheld
Jan 31 09:03:34 crc kubenswrapper[4732]: [+]process-running ok
Jan 31 09:03:34 crc kubenswrapper[4732]: healthz check failed
Jan 31 09:03:34 crc kubenswrapper[4732]: I0131 09:03:34.001249 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h5q9f" podUID="8814e7c8-5104-40f7-9761-4feedc15697b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Jan 31 09:03:34 crc kubenswrapper[4732]: I0131 09:03:34.022574 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gb54f"
Jan 31 09:03:34 crc kubenswrapper[4732]: I0131 09:03:34.040562 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/317b5076-0f62-45e5-9db0-8d03103c990e-catalog-content\") pod \"community-operators-zflvq\" (UID: \"317b5076-0f62-45e5-9db0-8d03103c990e\") " pod="openshift-marketplace/community-operators-zflvq"
Jan 31 09:03:34 crc kubenswrapper[4732]: I0131 09:03:34.040629 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p95zf\" (UniqueName: \"kubernetes.io/projected/317b5076-0f62-45e5-9db0-8d03103c990e-kube-api-access-p95zf\") pod \"community-operators-zflvq\" (UID: \"317b5076-0f62-45e5-9db0-8d03103c990e\") " pod="openshift-marketplace/community-operators-zflvq"
Jan 31 09:03:34 crc kubenswrapper[4732]: I0131 09:03:34.040657 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-99dtb\" (UID: \"4ac602fa-14af-4ae0-a538-d73e938db036\") " pod="openshift-image-registry/image-registry-697d97f7c8-99dtb"
Jan 31 09:03:34 crc kubenswrapper[4732]: I0131 09:03:34.040702 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/317b5076-0f62-45e5-9db0-8d03103c990e-utilities\") pod \"community-operators-zflvq\" (UID: \"317b5076-0f62-45e5-9db0-8d03103c990e\") " pod="openshift-marketplace/community-operators-zflvq"
Jan 31 09:03:34 crc kubenswrapper[4732]: E0131 09:03:34.041048 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:03:34.541032988 +0000 UTC m=+152.846909192 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-99dtb" (UID: "4ac602fa-14af-4ae0-a538-d73e938db036") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 31 09:03:34 crc kubenswrapper[4732]: I0131 09:03:34.108164 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6z6vm"]
Jan 31 09:03:34 crc kubenswrapper[4732]: I0131 09:03:34.110044 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6z6vm"
Jan 31 09:03:34 crc kubenswrapper[4732]: I0131 09:03:34.127281 4732 patch_prober.go:28] interesting pod/downloads-7954f5f757-rt2jr container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.28:8080/\": dial tcp 10.217.0.28:8080: connect: connection refused" start-of-body=
Jan 31 09:03:34 crc kubenswrapper[4732]: I0131 09:03:34.127344 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-rt2jr" podUID="81e1781e-a935-4f3f-b2aa-9a0807f43c73" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.28:8080/\": dial tcp 10.217.0.28:8080: connect: connection refused"
Jan 31 09:03:34 crc kubenswrapper[4732]: I0131 09:03:34.127536 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6z6vm"]
Jan 31 09:03:34 crc kubenswrapper[4732]: I0131 09:03:34.128210 4732 patch_prober.go:28] interesting pod/downloads-7954f5f757-rt2jr container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.28:8080/\": dial tcp 10.217.0.28:8080: connect: connection refused" start-of-body=
Jan 31 09:03:34 crc kubenswrapper[4732]: I0131 09:03:34.128233 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-rt2jr" podUID="81e1781e-a935-4f3f-b2aa-9a0807f43c73" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.28:8080/\": dial tcp 10.217.0.28:8080: connect: connection refused"
Jan 31 09:03:34 crc kubenswrapper[4732]: I0131 09:03:34.152120 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 31 09:03:34 crc kubenswrapper[4732]: I0131 09:03:34.152301 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p95zf\" (UniqueName: \"kubernetes.io/projected/317b5076-0f62-45e5-9db0-8d03103c990e-kube-api-access-p95zf\") pod \"community-operators-zflvq\" (UID: \"317b5076-0f62-45e5-9db0-8d03103c990e\") " pod="openshift-marketplace/community-operators-zflvq"
Jan 31 09:03:34 crc kubenswrapper[4732]: I0131 09:03:34.152368 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/317b5076-0f62-45e5-9db0-8d03103c990e-utilities\") pod \"community-operators-zflvq\" (UID: \"317b5076-0f62-45e5-9db0-8d03103c990e\") " pod="openshift-marketplace/community-operators-zflvq"
Jan 31 09:03:34 crc kubenswrapper[4732]: I0131 09:03:34.152459 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/317b5076-0f62-45e5-9db0-8d03103c990e-catalog-content\") pod \"community-operators-zflvq\" (UID: \"317b5076-0f62-45e5-9db0-8d03103c990e\") " pod="openshift-marketplace/community-operators-zflvq"
Jan 31 09:03:34 crc kubenswrapper[4732]: I0131 09:03:34.152954 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/317b5076-0f62-45e5-9db0-8d03103c990e-catalog-content\") pod \"community-operators-zflvq\" (UID: \"317b5076-0f62-45e5-9db0-8d03103c990e\") " pod="openshift-marketplace/community-operators-zflvq"
Jan 31 09:03:34 crc kubenswrapper[4732]: E0131 09:03:34.153042 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:03:34.653019427 +0000 UTC m=+152.958895621 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 31 09:03:34 crc kubenswrapper[4732]: I0131 09:03:34.153558 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/317b5076-0f62-45e5-9db0-8d03103c990e-utilities\") pod \"community-operators-zflvq\" (UID: \"317b5076-0f62-45e5-9db0-8d03103c990e\") " pod="openshift-marketplace/community-operators-zflvq"
Jan 31 09:03:34 crc kubenswrapper[4732]: I0131 09:03:34.178774 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p95zf\" (UniqueName: \"kubernetes.io/projected/317b5076-0f62-45e5-9db0-8d03103c990e-kube-api-access-p95zf\") pod \"community-operators-zflvq\" (UID: \"317b5076-0f62-45e5-9db0-8d03103c990e\") " pod="openshift-marketplace/community-operators-zflvq"
Jan 31 09:03:34 crc kubenswrapper[4732]: I0131 09:03:34.244481 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zflvq"
Jan 31 09:03:34 crc kubenswrapper[4732]: I0131 09:03:34.273133 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2f4q\" (UniqueName: \"kubernetes.io/projected/7006b68f-caf9-44a9-a6df-26e7b594b931-kube-api-access-m2f4q\") pod \"certified-operators-6z6vm\" (UID: \"7006b68f-caf9-44a9-a6df-26e7b594b931\") " pod="openshift-marketplace/certified-operators-6z6vm"
Jan 31 09:03:34 crc kubenswrapper[4732]: I0131 09:03:34.273190 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7006b68f-caf9-44a9-a6df-26e7b594b931-catalog-content\") pod \"certified-operators-6z6vm\" (UID: \"7006b68f-caf9-44a9-a6df-26e7b594b931\") " pod="openshift-marketplace/certified-operators-6z6vm"
Jan 31 09:03:34 crc kubenswrapper[4732]: I0131 09:03:34.273234 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-99dtb\" (UID: \"4ac602fa-14af-4ae0-a538-d73e938db036\") " pod="openshift-image-registry/image-registry-697d97f7c8-99dtb"
Jan 31 09:03:34 crc kubenswrapper[4732]: I0131 09:03:34.273312 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7006b68f-caf9-44a9-a6df-26e7b594b931-utilities\") pod \"certified-operators-6z6vm\" (UID: \"7006b68f-caf9-44a9-a6df-26e7b594b931\") " pod="openshift-marketplace/certified-operators-6z6vm"
Jan 31 09:03:34 crc kubenswrapper[4732]: E0131 09:03:34.274008 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:03:34.773992277 +0000 UTC m=+153.079868481 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-99dtb" (UID: "4ac602fa-14af-4ae0-a538-d73e938db036") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 31 09:03:34 crc kubenswrapper[4732]: I0131 09:03:34.293936 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rtg8l"]
Jan 31 09:03:34 crc kubenswrapper[4732]: I0131 09:03:34.374434 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 31 09:03:34 crc kubenswrapper[4732]: I0131 09:03:34.374935 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-xprfh"
Jan 31 09:03:34 crc kubenswrapper[4732]: E0131 09:03:34.374965 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:03:34.874926554 +0000 UTC m=+153.180802758 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 31 09:03:34 crc kubenswrapper[4732]: I0131 09:03:34.375160 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-99dtb\" (UID: \"4ac602fa-14af-4ae0-a538-d73e938db036\") " pod="openshift-image-registry/image-registry-697d97f7c8-99dtb"
Jan 31 09:03:34 crc kubenswrapper[4732]: I0131 09:03:34.375222 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7006b68f-caf9-44a9-a6df-26e7b594b931-utilities\") pod \"certified-operators-6z6vm\" (UID: \"7006b68f-caf9-44a9-a6df-26e7b594b931\") " pod="openshift-marketplace/certified-operators-6z6vm"
Jan 31 09:03:34 crc kubenswrapper[4732]: I0131 09:03:34.375424 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2f4q\" (UniqueName: \"kubernetes.io/projected/7006b68f-caf9-44a9-a6df-26e7b594b931-kube-api-access-m2f4q\") pod \"certified-operators-6z6vm\" (UID: \"7006b68f-caf9-44a9-a6df-26e7b594b931\") " pod="openshift-marketplace/certified-operators-6z6vm"
Jan 31 09:03:34 crc kubenswrapper[4732]: I0131 09:03:34.375443 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-xprfh"
Jan 31 09:03:34 crc kubenswrapper[4732]: I0131 09:03:34.375467 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7006b68f-caf9-44a9-a6df-26e7b594b931-catalog-content\") pod \"certified-operators-6z6vm\" (UID: \"7006b68f-caf9-44a9-a6df-26e7b594b931\") " pod="openshift-marketplace/certified-operators-6z6vm"
Jan 31 09:03:34 crc kubenswrapper[4732]: E0131 09:03:34.376360 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:03:34.876342602 +0000 UTC m=+153.182218886 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-99dtb" (UID: "4ac602fa-14af-4ae0-a538-d73e938db036") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 31 09:03:34 crc kubenswrapper[4732]: I0131 09:03:34.376396 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7006b68f-caf9-44a9-a6df-26e7b594b931-utilities\") pod \"certified-operators-6z6vm\" (UID: \"7006b68f-caf9-44a9-a6df-26e7b594b931\") " pod="openshift-marketplace/certified-operators-6z6vm"
Jan 31 09:03:34 crc kubenswrapper[4732]: I0131 09:03:34.376555 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7006b68f-caf9-44a9-a6df-26e7b594b931-catalog-content\") pod \"certified-operators-6z6vm\" (UID: \"7006b68f-caf9-44a9-a6df-26e7b594b931\") " pod="openshift-marketplace/certified-operators-6z6vm"
Jan 31 09:03:34 crc kubenswrapper[4732]: I0131 09:03:34.397555 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-76d6v"
Jan 31 09:03:34 crc kubenswrapper[4732]: I0131 09:03:34.399585 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2f4q\" (UniqueName: \"kubernetes.io/projected/7006b68f-caf9-44a9-a6df-26e7b594b931-kube-api-access-m2f4q\") pod \"certified-operators-6z6vm\" (UID: \"7006b68f-caf9-44a9-a6df-26e7b594b931\") " pod="openshift-marketplace/certified-operators-6z6vm"
Jan 31 09:03:34 crc kubenswrapper[4732]: I0131 09:03:34.457990 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6z6vm"
Jan 31 09:03:34 crc kubenswrapper[4732]: I0131 09:03:34.465855 4732 patch_prober.go:28] interesting pod/apiserver-76f77b778f-xprfh container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Jan 31 09:03:34 crc kubenswrapper[4732]: [+]log ok
Jan 31 09:03:34 crc kubenswrapper[4732]: [+]etcd ok
Jan 31 09:03:34 crc kubenswrapper[4732]: [+]poststarthook/start-apiserver-admission-initializer ok
Jan 31 09:03:34 crc kubenswrapper[4732]: [+]poststarthook/generic-apiserver-start-informers ok
Jan 31 09:03:34 crc kubenswrapper[4732]: [+]poststarthook/max-in-flight-filter ok
Jan 31 09:03:34 crc kubenswrapper[4732]: [+]poststarthook/storage-object-count-tracker-hook ok
Jan 31 09:03:34 crc kubenswrapper[4732]: [+]poststarthook/image.openshift.io-apiserver-caches ok
Jan 31 09:03:34 crc kubenswrapper[4732]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld
Jan 31 09:03:34 crc kubenswrapper[4732]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld
Jan 31 09:03:34 crc kubenswrapper[4732]: [+]poststarthook/project.openshift.io-projectcache ok
Jan 31 09:03:34 crc kubenswrapper[4732]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok
Jan 31 09:03:34 crc kubenswrapper[4732]: [+]poststarthook/openshift.io-startinformers ok
Jan 31 09:03:34 crc kubenswrapper[4732]: [+]poststarthook/openshift.io-restmapperupdater ok
Jan 31 09:03:34 crc kubenswrapper[4732]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok
Jan 31 09:03:34 crc kubenswrapper[4732]: livez check failed
Jan 31 09:03:34 crc kubenswrapper[4732]: I0131 09:03:34.465932 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-xprfh" podUID="81b523ca-b564-45d4-bad5-f7e236f2e6d0" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Jan 31 09:03:34 crc kubenswrapper[4732]: I0131 09:03:34.476849 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 31 09:03:34 crc kubenswrapper[4732]: E0131 09:03:34.478178 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:03:34.978155388 +0000 UTC m=+153.284031602 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 31 09:03:34 crc kubenswrapper[4732]: I0131 09:03:34.513964 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rtg8l" event={"ID":"320c2656-6f30-4922-835e-8c27a82800b1","Type":"ContainerStarted","Data":"9a69e31450d3f2f57f031d90e49b124d53417614f5829c208f10b5b95ffda670"}
Jan 31 09:03:34 crc kubenswrapper[4732]: I0131 09:03:34.514036 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rtg8l" event={"ID":"320c2656-6f30-4922-835e-8c27a82800b1","Type":"ContainerStarted","Data":"7ba9bda66cde21334a1fb904223442bb2f107c035dd197b8f6160f7ac322e79d"}
Jan 31 09:03:34 crc kubenswrapper[4732]: I0131 09:03:34.518673 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85mvk"
Jan 31 09:03:34 crc kubenswrapper[4732]: I0131 09:03:34.518718 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85mvk"
Jan 31 09:03:34 crc kubenswrapper[4732]: I0131 09:03:34.522599 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-c7j22" event={"ID":"89cec875-cd1f-4867-8b4b-ca72c57c974b","Type":"ContainerStarted","Data":"352c6c95571c5d3cf1574f5d988513fb0ff48b49c2005e76ade9b33d9b508cf0"}
Jan 31 09:03:34 crc kubenswrapper[4732]: I0131 09:03:34.522641 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-c7j22" event={"ID":"89cec875-cd1f-4867-8b4b-ca72c57c974b","Type":"ContainerStarted","Data":"a89fca39fecdd4183eef1ef80c7dc34f7605dbd4118ccd4f532fe86d73f32a03"}
Jan 31 09:03:34 crc kubenswrapper[4732]: I0131 09:03:34.522656 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-c7j22" event={"ID":"89cec875-cd1f-4867-8b4b-ca72c57c974b","Type":"ContainerStarted","Data":"afd80b2706ba2e95e643c506800c8a0b7a4cc44f331f16f3cc71423f03107efc"}
Jan 31 09:03:34 crc kubenswrapper[4732]: I0131 09:03:34.556799 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85mvk"
Jan 31 09:03:34 crc kubenswrapper[4732]: I0131 09:03:34.588274 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-99dtb\" (UID: \"4ac602fa-14af-4ae0-a538-d73e938db036\") " pod="openshift-image-registry/image-registry-697d97f7c8-99dtb"
Jan 31 09:03:34 crc kubenswrapper[4732]: E0131 09:03:34.588583 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:03:35.088562533 +0000 UTC m=+153.394438737 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-99dtb" (UID: "4ac602fa-14af-4ae0-a538-d73e938db036") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 31 09:03:34 crc kubenswrapper[4732]: I0131 09:03:34.600585 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zflvq"]
Jan 31 09:03:34 crc kubenswrapper[4732]: I0131 09:03:34.633929 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gb54f"]
Jan 31 09:03:34 crc kubenswrapper[4732]: W0131 09:03:34.660445 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod317b5076_0f62_45e5_9db0_8d03103c990e.slice/crio-e3e3c2c80d6bcc6eb2c4ab51462ea9a6f6a150eb54a71c100214dd3061f85429 WatchSource:0}: Error finding container e3e3c2c80d6bcc6eb2c4ab51462ea9a6f6a150eb54a71c100214dd3061f85429: Status 404 returned error can't find the container with id e3e3c2c80d6bcc6eb2c4ab51462ea9a6f6a150eb54a71c100214dd3061f85429
Jan 31 09:03:34 crc kubenswrapper[4732]: I0131 09:03:34.690573 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 31 09:03:34 crc kubenswrapper[4732]: E0131 09:03:34.690841 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 09:03:35.190803314 +0000 UTC m=+153.496679518 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 31 09:03:34 crc kubenswrapper[4732]: I0131 09:03:34.691179 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-99dtb\" (UID: \"4ac602fa-14af-4ae0-a538-d73e938db036\") " pod="openshift-image-registry/image-registry-697d97f7c8-99dtb"
Jan 31 09:03:34 crc kubenswrapper[4732]: E0131 09:03:34.691535 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 09:03:35.191520928 +0000 UTC m=+153.497397132 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-99dtb" (UID: "4ac602fa-14af-4ae0-a538-d73e938db036") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 31 09:03:34 crc kubenswrapper[4732]: I0131 09:03:34.739509 4732 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-01-31T09:03:33.990545659Z","Handler":null,"Name":""}
Jan 31 09:03:34 crc kubenswrapper[4732]: I0131 09:03:34.748538 4732 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0
Jan 31 09:03:34 crc kubenswrapper[4732]: I0131 09:03:34.748578 4732 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock
Jan 31 09:03:34 crc kubenswrapper[4732]: I0131 09:03:34.794371 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 31 09:03:34 crc kubenswrapper[4732]: I0131 09:03:34.799545 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Jan 31 09:03:34 crc kubenswrapper[4732]: I0131 09:03:34.895844 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-99dtb\" (UID: \"4ac602fa-14af-4ae0-a538-d73e938db036\") " pod="openshift-image-registry/image-registry-697d97f7c8-99dtb"
Jan 31 09:03:34 crc kubenswrapper[4732]: I0131 09:03:34.899978 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6z6vm"]
Jan 31 09:03:34 crc kubenswrapper[4732]: I0131 09:03:34.905947 4732 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Jan 31 09:03:34 crc kubenswrapper[4732]: I0131 09:03:34.906006 4732 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-99dtb\" (UID: \"4ac602fa-14af-4ae0-a538-d73e938db036\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-99dtb"
Jan 31 09:03:34 crc kubenswrapper[4732]: I0131 09:03:34.930639 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-99dtb\" (UID: \"4ac602fa-14af-4ae0-a538-d73e938db036\") " pod="openshift-image-registry/image-registry-697d97f7c8-99dtb"
Jan 31 09:03:34 crc kubenswrapper[4732]: W0131 09:03:34.946623 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7006b68f_caf9_44a9_a6df_26e7b594b931.slice/crio-57ca1e232ee4486e6e4d54b1df45d6e963d8c6e1b497c54651b39471c8475a9c WatchSource:0}: Error finding container 57ca1e232ee4486e6e4d54b1df45d6e963d8c6e1b497c54651b39471c8475a9c: Status 404 returned error can't find the container with id 57ca1e232ee4486e6e4d54b1df45d6e963d8c6e1b497c54651b39471c8475a9c
Jan 31 09:03:34 crc kubenswrapper[4732]: I0131 09:03:34.951320 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-99dtb"
Jan 31 09:03:35 crc kubenswrapper[4732]: I0131 09:03:35.002892 4732 patch_prober.go:28] interesting pod/router-default-5444994796-h5q9f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Jan 31 09:03:35 crc kubenswrapper[4732]: [-]has-synced failed: reason withheld
Jan 31 09:03:35 crc kubenswrapper[4732]: [+]process-running ok
Jan 31 09:03:35 crc kubenswrapper[4732]: healthz check failed
Jan 31 09:03:35 crc kubenswrapper[4732]: I0131 09:03:35.002984 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h5q9f" podUID="8814e7c8-5104-40f7-9761-4feedc15697b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Jan 31 09:03:35 crc kubenswrapper[4732]: I0131 09:03:35.227065 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-99dtb"]
Jan 31 09:03:35 crc kubenswrapper[4732]: W0131 09:03:35.261771 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ac602fa_14af_4ae0_a538_d73e938db036.slice/crio-039e3167eddd030475a90d04176c40d3799eaa481a42473d70722bc67a78215e WatchSource:0}: Error finding container 039e3167eddd030475a90d04176c40d3799eaa481a42473d70722bc67a78215e: Status 404 returned error can't find the container with id 039e3167eddd030475a90d04176c40d3799eaa481a42473d70722bc67a78215e
Jan 31 09:03:35 crc kubenswrapper[4732]: I0131 09:03:35.285050 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Jan 31 09:03:35 crc kubenswrapper[4732]: I0131 09:03:35.285824 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Jan 31 09:03:35 crc kubenswrapper[4732]: I0131 09:03:35.287926 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n"
Jan 31 09:03:35 crc kubenswrapper[4732]: I0131 09:03:35.288451 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt"
Jan 31 09:03:35 crc kubenswrapper[4732]: I0131 09:03:35.298695 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Jan 31 09:03:35 crc kubenswrapper[4732]: I0131 09:03:35.345138 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-8t8ks"
Jan 31 09:03:35 crc kubenswrapper[4732]: I0131 09:03:35.346302 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-8t8ks"
Jan 31 09:03:35 crc kubenswrapper[4732]: I0131 09:03:35.347589 4732 patch_prober.go:28] interesting pod/console-f9d7485db-8t8ks container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.17:8443/health\": dial tcp 10.217.0.17:8443: connect: connection refused" start-of-body=
Jan 31 09:03:35 crc kubenswrapper[4732]: I0131 09:03:35.347631 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-8t8ks" podUID="b35d0df8-53f0-4787-b0b4-c93be28f0127" containerName="console" probeResult="failure" output="Get \"https://10.217.0.17:8443/health\": dial tcp 10.217.0.17:8443: connect: connection refused"
Jan 31 09:03:35 crc kubenswrapper[4732]: I0131 09:03:35.403890 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8f6e5b3b-035c-42f1-a6e9-cc4614504712-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"8f6e5b3b-035c-42f1-a6e9-cc4614504712\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Jan 31 09:03:35 crc kubenswrapper[4732]: I0131 09:03:35.403944 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8f6e5b3b-035c-42f1-a6e9-cc4614504712-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"8f6e5b3b-035c-42f1-a6e9-cc4614504712\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Jan 31 09:03:35 crc kubenswrapper[4732]: I0131 09:03:35.505644 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8f6e5b3b-035c-42f1-a6e9-cc4614504712-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"8f6e5b3b-035c-42f1-a6e9-cc4614504712\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Jan 31 09:03:35 crc kubenswrapper[4732]: I0131 09:03:35.505724 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8f6e5b3b-035c-42f1-a6e9-cc4614504712-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"8f6e5b3b-035c-42f1-a6e9-cc4614504712\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Jan 31 09:03:35 crc kubenswrapper[4732]: I0131 09:03:35.505850 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8f6e5b3b-035c-42f1-a6e9-cc4614504712-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"8f6e5b3b-035c-42f1-a6e9-cc4614504712\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Jan 31 09:03:35 crc kubenswrapper[4732]: I0131 09:03:35.506824 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-d7ngt"]
Jan 31 09:03:35 crc kubenswrapper[4732]: I0131 09:03:35.508095 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d7ngt"
Jan 31 09:03:35 crc kubenswrapper[4732]: I0131 09:03:35.510064 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Jan 31 09:03:35 crc kubenswrapper[4732]: I0131 09:03:35.529263 4732 generic.go:334] "Generic (PLEG): container finished" podID="7006b68f-caf9-44a9-a6df-26e7b594b931" containerID="b38eca76de0c7fcb760be9dbb97b202ed5f6069cdb395b15ae3017074d198d5e" exitCode=0
Jan 31 09:03:35 crc kubenswrapper[4732]: I0131 09:03:35.529391 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6z6vm" event={"ID":"7006b68f-caf9-44a9-a6df-26e7b594b931","Type":"ContainerDied","Data":"b38eca76de0c7fcb760be9dbb97b202ed5f6069cdb395b15ae3017074d198d5e"}
Jan 31 09:03:35 crc kubenswrapper[4732]: I0131 09:03:35.529431 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6z6vm" event={"ID":"7006b68f-caf9-44a9-a6df-26e7b594b931","Type":"ContainerStarted","Data":"57ca1e232ee4486e6e4d54b1df45d6e963d8c6e1b497c54651b39471c8475a9c"}
Jan 31 09:03:35 crc kubenswrapper[4732]: I0131 09:03:35.530297 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d7ngt"]
Jan 31 09:03:35 crc kubenswrapper[4732]: I0131 09:03:35.531639 4732 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Jan 31 09:03:35 crc kubenswrapper[4732]: I0131 09:03:35.532844 4732 generic.go:334] "Generic (PLEG): container finished" podID="320c2656-6f30-4922-835e-8c27a82800b1" containerID="9a69e31450d3f2f57f031d90e49b124d53417614f5829c208f10b5b95ffda670" exitCode=0
Jan 31 09:03:35 crc kubenswrapper[4732]: I0131 09:03:35.532915 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rtg8l" event={"ID":"320c2656-6f30-4922-835e-8c27a82800b1","Type":"ContainerDied","Data":"9a69e31450d3f2f57f031d90e49b124d53417614f5829c208f10b5b95ffda670"}
Jan 31 09:03:35 crc kubenswrapper[4732]: I0131 09:03:35.534572 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8f6e5b3b-035c-42f1-a6e9-cc4614504712-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"8f6e5b3b-035c-42f1-a6e9-cc4614504712\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Jan 31 09:03:35 crc kubenswrapper[4732]: I0131 09:03:35.537730 4732 generic.go:334] "Generic (PLEG): container finished" podID="317b5076-0f62-45e5-9db0-8d03103c990e" containerID="d955c74da8ffd285d20400dc34dfd51736c439bd9e7f63e99e5270665cbbadb8" exitCode=0
Jan 31 09:03:35 crc kubenswrapper[4732]: I0131 09:03:35.537823 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zflvq" event={"ID":"317b5076-0f62-45e5-9db0-8d03103c990e","Type":"ContainerDied","Data":"d955c74da8ffd285d20400dc34dfd51736c439bd9e7f63e99e5270665cbbadb8"}
Jan 31 09:03:35 crc kubenswrapper[4732]: I0131 09:03:35.537865 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zflvq" event={"ID":"317b5076-0f62-45e5-9db0-8d03103c990e","Type":"ContainerStarted","Data":"e3e3c2c80d6bcc6eb2c4ab51462ea9a6f6a150eb54a71c100214dd3061f85429"}
Jan 31 09:03:35 crc kubenswrapper[4732]: I0131 09:03:35.539294 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-99dtb" event={"ID":"4ac602fa-14af-4ae0-a538-d73e938db036","Type":"ContainerStarted","Data":"76229ced9d7ea551eb476a7db3e9648ede271e5ea762d59f5c012cdd16284033"}
Jan 31 09:03:35 crc kubenswrapper[4732]: I0131 09:03:35.539321 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-99dtb" event={"ID":"4ac602fa-14af-4ae0-a538-d73e938db036","Type":"ContainerStarted","Data":"039e3167eddd030475a90d04176c40d3799eaa481a42473d70722bc67a78215e"}
Jan 31 09:03:35 crc kubenswrapper[4732]: I0131 09:03:35.540699 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-99dtb"
Jan 31 09:03:35 crc kubenswrapper[4732]: I0131 09:03:35.550540 4732 generic.go:334] "Generic (PLEG): container finished" podID="111ca852-fddd-4fb1-8d5d-331fd5921a71" containerID="b12c70aaf2deee35af9c59e7cdfeb63c5438b4e4668e33b49b029529e03925eb" exitCode=0
Jan 31 09:03:35 crc kubenswrapper[4732]: I0131 09:03:35.551785 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gb54f" event={"ID":"111ca852-fddd-4fb1-8d5d-331fd5921a71","Type":"ContainerDied","Data":"b12c70aaf2deee35af9c59e7cdfeb63c5438b4e4668e33b49b029529e03925eb"}
Jan 31 09:03:35 crc kubenswrapper[4732]: I0131 09:03:35.551867 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gb54f" event={"ID":"111ca852-fddd-4fb1-8d5d-331fd5921a71","Type":"ContainerStarted","Data":"2852e87c3d90e55a145bff8290e54ac7389fb8f75ebcdc35c688bd52463d5985"}
Jan 31 09:03:35 crc kubenswrapper[4732]: I0131 09:03:35.553991 4732 generic.go:334] "Generic (PLEG): container finished" podID="0558933a-c8d6-45dc-aeaf-af86190b15a0" containerID="940d163c1a1a494d9850589a935618158e47c219b1ef2186264ecbca1a2bfccc" exitCode=0
Jan 31 09:03:35 crc kubenswrapper[4732]: I0131 09:03:35.554838 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29497500-p2gn7" event={"ID":"0558933a-c8d6-45dc-aeaf-af86190b15a0","Type":"ContainerDied","Data":"940d163c1a1a494d9850589a935618158e47c219b1ef2186264ecbca1a2bfccc"}
Jan 31 09:03:35 crc kubenswrapper[4732]: I0131 09:03:35.561012 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-85mvk"
Jan 31 09:03:35 crc kubenswrapper[4732]: I0131 09:03:35.612026 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88rk8\" (UniqueName: \"kubernetes.io/projected/3d6ffd83-fb99-48e0-a34a-fd365f971ef1-kube-api-access-88rk8\") pod \"redhat-marketplace-d7ngt\" (UID: \"3d6ffd83-fb99-48e0-a34a-fd365f971ef1\") " pod="openshift-marketplace/redhat-marketplace-d7ngt"
Jan 31 09:03:35 crc kubenswrapper[4732]: I0131 09:03:35.612803 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d6ffd83-fb99-48e0-a34a-fd365f971ef1-catalog-content\") pod \"redhat-marketplace-d7ngt\" (UID: \"3d6ffd83-fb99-48e0-a34a-fd365f971ef1\") " pod="openshift-marketplace/redhat-marketplace-d7ngt"
Jan 31 09:03:35 crc kubenswrapper[4732]: I0131 09:03:35.613019 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d6ffd83-fb99-48e0-a34a-fd365f971ef1-utilities\") pod \"redhat-marketplace-d7ngt\" (UID: \"3d6ffd83-fb99-48e0-a34a-fd365f971ef1\") " pod="openshift-marketplace/redhat-marketplace-d7ngt"
Jan 31 09:03:35 crc kubenswrapper[4732]: I0131 09:03:35.622611 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-99dtb" podStartSLOduration=128.6225891 podStartE2EDuration="2m8.6225891s" podCreationTimestamp="2026-01-31 09:01:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:03:35.614828299 +0000 UTC m=+153.920704513" watchObservedRunningTime="2026-01-31 09:03:35.6225891 +0000 UTC m=+153.928465304"
Jan 31 09:03:35 crc kubenswrapper[4732]: I0131 09:03:35.642071 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Jan 31 09:03:35 crc kubenswrapper[4732]: I0131 09:03:35.714166 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88rk8\" (UniqueName: \"kubernetes.io/projected/3d6ffd83-fb99-48e0-a34a-fd365f971ef1-kube-api-access-88rk8\") pod \"redhat-marketplace-d7ngt\" (UID: \"3d6ffd83-fb99-48e0-a34a-fd365f971ef1\") " pod="openshift-marketplace/redhat-marketplace-d7ngt"
Jan 31 09:03:35 crc kubenswrapper[4732]: I0131 09:03:35.714245 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d6ffd83-fb99-48e0-a34a-fd365f971ef1-catalog-content\") pod \"redhat-marketplace-d7ngt\" (UID: \"3d6ffd83-fb99-48e0-a34a-fd365f971ef1\") " pod="openshift-marketplace/redhat-marketplace-d7ngt"
Jan 31 09:03:35 crc kubenswrapper[4732]: I0131 09:03:35.714301 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d6ffd83-fb99-48e0-a34a-fd365f971ef1-utilities\") pod \"redhat-marketplace-d7ngt\" (UID: \"3d6ffd83-fb99-48e0-a34a-fd365f971ef1\") " pod="openshift-marketplace/redhat-marketplace-d7ngt"
Jan 31 09:03:35 crc kubenswrapper[4732]: I0131 09:03:35.715011 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d6ffd83-fb99-48e0-a34a-fd365f971ef1-utilities\") pod \"redhat-marketplace-d7ngt\" (UID: \"3d6ffd83-fb99-48e0-a34a-fd365f971ef1\") " pod="openshift-marketplace/redhat-marketplace-d7ngt"
Jan 31 09:03:35 crc kubenswrapper[4732]: I0131 09:03:35.715654 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d6ffd83-fb99-48e0-a34a-fd365f971ef1-catalog-content\") pod \"redhat-marketplace-d7ngt\" (UID: \"3d6ffd83-fb99-48e0-a34a-fd365f971ef1\") " pod="openshift-marketplace/redhat-marketplace-d7ngt"
Jan 31 09:03:35 crc kubenswrapper[4732]: I0131 09:03:35.750822 4732 operation_generator.go:637]
"MountVolume.SetUp succeeded for volume \"kube-api-access-88rk8\" (UniqueName: \"kubernetes.io/projected/3d6ffd83-fb99-48e0-a34a-fd365f971ef1-kube-api-access-88rk8\") pod \"redhat-marketplace-d7ngt\" (UID: \"3d6ffd83-fb99-48e0-a34a-fd365f971ef1\") " pod="openshift-marketplace/redhat-marketplace-d7ngt" Jan 31 09:03:35 crc kubenswrapper[4732]: I0131 09:03:35.785146 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-c7j22" podStartSLOduration=13.78512754 podStartE2EDuration="13.78512754s" podCreationTimestamp="2026-01-31 09:03:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:03:35.780304897 +0000 UTC m=+154.086181101" watchObservedRunningTime="2026-01-31 09:03:35.78512754 +0000 UTC m=+154.091003734" Jan 31 09:03:35 crc kubenswrapper[4732]: I0131 09:03:35.822224 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d7ngt" Jan 31 09:03:35 crc kubenswrapper[4732]: I0131 09:03:35.929816 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2jzzz"] Jan 31 09:03:35 crc kubenswrapper[4732]: I0131 09:03:35.930912 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2jzzz" Jan 31 09:03:35 crc kubenswrapper[4732]: I0131 09:03:35.951083 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2jzzz"] Jan 31 09:03:36 crc kubenswrapper[4732]: I0131 09:03:35.999926 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-h5q9f" Jan 31 09:03:36 crc kubenswrapper[4732]: I0131 09:03:36.010198 4732 patch_prober.go:28] interesting pod/router-default-5444994796-h5q9f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 31 09:03:36 crc kubenswrapper[4732]: [-]has-synced failed: reason withheld Jan 31 09:03:36 crc kubenswrapper[4732]: [+]process-running ok Jan 31 09:03:36 crc kubenswrapper[4732]: healthz check failed Jan 31 09:03:36 crc kubenswrapper[4732]: I0131 09:03:36.010284 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h5q9f" podUID="8814e7c8-5104-40f7-9761-4feedc15697b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 31 09:03:36 crc kubenswrapper[4732]: I0131 09:03:36.022999 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60fab354-a742-4d49-88d9-22843a857ea5-utilities\") pod \"redhat-marketplace-2jzzz\" (UID: \"60fab354-a742-4d49-88d9-22843a857ea5\") " pod="openshift-marketplace/redhat-marketplace-2jzzz" Jan 31 09:03:36 crc kubenswrapper[4732]: I0131 09:03:36.023112 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60fab354-a742-4d49-88d9-22843a857ea5-catalog-content\") pod \"redhat-marketplace-2jzzz\" (UID: \"60fab354-a742-4d49-88d9-22843a857ea5\") " pod="openshift-marketplace/redhat-marketplace-2jzzz" Jan 31 09:03:36 crc kubenswrapper[4732]: I0131 09:03:36.023134 4732 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nm96d\" (UniqueName: \"kubernetes.io/projected/60fab354-a742-4d49-88d9-22843a857ea5-kube-api-access-nm96d\") pod \"redhat-marketplace-2jzzz\" (UID: \"60fab354-a742-4d49-88d9-22843a857ea5\") " pod="openshift-marketplace/redhat-marketplace-2jzzz" Jan 31 09:03:36 crc kubenswrapper[4732]: I0131 09:03:36.023343 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7fqsl" Jan 31 09:03:36 crc kubenswrapper[4732]: I0131 09:03:36.081918 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wb8wq" Jan 31 09:03:36 crc kubenswrapper[4732]: I0131 09:03:36.097201 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-ljds4" Jan 31 09:03:36 crc kubenswrapper[4732]: I0131 09:03:36.140720 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60fab354-a742-4d49-88d9-22843a857ea5-utilities\") pod \"redhat-marketplace-2jzzz\" (UID: \"60fab354-a742-4d49-88d9-22843a857ea5\") " pod="openshift-marketplace/redhat-marketplace-2jzzz" Jan 31 09:03:36 crc kubenswrapper[4732]: I0131 09:03:36.140831 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60fab354-a742-4d49-88d9-22843a857ea5-catalog-content\") pod \"redhat-marketplace-2jzzz\" (UID: \"60fab354-a742-4d49-88d9-22843a857ea5\") " pod="openshift-marketplace/redhat-marketplace-2jzzz" Jan 31 09:03:36 crc kubenswrapper[4732]: I0131 09:03:36.140871 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nm96d\" (UniqueName: \"kubernetes.io/projected/60fab354-a742-4d49-88d9-22843a857ea5-kube-api-access-nm96d\") pod \"redhat-marketplace-2jzzz\" (UID: \"60fab354-a742-4d49-88d9-22843a857ea5\") " pod="openshift-marketplace/redhat-marketplace-2jzzz" Jan 31 09:03:36 crc kubenswrapper[4732]: I0131 09:03:36.142957 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60fab354-a742-4d49-88d9-22843a857ea5-utilities\") pod \"redhat-marketplace-2jzzz\" (UID: \"60fab354-a742-4d49-88d9-22843a857ea5\") " pod="openshift-marketplace/redhat-marketplace-2jzzz" Jan 31 09:03:36 crc kubenswrapper[4732]: I0131 09:03:36.145933 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60fab354-a742-4d49-88d9-22843a857ea5-catalog-content\") pod \"redhat-marketplace-2jzzz\" (UID: \"60fab354-a742-4d49-88d9-22843a857ea5\") " pod="openshift-marketplace/redhat-marketplace-2jzzz" Jan 31 09:03:36 crc kubenswrapper[4732]: I0131 09:03:36.190440 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nm96d\" (UniqueName: \"kubernetes.io/projected/60fab354-a742-4d49-88d9-22843a857ea5-kube-api-access-nm96d\") pod \"redhat-marketplace-2jzzz\" (UID: \"60fab354-a742-4d49-88d9-22843a857ea5\") " pod="openshift-marketplace/redhat-marketplace-2jzzz" Jan 31 09:03:36 crc kubenswrapper[4732]: I0131 09:03:36.260465 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 31 09:03:36 crc kubenswrapper[4732]: I0131 09:03:36.273549 4732 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2jzzz" Jan 31 09:03:36 crc kubenswrapper[4732]: I0131 09:03:36.359788 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6fwnh" Jan 31 09:03:36 crc kubenswrapper[4732]: I0131 09:03:36.556556 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Jan 31 09:03:36 crc kubenswrapper[4732]: I0131 09:03:36.557106 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d7ngt"] Jan 31 09:03:36 crc kubenswrapper[4732]: I0131 09:03:36.565057 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"8f6e5b3b-035c-42f1-a6e9-cc4614504712","Type":"ContainerStarted","Data":"3a6201eb755c1ceee6a6ee3a12601a491d4c76b48fb722418a24e1228f4d3bd9"} Jan 31 09:03:36 crc kubenswrapper[4732]: W0131 09:03:36.574027 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d6ffd83_fb99_48e0_a34a_fd365f971ef1.slice/crio-628f0999201f984caa436318589dd8628cafb6a41db0673c62163c5cc780a5ff WatchSource:0}: Error finding container 628f0999201f984caa436318589dd8628cafb6a41db0673c62163c5cc780a5ff: Status 404 returned error can't find the container with id 628f0999201f984caa436318589dd8628cafb6a41db0673c62163c5cc780a5ff Jan 31 09:03:36 crc kubenswrapper[4732]: I0131 09:03:36.574393 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2jzzz"] Jan 31 09:03:36 crc kubenswrapper[4732]: W0131 09:03:36.581043 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60fab354_a742_4d49_88d9_22843a857ea5.slice/crio-9ea503aa2d078ea5c77d132dad08c175b5d8c3c762d663e871a3d782440f1d6e WatchSource:0}: Error finding container 9ea503aa2d078ea5c77d132dad08c175b5d8c3c762d663e871a3d782440f1d6e: Status 404 returned error can't find the container with id 9ea503aa2d078ea5c77d132dad08c175b5d8c3c762d663e871a3d782440f1d6e Jan 31 09:03:36 crc kubenswrapper[4732]: I0131 09:03:36.706303 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vdcdv"] Jan 31 09:03:36 crc kubenswrapper[4732]: I0131 09:03:36.707634 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vdcdv" Jan 31 09:03:36 crc kubenswrapper[4732]: I0131 09:03:36.710026 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 31 09:03:36 crc kubenswrapper[4732]: I0131 09:03:36.723328 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vdcdv"] Jan 31 09:03:36 crc kubenswrapper[4732]: I0131 09:03:36.748903 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dzc9\" (UniqueName: \"kubernetes.io/projected/b03cae03-72c1-4b13-8031-33381e6df48a-kube-api-access-2dzc9\") pod \"redhat-operators-vdcdv\" (UID: \"b03cae03-72c1-4b13-8031-33381e6df48a\") " pod="openshift-marketplace/redhat-operators-vdcdv" Jan 31 09:03:36 crc kubenswrapper[4732]: I0131 09:03:36.749108 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b03cae03-72c1-4b13-8031-33381e6df48a-utilities\") pod \"redhat-operators-vdcdv\" (UID: \"b03cae03-72c1-4b13-8031-33381e6df48a\") " pod="openshift-marketplace/redhat-operators-vdcdv" Jan 31 09:03:36 crc kubenswrapper[4732]: I0131 09:03:36.749168 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b03cae03-72c1-4b13-8031-33381e6df48a-catalog-content\") pod \"redhat-operators-vdcdv\" (UID: \"b03cae03-72c1-4b13-8031-33381e6df48a\") " pod="openshift-marketplace/redhat-operators-vdcdv" Jan 31 09:03:36 crc kubenswrapper[4732]: I0131 09:03:36.751408 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29497500-p2gn7" Jan 31 09:03:36 crc kubenswrapper[4732]: I0131 09:03:36.849785 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0558933a-c8d6-45dc-aeaf-af86190b15a0-secret-volume\") pod \"0558933a-c8d6-45dc-aeaf-af86190b15a0\" (UID: \"0558933a-c8d6-45dc-aeaf-af86190b15a0\") " Jan 31 09:03:36 crc kubenswrapper[4732]: I0131 09:03:36.849897 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0558933a-c8d6-45dc-aeaf-af86190b15a0-config-volume\") pod \"0558933a-c8d6-45dc-aeaf-af86190b15a0\" (UID: \"0558933a-c8d6-45dc-aeaf-af86190b15a0\") " Jan 31 09:03:36 crc kubenswrapper[4732]: I0131 09:03:36.849949 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-txg4c\" (UniqueName: \"kubernetes.io/projected/0558933a-c8d6-45dc-aeaf-af86190b15a0-kube-api-access-txg4c\") pod \"0558933a-c8d6-45dc-aeaf-af86190b15a0\" (UID: \"0558933a-c8d6-45dc-aeaf-af86190b15a0\") " Jan 31 09:03:36 crc kubenswrapper[4732]: I0131 09:03:36.850176 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dzc9\" (UniqueName: \"kubernetes.io/projected/b03cae03-72c1-4b13-8031-33381e6df48a-kube-api-access-2dzc9\") pod \"redhat-operators-vdcdv\" (UID: \"b03cae03-72c1-4b13-8031-33381e6df48a\") " pod="openshift-marketplace/redhat-operators-vdcdv" Jan 31 09:03:36 crc kubenswrapper[4732]: I0131 09:03:36.850240 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b03cae03-72c1-4b13-8031-33381e6df48a-utilities\") pod \"redhat-operators-vdcdv\" (UID: \"b03cae03-72c1-4b13-8031-33381e6df48a\") " pod="openshift-marketplace/redhat-operators-vdcdv" Jan 31 09:03:36 crc kubenswrapper[4732]: I0131 09:03:36.850266 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b03cae03-72c1-4b13-8031-33381e6df48a-catalog-content\") pod \"redhat-operators-vdcdv\" (UID: \"b03cae03-72c1-4b13-8031-33381e6df48a\") " pod="openshift-marketplace/redhat-operators-vdcdv" Jan 31 09:03:36 crc kubenswrapper[4732]: I0131 09:03:36.850659 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0558933a-c8d6-45dc-aeaf-af86190b15a0-config-volume" (OuterVolumeSpecName: "config-volume") pod "0558933a-c8d6-45dc-aeaf-af86190b15a0" (UID: "0558933a-c8d6-45dc-aeaf-af86190b15a0"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:03:36 crc kubenswrapper[4732]: I0131 09:03:36.851060 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b03cae03-72c1-4b13-8031-33381e6df48a-catalog-content\") pod \"redhat-operators-vdcdv\" (UID: \"b03cae03-72c1-4b13-8031-33381e6df48a\") " pod="openshift-marketplace/redhat-operators-vdcdv" Jan 31 09:03:36 crc kubenswrapper[4732]: I0131 09:03:36.851103 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b03cae03-72c1-4b13-8031-33381e6df48a-utilities\") pod \"redhat-operators-vdcdv\" (UID: \"b03cae03-72c1-4b13-8031-33381e6df48a\") " pod="openshift-marketplace/redhat-operators-vdcdv" Jan 31 09:03:36 crc kubenswrapper[4732]: I0131 09:03:36.856757 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0558933a-c8d6-45dc-aeaf-af86190b15a0-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "0558933a-c8d6-45dc-aeaf-af86190b15a0" (UID: "0558933a-c8d6-45dc-aeaf-af86190b15a0"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:03:36 crc kubenswrapper[4732]: I0131 09:03:36.857120 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0558933a-c8d6-45dc-aeaf-af86190b15a0-kube-api-access-txg4c" (OuterVolumeSpecName: "kube-api-access-txg4c") pod "0558933a-c8d6-45dc-aeaf-af86190b15a0" (UID: "0558933a-c8d6-45dc-aeaf-af86190b15a0"). InnerVolumeSpecName "kube-api-access-txg4c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:03:36 crc kubenswrapper[4732]: I0131 09:03:36.871823 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dzc9\" (UniqueName: \"kubernetes.io/projected/b03cae03-72c1-4b13-8031-33381e6df48a-kube-api-access-2dzc9\") pod \"redhat-operators-vdcdv\" (UID: \"b03cae03-72c1-4b13-8031-33381e6df48a\") " pod="openshift-marketplace/redhat-operators-vdcdv" Jan 31 09:03:36 crc kubenswrapper[4732]: I0131 09:03:36.951838 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-txg4c\" (UniqueName: \"kubernetes.io/projected/0558933a-c8d6-45dc-aeaf-af86190b15a0-kube-api-access-txg4c\") on node \"crc\" DevicePath \"\"" Jan 31 09:03:36 crc kubenswrapper[4732]: I0131 09:03:36.951908 4732 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0558933a-c8d6-45dc-aeaf-af86190b15a0-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 31 09:03:36 crc kubenswrapper[4732]: I0131 09:03:36.951918 4732 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0558933a-c8d6-45dc-aeaf-af86190b15a0-config-volume\") on node \"crc\" DevicePath \"\"" Jan 31 09:03:37 crc kubenswrapper[4732]: I0131 09:03:37.002146 4732 patch_prober.go:28] interesting pod/router-default-5444994796-h5q9f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 31 09:03:37 crc kubenswrapper[4732]: [-]has-synced failed: reason withheld Jan 31 09:03:37 crc kubenswrapper[4732]: [+]process-running ok Jan 31 09:03:37 crc kubenswrapper[4732]: healthz check failed Jan 31 09:03:37 crc kubenswrapper[4732]: I0131 09:03:37.002231 4732 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5444994796-h5q9f" podUID="8814e7c8-5104-40f7-9761-4feedc15697b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 31 09:03:37 crc kubenswrapper[4732]: I0131 09:03:37.029355 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vdcdv" Jan 31 09:03:37 crc kubenswrapper[4732]: I0131 09:03:37.117011 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zm4tc"] Jan 31 09:03:37 crc kubenswrapper[4732]: E0131 09:03:37.117266 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0558933a-c8d6-45dc-aeaf-af86190b15a0" containerName="collect-profiles" Jan 31 09:03:37 crc kubenswrapper[4732]: I0131 09:03:37.117280 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="0558933a-c8d6-45dc-aeaf-af86190b15a0" containerName="collect-profiles" Jan 31 09:03:37 crc kubenswrapper[4732]: I0131 09:03:37.117425 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="0558933a-c8d6-45dc-aeaf-af86190b15a0" containerName="collect-profiles" Jan 31 09:03:37 crc kubenswrapper[4732]: I0131 09:03:37.118357 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zm4tc" Jan 31 09:03:37 crc kubenswrapper[4732]: I0131 09:03:37.131732 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zm4tc"] Jan 31 09:03:37 crc kubenswrapper[4732]: I0131 09:03:37.154018 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21393f97-49f1-4f27-a24c-93f88fe6596b-utilities\") pod \"redhat-operators-zm4tc\" (UID: \"21393f97-49f1-4f27-a24c-93f88fe6596b\") " pod="openshift-marketplace/redhat-operators-zm4tc" Jan 31 09:03:37 crc kubenswrapper[4732]: I0131 09:03:37.154099 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzkj6\" (UniqueName: \"kubernetes.io/projected/21393f97-49f1-4f27-a24c-93f88fe6596b-kube-api-access-rzkj6\") pod \"redhat-operators-zm4tc\" (UID: \"21393f97-49f1-4f27-a24c-93f88fe6596b\") " pod="openshift-marketplace/redhat-operators-zm4tc" Jan 31 09:03:37 crc kubenswrapper[4732]: I0131 09:03:37.154187 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21393f97-49f1-4f27-a24c-93f88fe6596b-catalog-content\") pod \"redhat-operators-zm4tc\" (UID: \"21393f97-49f1-4f27-a24c-93f88fe6596b\") " pod="openshift-marketplace/redhat-operators-zm4tc" Jan 31 09:03:37 crc kubenswrapper[4732]: I0131 09:03:37.238407 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vdcdv"] Jan 31 09:03:37 crc kubenswrapper[4732]: I0131 09:03:37.255268 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzkj6\" (UniqueName: \"kubernetes.io/projected/21393f97-49f1-4f27-a24c-93f88fe6596b-kube-api-access-rzkj6\") pod \"redhat-operators-zm4tc\" (UID: \"21393f97-49f1-4f27-a24c-93f88fe6596b\") " pod="openshift-marketplace/redhat-operators-zm4tc" Jan 31 09:03:37 crc kubenswrapper[4732]: I0131 09:03:37.255344 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/21393f97-49f1-4f27-a24c-93f88fe6596b-catalog-content\") pod \"redhat-operators-zm4tc\" (UID: \"21393f97-49f1-4f27-a24c-93f88fe6596b\") " pod="openshift-marketplace/redhat-operators-zm4tc" Jan 31 09:03:37 crc kubenswrapper[4732]: I0131 09:03:37.255413 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21393f97-49f1-4f27-a24c-93f88fe6596b-utilities\") pod \"redhat-operators-zm4tc\" (UID: \"21393f97-49f1-4f27-a24c-93f88fe6596b\") " pod="openshift-marketplace/redhat-operators-zm4tc" Jan 31 09:03:37 crc kubenswrapper[4732]: W0131 09:03:37.255694 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb03cae03_72c1_4b13_8031_33381e6df48a.slice/crio-b2573936f6c20f59b9e6d6e1b605a68abfd082c81e607c5222d95cd6ee9a3427 WatchSource:0}: Error finding container b2573936f6c20f59b9e6d6e1b605a68abfd082c81e607c5222d95cd6ee9a3427: Status 404 returned error can't find the container with id b2573936f6c20f59b9e6d6e1b605a68abfd082c81e607c5222d95cd6ee9a3427 Jan 31 09:03:37 crc kubenswrapper[4732]: I0131 09:03:37.256011 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21393f97-49f1-4f27-a24c-93f88fe6596b-utilities\") pod \"redhat-operators-zm4tc\" (UID: \"21393f97-49f1-4f27-a24c-93f88fe6596b\") " pod="openshift-marketplace/redhat-operators-zm4tc" Jan 31 09:03:37 crc kubenswrapper[4732]: I0131 09:03:37.256082 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21393f97-49f1-4f27-a24c-93f88fe6596b-catalog-content\") pod \"redhat-operators-zm4tc\" (UID: \"21393f97-49f1-4f27-a24c-93f88fe6596b\") " pod="openshift-marketplace/redhat-operators-zm4tc" Jan 31 09:03:37 crc kubenswrapper[4732]: I0131 09:03:37.277165 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzkj6\" (UniqueName: \"kubernetes.io/projected/21393f97-49f1-4f27-a24c-93f88fe6596b-kube-api-access-rzkj6\") pod \"redhat-operators-zm4tc\" (UID: \"21393f97-49f1-4f27-a24c-93f88fe6596b\") " pod="openshift-marketplace/redhat-operators-zm4tc" Jan 31 09:03:37 crc kubenswrapper[4732]: I0131 09:03:37.438121 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zm4tc" Jan 31 09:03:37 crc kubenswrapper[4732]: I0131 09:03:37.574106 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29497500-p2gn7" Jan 31 09:03:37 crc kubenswrapper[4732]: I0131 09:03:37.574105 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29497500-p2gn7" event={"ID":"0558933a-c8d6-45dc-aeaf-af86190b15a0","Type":"ContainerDied","Data":"489c0c3b171c6d96e855fb1f2464c3d1486c21baf174589add9247f92fff3bcc"} Jan 31 09:03:37 crc kubenswrapper[4732]: I0131 09:03:37.574628 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="489c0c3b171c6d96e855fb1f2464c3d1486c21baf174589add9247f92fff3bcc" Jan 31 09:03:37 crc kubenswrapper[4732]: I0131 09:03:37.575748 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vdcdv" event={"ID":"b03cae03-72c1-4b13-8031-33381e6df48a","Type":"ContainerStarted","Data":"b2573936f6c20f59b9e6d6e1b605a68abfd082c81e607c5222d95cd6ee9a3427"} Jan 31 09:03:37 crc kubenswrapper[4732]: I0131 09:03:37.577537 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2jzzz" event={"ID":"60fab354-a742-4d49-88d9-22843a857ea5","Type":"ContainerStarted","Data":"5530625f265d27e331890e3d6915df6f40e55b5333079f815cda7a875fb373a3"} Jan 31 09:03:37 crc kubenswrapper[4732]: I0131 09:03:37.577581 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2jzzz" event={"ID":"60fab354-a742-4d49-88d9-22843a857ea5","Type":"ContainerStarted","Data":"9ea503aa2d078ea5c77d132dad08c175b5d8c3c762d663e871a3d782440f1d6e"} Jan 31 09:03:37 crc kubenswrapper[4732]: I0131 09:03:37.579162 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"8f6e5b3b-035c-42f1-a6e9-cc4614504712","Type":"ContainerStarted","Data":"a362685bd59e615a1f2234cfabf15292c569459eb6135ba459f367652836aa93"} Jan 31 09:03:37 crc kubenswrapper[4732]: I0131 09:03:37.584794 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d7ngt" event={"ID":"3d6ffd83-fb99-48e0-a34a-fd365f971ef1","Type":"ContainerStarted","Data":"628f0999201f984caa436318589dd8628cafb6a41db0673c62163c5cc780a5ff"} Jan 31 09:03:37 crc kubenswrapper[4732]: I0131 09:03:37.641469 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zm4tc"] Jan 31 09:03:37 crc kubenswrapper[4732]: W0131 09:03:37.699249 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21393f97_49f1_4f27_a24c_93f88fe6596b.slice/crio-72ec2ce1dae672abaa2d0accf204fcab615644a223c21af740bfdd14a5e3a998 WatchSource:0}: Error finding container 72ec2ce1dae672abaa2d0accf204fcab615644a223c21af740bfdd14a5e3a998: Status 404 returned error can't find the container with id 72ec2ce1dae672abaa2d0accf204fcab615644a223c21af740bfdd14a5e3a998 Jan 31 09:03:37 crc kubenswrapper[4732]: I0131 09:03:37.855319 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-f78bs" Jan 31 09:03:38 crc kubenswrapper[4732]: I0131 09:03:38.009912 4732 patch_prober.go:28] interesting pod/router-default-5444994796-h5q9f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 31 09:03:38 crc kubenswrapper[4732]: [-]has-synced failed: reason 
withheld Jan 31 09:03:38 crc kubenswrapper[4732]: [+]process-running ok Jan 31 09:03:38 crc kubenswrapper[4732]: healthz check failed Jan 31 09:03:38 crc kubenswrapper[4732]: I0131 09:03:38.009988 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h5q9f" podUID="8814e7c8-5104-40f7-9761-4feedc15697b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 31 09:03:38 crc kubenswrapper[4732]: I0131 09:03:38.593173 4732 generic.go:334] "Generic (PLEG): container finished" podID="b03cae03-72c1-4b13-8031-33381e6df48a" containerID="4468cd510de754308c18a9a7af365ee982a0fdce8768e86c8b2291d7c729cfc1" exitCode=0 Jan 31 09:03:38 crc kubenswrapper[4732]: I0131 09:03:38.593937 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vdcdv" event={"ID":"b03cae03-72c1-4b13-8031-33381e6df48a","Type":"ContainerDied","Data":"4468cd510de754308c18a9a7af365ee982a0fdce8768e86c8b2291d7c729cfc1"} Jan 31 09:03:38 crc kubenswrapper[4732]: I0131 09:03:38.596790 4732 generic.go:334] "Generic (PLEG): container finished" podID="60fab354-a742-4d49-88d9-22843a857ea5" containerID="5530625f265d27e331890e3d6915df6f40e55b5333079f815cda7a875fb373a3" exitCode=0 Jan 31 09:03:38 crc kubenswrapper[4732]: I0131 09:03:38.597370 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2jzzz" event={"ID":"60fab354-a742-4d49-88d9-22843a857ea5","Type":"ContainerDied","Data":"5530625f265d27e331890e3d6915df6f40e55b5333079f815cda7a875fb373a3"} Jan 31 09:03:38 crc kubenswrapper[4732]: I0131 09:03:38.600437 4732 generic.go:334] "Generic (PLEG): container finished" podID="8f6e5b3b-035c-42f1-a6e9-cc4614504712" containerID="a362685bd59e615a1f2234cfabf15292c569459eb6135ba459f367652836aa93" exitCode=0 Jan 31 09:03:38 crc kubenswrapper[4732]: I0131 09:03:38.600564 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"8f6e5b3b-035c-42f1-a6e9-cc4614504712","Type":"ContainerDied","Data":"a362685bd59e615a1f2234cfabf15292c569459eb6135ba459f367652836aa93"} Jan 31 09:03:38 crc kubenswrapper[4732]: I0131 09:03:38.603895 4732 generic.go:334] "Generic (PLEG): container finished" podID="21393f97-49f1-4f27-a24c-93f88fe6596b" containerID="b61b2dcac543f0ef563e0d943499efa08ba5e1c076661ef3c5b44487b1cd0867" exitCode=0 Jan 31 09:03:38 crc kubenswrapper[4732]: I0131 09:03:38.604056 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zm4tc" event={"ID":"21393f97-49f1-4f27-a24c-93f88fe6596b","Type":"ContainerDied","Data":"b61b2dcac543f0ef563e0d943499efa08ba5e1c076661ef3c5b44487b1cd0867"} Jan 31 09:03:38 crc kubenswrapper[4732]: I0131 09:03:38.604213 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zm4tc" event={"ID":"21393f97-49f1-4f27-a24c-93f88fe6596b","Type":"ContainerStarted","Data":"72ec2ce1dae672abaa2d0accf204fcab615644a223c21af740bfdd14a5e3a998"} Jan 31 09:03:38 crc kubenswrapper[4732]: I0131 09:03:38.619585 4732 generic.go:334] "Generic (PLEG): container finished" podID="3d6ffd83-fb99-48e0-a34a-fd365f971ef1" containerID="c749f49cfbc861f45b12320940264abaaf152474c82e669652462baab6c5bea1" exitCode=0 Jan 31 09:03:38 crc kubenswrapper[4732]: I0131 09:03:38.619627 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d7ngt" 
event={"ID":"3d6ffd83-fb99-48e0-a34a-fd365f971ef1","Type":"ContainerDied","Data":"c749f49cfbc861f45b12320940264abaaf152474c82e669652462baab6c5bea1"} Jan 31 09:03:38 crc kubenswrapper[4732]: I0131 09:03:38.973899 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 31 09:03:38 crc kubenswrapper[4732]: I0131 09:03:38.974910 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 31 09:03:38 crc kubenswrapper[4732]: I0131 09:03:38.978805 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Jan 31 09:03:38 crc kubenswrapper[4732]: I0131 09:03:38.979083 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Jan 31 09:03:38 crc kubenswrapper[4732]: I0131 09:03:38.999843 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 31 09:03:39 crc kubenswrapper[4732]: I0131 09:03:39.002585 4732 patch_prober.go:28] interesting pod/router-default-5444994796-h5q9f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 31 09:03:39 crc kubenswrapper[4732]: [-]has-synced failed: reason withheld Jan 31 09:03:39 crc kubenswrapper[4732]: [+]process-running ok Jan 31 09:03:39 crc kubenswrapper[4732]: healthz check failed Jan 31 09:03:39 crc kubenswrapper[4732]: I0131 09:03:39.002677 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h5q9f" podUID="8814e7c8-5104-40f7-9761-4feedc15697b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 31 09:03:39 crc kubenswrapper[4732]: I0131 09:03:39.088868 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0285bd11-6fe5-4206-9242-d008dde146bf-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"0285bd11-6fe5-4206-9242-d008dde146bf\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 31 09:03:39 crc kubenswrapper[4732]: I0131 09:03:39.088951 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0285bd11-6fe5-4206-9242-d008dde146bf-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"0285bd11-6fe5-4206-9242-d008dde146bf\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 31 09:03:39 crc kubenswrapper[4732]: I0131 09:03:39.189682 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0285bd11-6fe5-4206-9242-d008dde146bf-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"0285bd11-6fe5-4206-9242-d008dde146bf\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 31 09:03:39 crc kubenswrapper[4732]: I0131 09:03:39.189776 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0285bd11-6fe5-4206-9242-d008dde146bf-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"0285bd11-6fe5-4206-9242-d008dde146bf\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 31 09:03:39 crc kubenswrapper[4732]: I0131 09:03:39.189923 4732 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0285bd11-6fe5-4206-9242-d008dde146bf-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"0285bd11-6fe5-4206-9242-d008dde146bf\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 31 09:03:39 crc kubenswrapper[4732]: I0131 09:03:39.212303 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0285bd11-6fe5-4206-9242-d008dde146bf-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"0285bd11-6fe5-4206-9242-d008dde146bf\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 31 09:03:39 crc kubenswrapper[4732]: I0131 09:03:39.300795 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 31 09:03:39 crc kubenswrapper[4732]: I0131 09:03:39.382653 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-xprfh" Jan 31 09:03:39 crc kubenswrapper[4732]: I0131 09:03:39.398921 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-xprfh" Jan 31 09:03:39 crc kubenswrapper[4732]: I0131 09:03:39.673441 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 31 09:03:39 crc kubenswrapper[4732]: W0131 09:03:39.748817 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod0285bd11_6fe5_4206_9242_d008dde146bf.slice/crio-5415799b82fcadfba2c54e03df6a02db24ca92c8ec2cbeeab74649ebabe5394f WatchSource:0}: Error finding container 5415799b82fcadfba2c54e03df6a02db24ca92c8ec2cbeeab74649ebabe5394f: Status 404 returned error can't find the container with id 5415799b82fcadfba2c54e03df6a02db24ca92c8ec2cbeeab74649ebabe5394f Jan 31 09:03:39 crc kubenswrapper[4732]: I0131 09:03:39.981503 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 31 09:03:40 crc kubenswrapper[4732]: I0131 09:03:40.009968 4732 patch_prober.go:28] interesting pod/router-default-5444994796-h5q9f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 31 09:03:40 crc kubenswrapper[4732]: [-]has-synced failed: reason withheld Jan 31 09:03:40 crc kubenswrapper[4732]: [+]process-running ok Jan 31 09:03:40 crc kubenswrapper[4732]: healthz check failed Jan 31 09:03:40 crc kubenswrapper[4732]: I0131 09:03:40.010060 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h5q9f" podUID="8814e7c8-5104-40f7-9761-4feedc15697b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 31 09:03:40 crc kubenswrapper[4732]: I0131 09:03:40.015782 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8f6e5b3b-035c-42f1-a6e9-cc4614504712-kube-api-access\") pod \"8f6e5b3b-035c-42f1-a6e9-cc4614504712\" (UID: \"8f6e5b3b-035c-42f1-a6e9-cc4614504712\") " Jan 31 09:03:40 crc kubenswrapper[4732]: I0131 09:03:40.015991 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8f6e5b3b-035c-42f1-a6e9-cc4614504712-kubelet-dir\") pod \"8f6e5b3b-035c-42f1-a6e9-cc4614504712\" (UID: \"8f6e5b3b-035c-42f1-a6e9-cc4614504712\") " Jan 31 09:03:40 crc kubenswrapper[4732]: I0131 09:03:40.016471 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8f6e5b3b-035c-42f1-a6e9-cc4614504712-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "8f6e5b3b-035c-42f1-a6e9-cc4614504712" (UID: "8f6e5b3b-035c-42f1-a6e9-cc4614504712"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:03:40 crc kubenswrapper[4732]: I0131 09:03:40.017430 4732 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8f6e5b3b-035c-42f1-a6e9-cc4614504712-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 31 09:03:40 crc kubenswrapper[4732]: I0131 09:03:40.024393 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f6e5b3b-035c-42f1-a6e9-cc4614504712-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "8f6e5b3b-035c-42f1-a6e9-cc4614504712" (UID: "8f6e5b3b-035c-42f1-a6e9-cc4614504712"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:03:40 crc kubenswrapper[4732]: I0131 09:03:40.118374 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8f6e5b3b-035c-42f1-a6e9-cc4614504712-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 31 09:03:40 crc kubenswrapper[4732]: I0131 09:03:40.689180 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"0285bd11-6fe5-4206-9242-d008dde146bf","Type":"ContainerStarted","Data":"639c64b6ae585a9ef8feca7c751071f95968131e8f1617d3627410b4bf02ba7f"} Jan 31 09:03:40 crc kubenswrapper[4732]: I0131 09:03:40.689834 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"0285bd11-6fe5-4206-9242-d008dde146bf","Type":"ContainerStarted","Data":"5415799b82fcadfba2c54e03df6a02db24ca92c8ec2cbeeab74649ebabe5394f"} Jan 31 09:03:40 crc kubenswrapper[4732]: I0131 09:03:40.698337 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"8f6e5b3b-035c-42f1-a6e9-cc4614504712","Type":"ContainerDied","Data":"3a6201eb755c1ceee6a6ee3a12601a491d4c76b48fb722418a24e1228f4d3bd9"} Jan 31 09:03:40 crc kubenswrapper[4732]: I0131 09:03:40.698400 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3a6201eb755c1ceee6a6ee3a12601a491d4c76b48fb722418a24e1228f4d3bd9" Jan 31 09:03:40 crc kubenswrapper[4732]: I0131 09:03:40.698412 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 31 09:03:40 crc kubenswrapper[4732]: I0131 09:03:40.704993 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.704957181 podStartE2EDuration="2.704957181s" podCreationTimestamp="2026-01-31 09:03:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:03:40.701103392 +0000 UTC m=+159.006979596" watchObservedRunningTime="2026-01-31 09:03:40.704957181 +0000 UTC m=+159.010833385" Jan 31 09:03:41 crc kubenswrapper[4732]: I0131 09:03:41.000953 4732 patch_prober.go:28] interesting pod/router-default-5444994796-h5q9f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 31 09:03:41 crc kubenswrapper[4732]: [-]has-synced failed: reason withheld Jan 31 09:03:41 crc kubenswrapper[4732]: [+]process-running ok Jan 31 09:03:41 crc kubenswrapper[4732]: healthz check failed Jan 31 09:03:41 crc kubenswrapper[4732]: I0131 09:03:41.001047 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h5q9f" podUID="8814e7c8-5104-40f7-9761-4feedc15697b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 31 09:03:41 crc kubenswrapper[4732]: I0131 09:03:41.733817 4732 generic.go:334] "Generic (PLEG): container finished" podID="0285bd11-6fe5-4206-9242-d008dde146bf" containerID="639c64b6ae585a9ef8feca7c751071f95968131e8f1617d3627410b4bf02ba7f" exitCode=0 Jan 31 09:03:41 crc kubenswrapper[4732]: I0131 09:03:41.733993 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"0285bd11-6fe5-4206-9242-d008dde146bf","Type":"ContainerDied","Data":"639c64b6ae585a9ef8feca7c751071f95968131e8f1617d3627410b4bf02ba7f"} Jan 31 09:03:42 crc kubenswrapper[4732]: I0131 09:03:42.002808 4732 patch_prober.go:28] interesting pod/router-default-5444994796-h5q9f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 31 09:03:42 crc kubenswrapper[4732]: [-]has-synced failed: reason withheld Jan 31 09:03:42 crc kubenswrapper[4732]: [+]process-running ok Jan 31 09:03:42 crc kubenswrapper[4732]: healthz check failed Jan 31 09:03:42 crc kubenswrapper[4732]: I0131 09:03:42.002897 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h5q9f" podUID="8814e7c8-5104-40f7-9761-4feedc15697b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 31 09:03:43 crc kubenswrapper[4732]: I0131 09:03:42.999673 4732 patch_prober.go:28] interesting pod/router-default-5444994796-h5q9f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 31 09:03:43 crc kubenswrapper[4732]: [-]has-synced failed: reason withheld Jan 31 09:03:43 crc kubenswrapper[4732]: [+]process-running ok Jan 31 09:03:43 crc kubenswrapper[4732]: healthz check failed Jan 31 09:03:43 crc kubenswrapper[4732]: I0131 09:03:43.000143 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h5q9f" podUID="8814e7c8-5104-40f7-9761-4feedc15697b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 31 09:03:44 crc kubenswrapper[4732]: I0131 09:03:44.027918 4732 patch_prober.go:28] interesting pod/router-default-5444994796-h5q9f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 31 09:03:44 crc kubenswrapper[4732]: [-]has-synced failed: reason withheld Jan 31 09:03:44 crc kubenswrapper[4732]: [+]process-running ok Jan 31 09:03:44 crc kubenswrapper[4732]: healthz check failed Jan 31 09:03:44 crc kubenswrapper[4732]: I0131 09:03:44.028283 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h5q9f" podUID="8814e7c8-5104-40f7-9761-4feedc15697b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 31 09:03:44 crc kubenswrapper[4732]: I0131 09:03:44.127013 4732 patch_prober.go:28] interesting pod/downloads-7954f5f757-rt2jr container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.28:8080/\": dial tcp 10.217.0.28:8080: connect: connection refused" start-of-body= Jan 31 09:03:44 crc kubenswrapper[4732]: I0131 09:03:44.127068 4732 patch_prober.go:28] interesting pod/downloads-7954f5f757-rt2jr container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.28:8080/\": dial tcp 10.217.0.28:8080: connect: connection refused" start-of-body= Jan 31 09:03:44 crc kubenswrapper[4732]: I0131 09:03:44.127092 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-rt2jr" 
podUID="81e1781e-a935-4f3f-b2aa-9a0807f43c73" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.28:8080/\": dial tcp 10.217.0.28:8080: connect: connection refused" Jan 31 09:03:44 crc kubenswrapper[4732]: I0131 09:03:44.127131 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-rt2jr" podUID="81e1781e-a935-4f3f-b2aa-9a0807f43c73" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.28:8080/\": dial tcp 10.217.0.28:8080: connect: connection refused" Jan 31 09:03:45 crc kubenswrapper[4732]: I0131 09:03:44.999967 4732 patch_prober.go:28] interesting pod/router-default-5444994796-h5q9f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 31 09:03:45 crc kubenswrapper[4732]: [-]has-synced failed: reason withheld Jan 31 09:03:45 crc kubenswrapper[4732]: [+]process-running ok Jan 31 09:03:45 crc kubenswrapper[4732]: healthz check failed Jan 31 09:03:45 crc kubenswrapper[4732]: I0131 09:03:45.000038 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h5q9f" podUID="8814e7c8-5104-40f7-9761-4feedc15697b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 31 09:03:45 crc kubenswrapper[4732]: I0131 09:03:45.345153 4732 patch_prober.go:28] interesting pod/console-f9d7485db-8t8ks container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.17:8443/health\": dial tcp 10.217.0.17:8443: connect: connection refused" start-of-body= Jan 31 09:03:45 crc kubenswrapper[4732]: I0131 09:03:45.345227 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-8t8ks" podUID="b35d0df8-53f0-4787-b0b4-c93be28f0127" containerName="console" probeResult="failure" output="Get \"https://10.217.0.17:8443/health\": dial tcp 10.217.0.17:8443: connect: connection refused" Jan 31 09:03:45 crc kubenswrapper[4732]: I0131 09:03:45.999434 4732 patch_prober.go:28] interesting pod/router-default-5444994796-h5q9f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 31 09:03:45 crc kubenswrapper[4732]: [+]has-synced ok Jan 31 09:03:45 crc kubenswrapper[4732]: [+]process-running ok Jan 31 09:03:45 crc kubenswrapper[4732]: healthz check failed Jan 31 09:03:45 crc kubenswrapper[4732]: I0131 09:03:45.999508 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h5q9f" podUID="8814e7c8-5104-40f7-9761-4feedc15697b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 31 09:03:47 crc kubenswrapper[4732]: I0131 09:03:47.002600 4732 patch_prober.go:28] interesting pod/router-default-5444994796-h5q9f container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 31 09:03:47 crc kubenswrapper[4732]: [+]has-synced ok Jan 31 09:03:47 crc kubenswrapper[4732]: [+]process-running ok Jan 31 09:03:47 crc kubenswrapper[4732]: healthz check failed Jan 31 09:03:47 crc kubenswrapper[4732]: I0131 09:03:47.003060 4732 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5444994796-h5q9f" podUID="8814e7c8-5104-40f7-9761-4feedc15697b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 31 09:03:47 crc kubenswrapper[4732]: I0131 09:03:47.498216 4732 patch_prober.go:28] interesting pod/machine-config-daemon-jnbt8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 09:03:47 crc kubenswrapper[4732]: I0131 09:03:47.498311 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" podUID="7d790207-d357-4b47-87bf-5b505e061820" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 09:03:47 crc kubenswrapper[4732]: I0131 09:03:47.999712 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-h5q9f" Jan 31 09:03:48 crc kubenswrapper[4732]: I0131 09:03:48.003538 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-h5q9f" Jan 31 09:03:49 crc kubenswrapper[4732]: I0131 09:03:49.790432 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 31 09:03:49 crc kubenswrapper[4732]: I0131 09:03:49.810797 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3bd29a31-1a47-40da-afc5-6c4423067083-metrics-certs\") pod \"network-metrics-daemon-7fgvm\" (UID: \"3bd29a31-1a47-40da-afc5-6c4423067083\") " pod="openshift-multus/network-metrics-daemon-7fgvm" Jan 31 09:03:49 crc kubenswrapper[4732]: I0131 09:03:49.816045 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"0285bd11-6fe5-4206-9242-d008dde146bf","Type":"ContainerDied","Data":"5415799b82fcadfba2c54e03df6a02db24ca92c8ec2cbeeab74649ebabe5394f"} Jan 31 09:03:49 crc kubenswrapper[4732]: I0131 09:03:49.816088 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5415799b82fcadfba2c54e03df6a02db24ca92c8ec2cbeeab74649ebabe5394f" Jan 31 09:03:49 crc kubenswrapper[4732]: I0131 09:03:49.816161 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 31 09:03:49 crc kubenswrapper[4732]: I0131 09:03:49.867702 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3bd29a31-1a47-40da-afc5-6c4423067083-metrics-certs\") pod \"network-metrics-daemon-7fgvm\" (UID: \"3bd29a31-1a47-40da-afc5-6c4423067083\") " pod="openshift-multus/network-metrics-daemon-7fgvm" Jan 31 09:03:49 crc kubenswrapper[4732]: I0131 09:03:49.912425 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0285bd11-6fe5-4206-9242-d008dde146bf-kubelet-dir\") pod \"0285bd11-6fe5-4206-9242-d008dde146bf\" (UID: \"0285bd11-6fe5-4206-9242-d008dde146bf\") " Jan 31 09:03:49 crc kubenswrapper[4732]: I0131 09:03:49.912486 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0285bd11-6fe5-4206-9242-d008dde146bf-kube-api-access\") pod \"0285bd11-6fe5-4206-9242-d008dde146bf\" (UID: \"0285bd11-6fe5-4206-9242-d008dde146bf\") " Jan 31 09:03:49 crc kubenswrapper[4732]: I0131 09:03:49.912918 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0285bd11-6fe5-4206-9242-d008dde146bf-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "0285bd11-6fe5-4206-9242-d008dde146bf" (UID: "0285bd11-6fe5-4206-9242-d008dde146bf"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:03:49 crc kubenswrapper[4732]: I0131 09:03:49.916364 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0285bd11-6fe5-4206-9242-d008dde146bf-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0285bd11-6fe5-4206-9242-d008dde146bf" (UID: "0285bd11-6fe5-4206-9242-d008dde146bf"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:03:50 crc kubenswrapper[4732]: I0131 09:03:50.014597 4732 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0285bd11-6fe5-4206-9242-d008dde146bf-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 31 09:03:50 crc kubenswrapper[4732]: I0131 09:03:50.014642 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0285bd11-6fe5-4206-9242-d008dde146bf-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 31 09:03:50 crc kubenswrapper[4732]: I0131 09:03:50.078345 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7fgvm" Jan 31 09:03:52 crc kubenswrapper[4732]: I0131 09:03:52.173601 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-7fgvm"] Jan 31 09:03:52 crc kubenswrapper[4732]: W0131 09:03:52.177080 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3bd29a31_1a47_40da_afc5_6c4423067083.slice/crio-ee556c74eedfac86d916cb65b34af21b0afa537527a3a4c1560d3aa3abefeb98 WatchSource:0}: Error finding container ee556c74eedfac86d916cb65b34af21b0afa537527a3a4c1560d3aa3abefeb98: Status 404 returned error can't find the container with id ee556c74eedfac86d916cb65b34af21b0afa537527a3a4c1560d3aa3abefeb98 Jan 31 09:03:52 crc kubenswrapper[4732]: I0131 09:03:52.833931 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7fgvm" event={"ID":"3bd29a31-1a47-40da-afc5-6c4423067083","Type":"ContainerStarted","Data":"f51fea5aa94fc11ca8c52095f900fac0afbde7512c0605479a4ad661cd093d61"} Jan 31 09:03:52 crc kubenswrapper[4732]: I0131 09:03:52.834440 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7fgvm" event={"ID":"3bd29a31-1a47-40da-afc5-6c4423067083","Type":"ContainerStarted","Data":"ee556c74eedfac86d916cb65b34af21b0afa537527a3a4c1560d3aa3abefeb98"} Jan 31 09:03:53 crc kubenswrapper[4732]: I0131 09:03:53.842248 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7fgvm" event={"ID":"3bd29a31-1a47-40da-afc5-6c4423067083","Type":"ContainerStarted","Data":"0a32f92f4f8121eeadd59580865703e262199790d72ed17d2599e76cb35dc3b8"} Jan 31 09:03:53 crc kubenswrapper[4732]: I0131 09:03:53.862459 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-7fgvm" podStartSLOduration=146.862432866 podStartE2EDuration="2m26.862432866s" podCreationTimestamp="2026-01-31 09:01:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:03:53.859572549 +0000 UTC m=+172.165448773" watchObservedRunningTime="2026-01-31 09:03:53.862432866 +0000 UTC m=+172.168309070" Jan 31 09:03:54 crc kubenswrapper[4732]: I0131 09:03:54.143974 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-rt2jr" Jan 31 09:03:54 crc kubenswrapper[4732]: I0131 09:03:54.957334 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-99dtb" Jan 31 09:03:55 crc kubenswrapper[4732]: I0131 09:03:55.349092 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-8t8ks" Jan 31 09:03:55 crc kubenswrapper[4732]: I0131 09:03:55.352747 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-8t8ks" Jan 31 09:04:06 crc kubenswrapper[4732]: I0131 09:04:06.078003 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-wq7bg" Jan 31 09:04:11 crc kubenswrapper[4732]: I0131 09:04:11.672505 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 09:04:14 crc kubenswrapper[4732]: 
I0131 09:04:14.371438 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 31 09:04:14 crc kubenswrapper[4732]: E0131 09:04:14.372999 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0285bd11-6fe5-4206-9242-d008dde146bf" containerName="pruner" Jan 31 09:04:14 crc kubenswrapper[4732]: I0131 09:04:14.373145 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="0285bd11-6fe5-4206-9242-d008dde146bf" containerName="pruner" Jan 31 09:04:14 crc kubenswrapper[4732]: E0131 09:04:14.373342 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f6e5b3b-035c-42f1-a6e9-cc4614504712" containerName="pruner" Jan 31 09:04:14 crc kubenswrapper[4732]: I0131 09:04:14.373431 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f6e5b3b-035c-42f1-a6e9-cc4614504712" containerName="pruner" Jan 31 09:04:14 crc kubenswrapper[4732]: I0131 09:04:14.373638 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="0285bd11-6fe5-4206-9242-d008dde146bf" containerName="pruner" Jan 31 09:04:14 crc kubenswrapper[4732]: I0131 09:04:14.373769 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f6e5b3b-035c-42f1-a6e9-cc4614504712" containerName="pruner" Jan 31 09:04:14 crc kubenswrapper[4732]: I0131 09:04:14.374405 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 31 09:04:14 crc kubenswrapper[4732]: I0131 09:04:14.376657 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Jan 31 09:04:14 crc kubenswrapper[4732]: I0131 09:04:14.379512 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Jan 31 09:04:14 crc kubenswrapper[4732]: I0131 09:04:14.385764 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 31 09:04:14 crc kubenswrapper[4732]: I0131 09:04:14.487513 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7497adea-5f95-433f-b644-9aa3eae85937-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"7497adea-5f95-433f-b644-9aa3eae85937\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 31 09:04:14 crc kubenswrapper[4732]: I0131 09:04:14.487619 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7497adea-5f95-433f-b644-9aa3eae85937-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"7497adea-5f95-433f-b644-9aa3eae85937\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 31 09:04:14 crc kubenswrapper[4732]: I0131 09:04:14.588696 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7497adea-5f95-433f-b644-9aa3eae85937-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"7497adea-5f95-433f-b644-9aa3eae85937\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 31 09:04:14 crc kubenswrapper[4732]: I0131 09:04:14.588807 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7497adea-5f95-433f-b644-9aa3eae85937-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"7497adea-5f95-433f-b644-9aa3eae85937\") " 
pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 31 09:04:14 crc kubenswrapper[4732]: I0131 09:04:14.588846 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7497adea-5f95-433f-b644-9aa3eae85937-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"7497adea-5f95-433f-b644-9aa3eae85937\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 31 09:04:14 crc kubenswrapper[4732]: I0131 09:04:14.611960 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7497adea-5f95-433f-b644-9aa3eae85937-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"7497adea-5f95-433f-b644-9aa3eae85937\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 31 09:04:14 crc kubenswrapper[4732]: I0131 09:04:14.706206 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 31 09:04:16 crc kubenswrapper[4732]: E0131 09:04:16.640946 4732 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Jan 31 09:04:16 crc kubenswrapper[4732]: E0131 09:04:16.641831 4732 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rzkj6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-zm4tc_openshift-marketplace(21393f97-49f1-4f27-a24c-93f88fe6596b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 31 09:04:16 crc kubenswrapper[4732]: E0131 09:04:16.643094 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" 
pod="openshift-marketplace/redhat-operators-zm4tc" podUID="21393f97-49f1-4f27-a24c-93f88fe6596b" Jan 31 09:04:17 crc kubenswrapper[4732]: I0131 09:04:17.497558 4732 patch_prober.go:28] interesting pod/machine-config-daemon-jnbt8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 09:04:17 crc kubenswrapper[4732]: I0131 09:04:17.497746 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" podUID="7d790207-d357-4b47-87bf-5b505e061820" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 09:04:18 crc kubenswrapper[4732]: I0131 09:04:18.369340 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 31 09:04:18 crc kubenswrapper[4732]: I0131 09:04:18.370347 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 31 09:04:18 crc kubenswrapper[4732]: I0131 09:04:18.380545 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 31 09:04:18 crc kubenswrapper[4732]: E0131 09:04:18.431404 4732 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Jan 31 09:04:18 crc kubenswrapper[4732]: E0131 09:04:18.431606 4732 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-88rk8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-d7ngt_openshift-marketplace(3d6ffd83-fb99-48e0-a34a-fd365f971ef1): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" 
logger="UnhandledError" Jan 31 09:04:18 crc kubenswrapper[4732]: E0131 09:04:18.433136 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-d7ngt" podUID="3d6ffd83-fb99-48e0-a34a-fd365f971ef1" Jan 31 09:04:18 crc kubenswrapper[4732]: I0131 09:04:18.546353 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e90ec082-a189-4726-8049-2151ddf77961-kubelet-dir\") pod \"installer-9-crc\" (UID: \"e90ec082-a189-4726-8049-2151ddf77961\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 31 09:04:18 crc kubenswrapper[4732]: I0131 09:04:18.546628 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e90ec082-a189-4726-8049-2151ddf77961-kube-api-access\") pod \"installer-9-crc\" (UID: \"e90ec082-a189-4726-8049-2151ddf77961\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 31 09:04:18 crc kubenswrapper[4732]: I0131 09:04:18.546709 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e90ec082-a189-4726-8049-2151ddf77961-var-lock\") pod \"installer-9-crc\" (UID: \"e90ec082-a189-4726-8049-2151ddf77961\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 31 09:04:18 crc kubenswrapper[4732]: I0131 09:04:18.647861 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e90ec082-a189-4726-8049-2151ddf77961-kube-api-access\") pod \"installer-9-crc\" (UID: \"e90ec082-a189-4726-8049-2151ddf77961\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 31 09:04:18 crc kubenswrapper[4732]: I0131 09:04:18.647923 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e90ec082-a189-4726-8049-2151ddf77961-var-lock\") pod \"installer-9-crc\" (UID: \"e90ec082-a189-4726-8049-2151ddf77961\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 31 09:04:18 crc kubenswrapper[4732]: I0131 09:04:18.647990 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e90ec082-a189-4726-8049-2151ddf77961-kubelet-dir\") pod \"installer-9-crc\" (UID: \"e90ec082-a189-4726-8049-2151ddf77961\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 31 09:04:18 crc kubenswrapper[4732]: I0131 09:04:18.648089 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e90ec082-a189-4726-8049-2151ddf77961-kubelet-dir\") pod \"installer-9-crc\" (UID: \"e90ec082-a189-4726-8049-2151ddf77961\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 31 09:04:18 crc kubenswrapper[4732]: I0131 09:04:18.648081 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e90ec082-a189-4726-8049-2151ddf77961-var-lock\") pod \"installer-9-crc\" (UID: \"e90ec082-a189-4726-8049-2151ddf77961\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 31 09:04:18 crc kubenswrapper[4732]: I0131 09:04:18.674787 4732 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e90ec082-a189-4726-8049-2151ddf77961-kube-api-access\") pod \"installer-9-crc\" (UID: \"e90ec082-a189-4726-8049-2151ddf77961\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 31 09:04:18 crc kubenswrapper[4732]: I0131 09:04:18.702598 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 31 09:04:20 crc kubenswrapper[4732]: E0131 09:04:20.209195 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-d7ngt" podUID="3d6ffd83-fb99-48e0-a34a-fd365f971ef1" Jan 31 09:04:20 crc kubenswrapper[4732]: E0131 09:04:20.209576 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-zm4tc" podUID="21393f97-49f1-4f27-a24c-93f88fe6596b" Jan 31 09:04:20 crc kubenswrapper[4732]: E0131 09:04:20.330501 4732 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Jan 31 09:04:20 crc kubenswrapper[4732]: E0131 09:04:20.330751 4732 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-654wm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-rtg8l_openshift-marketplace(320c2656-6f30-4922-835e-8c27a82800b1): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 31 09:04:20 crc kubenswrapper[4732]: E0131 09:04:20.331920 4732 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-rtg8l" podUID="320c2656-6f30-4922-835e-8c27a82800b1" Jan 31 09:04:20 crc kubenswrapper[4732]: E0131 09:04:20.514378 4732 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Jan 31 09:04:20 crc kubenswrapper[4732]: E0131 09:04:20.514561 4732 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nm96d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-2jzzz_openshift-marketplace(60fab354-a742-4d49-88d9-22843a857ea5): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 31 09:04:20 crc kubenswrapper[4732]: E0131 09:04:20.515782 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-2jzzz" podUID="60fab354-a742-4d49-88d9-22843a857ea5" Jan 31 09:04:23 crc kubenswrapper[4732]: E0131 09:04:23.381928 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-rtg8l" podUID="320c2656-6f30-4922-835e-8c27a82800b1" Jan 31 09:04:23 crc kubenswrapper[4732]: E0131 09:04:23.382351 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling 
image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-2jzzz" podUID="60fab354-a742-4d49-88d9-22843a857ea5" Jan 31 09:04:23 crc kubenswrapper[4732]: E0131 09:04:23.519641 4732 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Jan 31 09:04:23 crc kubenswrapper[4732]: E0131 09:04:23.519901 4732 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-27rqc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-gb54f_openshift-marketplace(111ca852-fddd-4fb1-8d5d-331fd5921a71): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 31 09:04:23 crc kubenswrapper[4732]: E0131 09:04:23.523177 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-gb54f" podUID="111ca852-fddd-4fb1-8d5d-331fd5921a71" Jan 31 09:04:23 crc kubenswrapper[4732]: E0131 09:04:23.622646 4732 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Jan 31 09:04:23 crc kubenswrapper[4732]: E0131 09:04:23.622870 4732 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2dzc9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-vdcdv_openshift-marketplace(b03cae03-72c1-4b13-8031-33381e6df48a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 31 09:04:23 crc kubenswrapper[4732]: E0131 09:04:23.624070 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-vdcdv" podUID="b03cae03-72c1-4b13-8031-33381e6df48a" Jan 31 09:04:23 crc kubenswrapper[4732]: E0131 09:04:23.715504 4732 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Jan 31 09:04:23 crc kubenswrapper[4732]: E0131 09:04:23.715689 4732 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m2f4q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-6z6vm_openshift-marketplace(7006b68f-caf9-44a9-a6df-26e7b594b931): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 31 09:04:23 crc kubenswrapper[4732]: E0131 09:04:23.717010 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-6z6vm" podUID="7006b68f-caf9-44a9-a6df-26e7b594b931" Jan 31 09:04:23 crc kubenswrapper[4732]: I0131 09:04:23.844279 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 31 09:04:23 crc kubenswrapper[4732]: I0131 09:04:23.907027 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 31 09:04:23 crc kubenswrapper[4732]: W0131 09:04:23.917479 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pode90ec082_a189_4726_8049_2151ddf77961.slice/crio-68aff39b648c97e8da52037b14095371b6089843216d06b0916075106acf4b04 WatchSource:0}: Error finding container 68aff39b648c97e8da52037b14095371b6089843216d06b0916075106acf4b04: Status 404 returned error can't find the container with id 68aff39b648c97e8da52037b14095371b6089843216d06b0916075106acf4b04 Jan 31 09:04:24 crc kubenswrapper[4732]: I0131 09:04:24.027351 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"7497adea-5f95-433f-b644-9aa3eae85937","Type":"ContainerStarted","Data":"6feeea57ea5b40e213efe92826dc377f85d1355d80c79473b3d790d2a355624b"} Jan 31 09:04:24 crc kubenswrapper[4732]: I0131 09:04:24.028874 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"e90ec082-a189-4726-8049-2151ddf77961","Type":"ContainerStarted","Data":"68aff39b648c97e8da52037b14095371b6089843216d06b0916075106acf4b04"} Jan 31 09:04:24 crc kubenswrapper[4732]: E0131 09:04:24.030217 4732 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-6z6vm" podUID="7006b68f-caf9-44a9-a6df-26e7b594b931" Jan 31 09:04:24 crc kubenswrapper[4732]: E0131 09:04:24.031450 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-gb54f" podUID="111ca852-fddd-4fb1-8d5d-331fd5921a71" Jan 31 09:04:24 crc kubenswrapper[4732]: E0131 09:04:24.031688 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-vdcdv" podUID="b03cae03-72c1-4b13-8031-33381e6df48a" Jan 31 09:04:24 crc kubenswrapper[4732]: E0131 09:04:24.307061 4732 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Jan 31 09:04:24 crc kubenswrapper[4732]: E0131 09:04:24.307629 4732 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-p95zf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-zflvq_openshift-marketplace(317b5076-0f62-45e5-9db0-8d03103c990e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 31 09:04:24 crc kubenswrapper[4732]: E0131 09:04:24.308837 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from 
manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-zflvq" podUID="317b5076-0f62-45e5-9db0-8d03103c990e" Jan 31 09:04:25 crc kubenswrapper[4732]: I0131 09:04:25.035027 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"e90ec082-a189-4726-8049-2151ddf77961","Type":"ContainerStarted","Data":"23b1edb2efd137d80917cfc98d36d9f6d054406b735c02e22b781cbf0e7d7c9e"} Jan 31 09:04:25 crc kubenswrapper[4732]: I0131 09:04:25.038249 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"7497adea-5f95-433f-b644-9aa3eae85937","Type":"ContainerStarted","Data":"1bdec52283dba4c8385c51f2ac489c9b0b6a615f0a924e079199dd135ef936d5"} Jan 31 09:04:25 crc kubenswrapper[4732]: E0131 09:04:25.039585 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-zflvq" podUID="317b5076-0f62-45e5-9db0-8d03103c990e" Jan 31 09:04:25 crc kubenswrapper[4732]: I0131 09:04:25.051914 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=7.051890452 podStartE2EDuration="7.051890452s" podCreationTimestamp="2026-01-31 09:04:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:04:25.050290358 +0000 UTC m=+203.356166562" watchObservedRunningTime="2026-01-31 09:04:25.051890452 +0000 UTC m=+203.357766656" Jan 31 09:04:25 crc kubenswrapper[4732]: I0131 09:04:25.075318 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=11.075294041 podStartE2EDuration="11.075294041s" podCreationTimestamp="2026-01-31 09:04:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:04:25.069741444 +0000 UTC m=+203.375617648" watchObservedRunningTime="2026-01-31 09:04:25.075294041 +0000 UTC m=+203.381170245" Jan 31 09:04:26 crc kubenswrapper[4732]: I0131 09:04:26.046084 4732 generic.go:334] "Generic (PLEG): container finished" podID="7497adea-5f95-433f-b644-9aa3eae85937" containerID="1bdec52283dba4c8385c51f2ac489c9b0b6a615f0a924e079199dd135ef936d5" exitCode=0 Jan 31 09:04:26 crc kubenswrapper[4732]: I0131 09:04:26.046168 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"7497adea-5f95-433f-b644-9aa3eae85937","Type":"ContainerDied","Data":"1bdec52283dba4c8385c51f2ac489c9b0b6a615f0a924e079199dd135ef936d5"} Jan 31 09:04:27 crc kubenswrapper[4732]: I0131 09:04:27.478247 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 31 09:04:27 crc kubenswrapper[4732]: I0131 09:04:27.571988 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7497adea-5f95-433f-b644-9aa3eae85937-kubelet-dir\") pod \"7497adea-5f95-433f-b644-9aa3eae85937\" (UID: \"7497adea-5f95-433f-b644-9aa3eae85937\") " Jan 31 09:04:27 crc kubenswrapper[4732]: I0131 09:04:27.572075 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7497adea-5f95-433f-b644-9aa3eae85937-kube-api-access\") pod \"7497adea-5f95-433f-b644-9aa3eae85937\" (UID: \"7497adea-5f95-433f-b644-9aa3eae85937\") " Jan 31 09:04:27 crc kubenswrapper[4732]: I0131 09:04:27.572183 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7497adea-5f95-433f-b644-9aa3eae85937-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "7497adea-5f95-433f-b644-9aa3eae85937" (UID: "7497adea-5f95-433f-b644-9aa3eae85937"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:04:27 crc kubenswrapper[4732]: I0131 09:04:27.572419 4732 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7497adea-5f95-433f-b644-9aa3eae85937-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 31 09:04:27 crc kubenswrapper[4732]: I0131 09:04:27.577552 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7497adea-5f95-433f-b644-9aa3eae85937-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "7497adea-5f95-433f-b644-9aa3eae85937" (UID: "7497adea-5f95-433f-b644-9aa3eae85937"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:04:27 crc kubenswrapper[4732]: I0131 09:04:27.673500 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7497adea-5f95-433f-b644-9aa3eae85937-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 31 09:04:28 crc kubenswrapper[4732]: I0131 09:04:28.062302 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"7497adea-5f95-433f-b644-9aa3eae85937","Type":"ContainerDied","Data":"6feeea57ea5b40e213efe92826dc377f85d1355d80c79473b3d790d2a355624b"} Jan 31 09:04:28 crc kubenswrapper[4732]: I0131 09:04:28.062745 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6feeea57ea5b40e213efe92826dc377f85d1355d80c79473b3d790d2a355624b" Jan 31 09:04:28 crc kubenswrapper[4732]: I0131 09:04:28.062415 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 31 09:04:33 crc kubenswrapper[4732]: I0131 09:04:33.096206 4732 generic.go:334] "Generic (PLEG): container finished" podID="3d6ffd83-fb99-48e0-a34a-fd365f971ef1" containerID="ca7d5f833a7fd7f21418e6fb1ccdd79977cfae417daaca1c00afcd2310c6650b" exitCode=0 Jan 31 09:04:33 crc kubenswrapper[4732]: I0131 09:04:33.096305 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d7ngt" event={"ID":"3d6ffd83-fb99-48e0-a34a-fd365f971ef1","Type":"ContainerDied","Data":"ca7d5f833a7fd7f21418e6fb1ccdd79977cfae417daaca1c00afcd2310c6650b"} Jan 31 09:04:34 crc kubenswrapper[4732]: I0131 09:04:34.104556 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zm4tc" event={"ID":"21393f97-49f1-4f27-a24c-93f88fe6596b","Type":"ContainerStarted","Data":"056f06710fefcf74e3a8f3ec56d61c726350593a141e241038ff15c0bdb571fa"} Jan 31 09:04:34 crc kubenswrapper[4732]: I0131 09:04:34.107452 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d7ngt" event={"ID":"3d6ffd83-fb99-48e0-a34a-fd365f971ef1","Type":"ContainerStarted","Data":"32ebfc35a37c1e9b6779e2a2b5a09f537a17be1ab00055bd8fd9c0a3bc8002bc"} Jan 31 09:04:34 crc kubenswrapper[4732]: I0131 09:04:34.144492 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-d7ngt" podStartSLOduration=4.107056758 podStartE2EDuration="59.144466527s" podCreationTimestamp="2026-01-31 09:03:35 +0000 UTC" firstStartedPulling="2026-01-31 09:03:38.62184422 +0000 UTC m=+156.927720424" lastFinishedPulling="2026-01-31 09:04:33.659253989 +0000 UTC m=+211.965130193" observedRunningTime="2026-01-31 09:04:34.141148185 +0000 UTC m=+212.447024399" watchObservedRunningTime="2026-01-31 09:04:34.144466527 +0000 UTC m=+212.450342731" Jan 31 09:04:35 crc kubenswrapper[4732]: I0131 09:04:35.120105 4732 generic.go:334] "Generic (PLEG): container finished" podID="21393f97-49f1-4f27-a24c-93f88fe6596b" containerID="056f06710fefcf74e3a8f3ec56d61c726350593a141e241038ff15c0bdb571fa" exitCode=0 Jan 31 09:04:35 crc kubenswrapper[4732]: I0131 09:04:35.120182 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zm4tc" event={"ID":"21393f97-49f1-4f27-a24c-93f88fe6596b","Type":"ContainerDied","Data":"056f06710fefcf74e3a8f3ec56d61c726350593a141e241038ff15c0bdb571fa"} Jan 31 09:04:35 crc kubenswrapper[4732]: I0131 09:04:35.822710 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-d7ngt" Jan 31 09:04:35 crc kubenswrapper[4732]: I0131 09:04:35.823053 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-d7ngt" Jan 31 09:04:35 crc kubenswrapper[4732]: I0131 09:04:35.958170 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-d7ngt" Jan 31 09:04:36 crc kubenswrapper[4732]: I0131 09:04:36.127266 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zm4tc" event={"ID":"21393f97-49f1-4f27-a24c-93f88fe6596b","Type":"ContainerStarted","Data":"e62da55b95d81d1296cd0d4dc32e38fc2f1cdae4524faa40e7adaa74ec4b6e42"} Jan 31 09:04:36 crc kubenswrapper[4732]: I0131 09:04:36.129955 4732 generic.go:334] "Generic (PLEG): container finished" 
podID="111ca852-fddd-4fb1-8d5d-331fd5921a71" containerID="3a1d4a96a5189510003839cb3107e60917aaf8cd548467e8cbeac4847c512fbd" exitCode=0 Jan 31 09:04:36 crc kubenswrapper[4732]: I0131 09:04:36.130016 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gb54f" event={"ID":"111ca852-fddd-4fb1-8d5d-331fd5921a71","Type":"ContainerDied","Data":"3a1d4a96a5189510003839cb3107e60917aaf8cd548467e8cbeac4847c512fbd"} Jan 31 09:04:36 crc kubenswrapper[4732]: I0131 09:04:36.133096 4732 generic.go:334] "Generic (PLEG): container finished" podID="60fab354-a742-4d49-88d9-22843a857ea5" containerID="dcb61f969a4ae5bab5da17004391ab9facb17348431592ecfb379299b5c6a248" exitCode=0 Jan 31 09:04:36 crc kubenswrapper[4732]: I0131 09:04:36.133130 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2jzzz" event={"ID":"60fab354-a742-4d49-88d9-22843a857ea5","Type":"ContainerDied","Data":"dcb61f969a4ae5bab5da17004391ab9facb17348431592ecfb379299b5c6a248"} Jan 31 09:04:36 crc kubenswrapper[4732]: I0131 09:04:36.145570 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zm4tc" podStartSLOduration=2.195853665 podStartE2EDuration="59.145547288s" podCreationTimestamp="2026-01-31 09:03:37 +0000 UTC" firstStartedPulling="2026-01-31 09:03:38.607527659 +0000 UTC m=+156.913403863" lastFinishedPulling="2026-01-31 09:04:35.557221282 +0000 UTC m=+213.863097486" observedRunningTime="2026-01-31 09:04:36.14441682 +0000 UTC m=+214.450293044" watchObservedRunningTime="2026-01-31 09:04:36.145547288 +0000 UTC m=+214.451423492" Jan 31 09:04:37 crc kubenswrapper[4732]: I0131 09:04:37.140883 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gb54f" event={"ID":"111ca852-fddd-4fb1-8d5d-331fd5921a71","Type":"ContainerStarted","Data":"ea8e81d27f0fb9fac5964e929f2d106d5646b1bf0175c1c60abc1799d1ad0f4a"} Jan 31 09:04:37 crc kubenswrapper[4732]: I0131 09:04:37.143277 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2jzzz" event={"ID":"60fab354-a742-4d49-88d9-22843a857ea5","Type":"ContainerStarted","Data":"2c08da5c8ccbc31693fa4ac757f74d0642d9c7b6beaf27556c5171f34d657f82"} Jan 31 09:04:37 crc kubenswrapper[4732]: I0131 09:04:37.145776 4732 generic.go:334] "Generic (PLEG): container finished" podID="7006b68f-caf9-44a9-a6df-26e7b594b931" containerID="57ef0bf4e5dc7d6762819c0d28bec5f496f13d673b9961467d54456931c326d3" exitCode=0 Jan 31 09:04:37 crc kubenswrapper[4732]: I0131 09:04:37.145823 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6z6vm" event={"ID":"7006b68f-caf9-44a9-a6df-26e7b594b931","Type":"ContainerDied","Data":"57ef0bf4e5dc7d6762819c0d28bec5f496f13d673b9961467d54456931c326d3"} Jan 31 09:04:37 crc kubenswrapper[4732]: I0131 09:04:37.162977 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-gb54f" podStartSLOduration=3.116631031 podStartE2EDuration="1m4.162951668s" podCreationTimestamp="2026-01-31 09:03:33 +0000 UTC" firstStartedPulling="2026-01-31 09:03:35.554412596 +0000 UTC m=+153.860288800" lastFinishedPulling="2026-01-31 09:04:36.600733233 +0000 UTC m=+214.906609437" observedRunningTime="2026-01-31 09:04:37.162378478 +0000 UTC m=+215.468254692" watchObservedRunningTime="2026-01-31 09:04:37.162951668 +0000 UTC m=+215.468827872" Jan 31 09:04:37 crc 
kubenswrapper[4732]: I0131 09:04:37.215069 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2jzzz" podStartSLOduration=4.187002837 podStartE2EDuration="1m2.215046935s" podCreationTimestamp="2026-01-31 09:03:35 +0000 UTC" firstStartedPulling="2026-01-31 09:03:38.601698693 +0000 UTC m=+156.907574887" lastFinishedPulling="2026-01-31 09:04:36.629742781 +0000 UTC m=+214.935618985" observedRunningTime="2026-01-31 09:04:37.206964952 +0000 UTC m=+215.512841156" watchObservedRunningTime="2026-01-31 09:04:37.215046935 +0000 UTC m=+215.520923139" Jan 31 09:04:37 crc kubenswrapper[4732]: I0131 09:04:37.439508 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zm4tc" Jan 31 09:04:37 crc kubenswrapper[4732]: I0131 09:04:37.439573 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zm4tc" Jan 31 09:04:38 crc kubenswrapper[4732]: I0131 09:04:38.152651 4732 generic.go:334] "Generic (PLEG): container finished" podID="317b5076-0f62-45e5-9db0-8d03103c990e" containerID="a394a4a7aba70397e2142712f48785a38bdafa854de238e63e52f068af8200df" exitCode=0 Jan 31 09:04:38 crc kubenswrapper[4732]: I0131 09:04:38.152725 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zflvq" event={"ID":"317b5076-0f62-45e5-9db0-8d03103c990e","Type":"ContainerDied","Data":"a394a4a7aba70397e2142712f48785a38bdafa854de238e63e52f068af8200df"} Jan 31 09:04:38 crc kubenswrapper[4732]: I0131 09:04:38.155425 4732 generic.go:334] "Generic (PLEG): container finished" podID="b03cae03-72c1-4b13-8031-33381e6df48a" containerID="cb6fa0e2f74568ca7be1a8e4ce25991aeb89ab0933a72f514639c1cb8e80b123" exitCode=0 Jan 31 09:04:38 crc kubenswrapper[4732]: I0131 09:04:38.155468 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vdcdv" event={"ID":"b03cae03-72c1-4b13-8031-33381e6df48a","Type":"ContainerDied","Data":"cb6fa0e2f74568ca7be1a8e4ce25991aeb89ab0933a72f514639c1cb8e80b123"} Jan 31 09:04:38 crc kubenswrapper[4732]: I0131 09:04:38.159155 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6z6vm" event={"ID":"7006b68f-caf9-44a9-a6df-26e7b594b931","Type":"ContainerStarted","Data":"0f914774fa96054fd035bd65f6de0e6eceaa4af864e8f50ebeec42c0731c97cc"} Jan 31 09:04:38 crc kubenswrapper[4732]: I0131 09:04:38.161155 4732 generic.go:334] "Generic (PLEG): container finished" podID="320c2656-6f30-4922-835e-8c27a82800b1" containerID="7300efb33be6a4fcedc56ac9376d4a558057b335ad7ac3766bbdc2dea4d36c86" exitCode=0 Jan 31 09:04:38 crc kubenswrapper[4732]: I0131 09:04:38.161220 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rtg8l" event={"ID":"320c2656-6f30-4922-835e-8c27a82800b1","Type":"ContainerDied","Data":"7300efb33be6a4fcedc56ac9376d4a558057b335ad7ac3766bbdc2dea4d36c86"} Jan 31 09:04:38 crc kubenswrapper[4732]: I0131 09:04:38.196531 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6z6vm" podStartSLOduration=2.162773319 podStartE2EDuration="1m4.196505692s" podCreationTimestamp="2026-01-31 09:03:34 +0000 UTC" firstStartedPulling="2026-01-31 09:03:35.531329309 +0000 UTC m=+153.837205523" lastFinishedPulling="2026-01-31 09:04:37.565061692 +0000 UTC m=+215.870937896" observedRunningTime="2026-01-31 
Jan 31 09:04:38 crc kubenswrapper[4732]: I0131 09:04:38.196531 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6z6vm" podStartSLOduration=2.162773319 podStartE2EDuration="1m4.196505692s" podCreationTimestamp="2026-01-31 09:03:34 +0000 UTC" firstStartedPulling="2026-01-31 09:03:35.531329309 +0000 UTC m=+153.837205523" lastFinishedPulling="2026-01-31 09:04:37.565061692 +0000 UTC m=+215.870937896" observedRunningTime="2026-01-31 09:04:38.194927678 +0000 UTC m=+216.500803882" watchObservedRunningTime="2026-01-31 09:04:38.196505692 +0000 UTC m=+216.502381896"
Jan 31 09:04:38 crc kubenswrapper[4732]: I0131 09:04:38.483248 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zm4tc" podUID="21393f97-49f1-4f27-a24c-93f88fe6596b" containerName="registry-server" probeResult="failure" output=<
Jan 31 09:04:38 crc kubenswrapper[4732]: timeout: failed to connect service ":50051" within 1s
Jan 31 09:04:38 crc kubenswrapper[4732]: >
Jan 31 09:04:39 crc kubenswrapper[4732]: I0131 09:04:39.169195 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rtg8l" event={"ID":"320c2656-6f30-4922-835e-8c27a82800b1","Type":"ContainerStarted","Data":"c9d4c116354d5c6376aaf4e43c684f7a010f6b0330ee1591e2d15233005a8f2d"}
Jan 31 09:04:39 crc kubenswrapper[4732]: I0131 09:04:39.172174 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zflvq" event={"ID":"317b5076-0f62-45e5-9db0-8d03103c990e","Type":"ContainerStarted","Data":"55d6734cf3e7c05d22db17a72a71b78b94e9eb31ed346ef88787ef99a88abf99"}
Jan 31 09:04:39 crc kubenswrapper[4732]: I0131 09:04:39.175476 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vdcdv" event={"ID":"b03cae03-72c1-4b13-8031-33381e6df48a","Type":"ContainerStarted","Data":"3f07a87f998adaa1b0b96893986a963301b7919b28a28c13a83cb15126a9cbc6"}
Jan 31 09:04:39 crc kubenswrapper[4732]: I0131 09:04:39.193837 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rtg8l" podStartSLOduration=3.137973783 podStartE2EDuration="1m6.193812163s" podCreationTimestamp="2026-01-31 09:03:33 +0000 UTC" firstStartedPulling="2026-01-31 09:03:35.535910444 +0000 UTC m=+153.841786648" lastFinishedPulling="2026-01-31 09:04:38.591748824 +0000 UTC m=+216.897625028" observedRunningTime="2026-01-31 09:04:39.189324892 +0000 UTC m=+217.495201106" watchObservedRunningTime="2026-01-31 09:04:39.193812163 +0000 UTC m=+217.499688367"
Jan 31 09:04:39 crc kubenswrapper[4732]: I0131 09:04:39.210755 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vdcdv" podStartSLOduration=3.107831587 podStartE2EDuration="1m3.210735784s" podCreationTimestamp="2026-01-31 09:03:36 +0000 UTC" firstStartedPulling="2026-01-31 09:03:38.595919979 +0000 UTC m=+156.901796183" lastFinishedPulling="2026-01-31 09:04:38.698824166 +0000 UTC m=+217.004700380" observedRunningTime="2026-01-31 09:04:39.207081211 +0000 UTC m=+217.512957415" watchObservedRunningTime="2026-01-31 09:04:39.210735784 +0000 UTC m=+217.516611988"
Jan 31 09:04:39 crc kubenswrapper[4732]: I0131 09:04:39.226263 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zflvq" podStartSLOduration=3.147929828 podStartE2EDuration="1m6.226245788s" podCreationTimestamp="2026-01-31 09:03:33 +0000 UTC" firstStartedPulling="2026-01-31 09:03:35.543504999 +0000 UTC m=+153.849381203" lastFinishedPulling="2026-01-31 09:04:38.621820959 +0000 UTC m=+216.927697163" observedRunningTime="2026-01-31 09:04:39.225683339 +0000 UTC m=+217.531559543" watchObservedRunningTime="2026-01-31 09:04:39.226245788 +0000 UTC m=+217.532121992"
pod="openshift-marketplace/community-operators-rtg8l" Jan 31 09:04:43 crc kubenswrapper[4732]: I0131 09:04:43.873558 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rtg8l" Jan 31 09:04:43 crc kubenswrapper[4732]: I0131 09:04:43.919017 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rtg8l" Jan 31 09:04:44 crc kubenswrapper[4732]: I0131 09:04:44.023693 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-gb54f" Jan 31 09:04:44 crc kubenswrapper[4732]: I0131 09:04:44.023959 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-gb54f" Jan 31 09:04:44 crc kubenswrapper[4732]: I0131 09:04:44.076826 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gb54f" Jan 31 09:04:44 crc kubenswrapper[4732]: I0131 09:04:44.245140 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-zflvq" Jan 31 09:04:44 crc kubenswrapper[4732]: I0131 09:04:44.245672 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-zflvq" Jan 31 09:04:44 crc kubenswrapper[4732]: I0131 09:04:44.245802 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-gb54f" Jan 31 09:04:44 crc kubenswrapper[4732]: I0131 09:04:44.259101 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rtg8l" Jan 31 09:04:44 crc kubenswrapper[4732]: I0131 09:04:44.299774 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-zflvq" Jan 31 09:04:44 crc kubenswrapper[4732]: I0131 09:04:44.458867 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6z6vm" Jan 31 09:04:44 crc kubenswrapper[4732]: I0131 09:04:44.458918 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6z6vm" Jan 31 09:04:44 crc kubenswrapper[4732]: I0131 09:04:44.501439 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6z6vm" Jan 31 09:04:45 crc kubenswrapper[4732]: I0131 09:04:45.245903 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6z6vm" Jan 31 09:04:45 crc kubenswrapper[4732]: I0131 09:04:45.249436 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zflvq" Jan 31 09:04:45 crc kubenswrapper[4732]: I0131 09:04:45.861803 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-d7ngt" Jan 31 09:04:46 crc kubenswrapper[4732]: I0131 09:04:46.274770 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2jzzz" Jan 31 09:04:46 crc kubenswrapper[4732]: I0131 09:04:46.274833 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2jzzz" Jan 31 09:04:46 crc kubenswrapper[4732]: I0131 09:04:46.310577 4732 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2jzzz" Jan 31 09:04:46 crc kubenswrapper[4732]: I0131 09:04:46.384899 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6z6vm"] Jan 31 09:04:47 crc kubenswrapper[4732]: I0131 09:04:47.030904 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vdcdv" Jan 31 09:04:47 crc kubenswrapper[4732]: I0131 09:04:47.031234 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vdcdv" Jan 31 09:04:47 crc kubenswrapper[4732]: I0131 09:04:47.070883 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vdcdv" Jan 31 09:04:47 crc kubenswrapper[4732]: I0131 09:04:47.217338 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6z6vm" podUID="7006b68f-caf9-44a9-a6df-26e7b594b931" containerName="registry-server" containerID="cri-o://0f914774fa96054fd035bd65f6de0e6eceaa4af864e8f50ebeec42c0731c97cc" gracePeriod=2 Jan 31 09:04:47 crc kubenswrapper[4732]: I0131 09:04:47.252056 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vdcdv" Jan 31 09:04:47 crc kubenswrapper[4732]: I0131 09:04:47.252573 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2jzzz" Jan 31 09:04:47 crc kubenswrapper[4732]: I0131 09:04:47.383743 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zflvq"] Jan 31 09:04:47 crc kubenswrapper[4732]: I0131 09:04:47.482400 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zm4tc" Jan 31 09:04:47 crc kubenswrapper[4732]: I0131 09:04:47.499448 4732 patch_prober.go:28] interesting pod/machine-config-daemon-jnbt8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 09:04:47 crc kubenswrapper[4732]: I0131 09:04:47.499530 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" podUID="7d790207-d357-4b47-87bf-5b505e061820" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 09:04:47 crc kubenswrapper[4732]: I0131 09:04:47.499579 4732 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" Jan 31 09:04:47 crc kubenswrapper[4732]: I0131 09:04:47.503405 4732 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ad1f906b34fddd34efd9d099479cd1fe7404da4ab11f3f571f9e3120167505a5"} pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 31 09:04:47 crc kubenswrapper[4732]: I0131 09:04:47.503601 4732 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" podUID="7d790207-d357-4b47-87bf-5b505e061820" containerName="machine-config-daemon" containerID="cri-o://ad1f906b34fddd34efd9d099479cd1fe7404da4ab11f3f571f9e3120167505a5" gracePeriod=600 Jan 31 09:04:47 crc kubenswrapper[4732]: I0131 09:04:47.528642 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zm4tc" Jan 31 09:04:48 crc kubenswrapper[4732]: I0131 09:04:48.222320 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-zflvq" podUID="317b5076-0f62-45e5-9db0-8d03103c990e" containerName="registry-server" containerID="cri-o://55d6734cf3e7c05d22db17a72a71b78b94e9eb31ed346ef88787ef99a88abf99" gracePeriod=2 Jan 31 09:04:48 crc kubenswrapper[4732]: I0131 09:04:48.783681 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2jzzz"] Jan 31 09:04:49 crc kubenswrapper[4732]: I0131 09:04:49.231692 4732 generic.go:334] "Generic (PLEG): container finished" podID="7d790207-d357-4b47-87bf-5b505e061820" containerID="ad1f906b34fddd34efd9d099479cd1fe7404da4ab11f3f571f9e3120167505a5" exitCode=0 Jan 31 09:04:49 crc kubenswrapper[4732]: I0131 09:04:49.231853 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" event={"ID":"7d790207-d357-4b47-87bf-5b505e061820","Type":"ContainerDied","Data":"ad1f906b34fddd34efd9d099479cd1fe7404da4ab11f3f571f9e3120167505a5"} Jan 31 09:04:49 crc kubenswrapper[4732]: I0131 09:04:49.232067 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" event={"ID":"7d790207-d357-4b47-87bf-5b505e061820","Type":"ContainerStarted","Data":"1a1af67f6e9c90030eed50fdab77c62259e76a7813864bb504390768e9501756"} Jan 31 09:04:49 crc kubenswrapper[4732]: I0131 09:04:49.235639 4732 generic.go:334] "Generic (PLEG): container finished" podID="7006b68f-caf9-44a9-a6df-26e7b594b931" containerID="0f914774fa96054fd035bd65f6de0e6eceaa4af864e8f50ebeec42c0731c97cc" exitCode=0 Jan 31 09:04:49 crc kubenswrapper[4732]: I0131 09:04:49.235713 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6z6vm" event={"ID":"7006b68f-caf9-44a9-a6df-26e7b594b931","Type":"ContainerDied","Data":"0f914774fa96054fd035bd65f6de0e6eceaa4af864e8f50ebeec42c0731c97cc"} Jan 31 09:04:49 crc kubenswrapper[4732]: I0131 09:04:49.241035 4732 generic.go:334] "Generic (PLEG): container finished" podID="317b5076-0f62-45e5-9db0-8d03103c990e" containerID="55d6734cf3e7c05d22db17a72a71b78b94e9eb31ed346ef88787ef99a88abf99" exitCode=0 Jan 31 09:04:49 crc kubenswrapper[4732]: I0131 09:04:49.241115 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zflvq" event={"ID":"317b5076-0f62-45e5-9db0-8d03103c990e","Type":"ContainerDied","Data":"55d6734cf3e7c05d22db17a72a71b78b94e9eb31ed346ef88787ef99a88abf99"} Jan 31 09:04:49 crc kubenswrapper[4732]: I0131 09:04:49.241355 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2jzzz" podUID="60fab354-a742-4d49-88d9-22843a857ea5" containerName="registry-server" containerID="cri-o://2c08da5c8ccbc31693fa4ac757f74d0642d9c7b6beaf27556c5171f34d657f82" gracePeriod=2 Jan 31 09:04:49 crc kubenswrapper[4732]: I0131 09:04:49.584351 4732 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/certified-operators-6z6vm" Jan 31 09:04:49 crc kubenswrapper[4732]: I0131 09:04:49.661713 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2jzzz" Jan 31 09:04:49 crc kubenswrapper[4732]: I0131 09:04:49.666367 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m2f4q\" (UniqueName: \"kubernetes.io/projected/7006b68f-caf9-44a9-a6df-26e7b594b931-kube-api-access-m2f4q\") pod \"7006b68f-caf9-44a9-a6df-26e7b594b931\" (UID: \"7006b68f-caf9-44a9-a6df-26e7b594b931\") " Jan 31 09:04:49 crc kubenswrapper[4732]: I0131 09:04:49.666498 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7006b68f-caf9-44a9-a6df-26e7b594b931-utilities\") pod \"7006b68f-caf9-44a9-a6df-26e7b594b931\" (UID: \"7006b68f-caf9-44a9-a6df-26e7b594b931\") " Jan 31 09:04:49 crc kubenswrapper[4732]: I0131 09:04:49.666540 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7006b68f-caf9-44a9-a6df-26e7b594b931-catalog-content\") pod \"7006b68f-caf9-44a9-a6df-26e7b594b931\" (UID: \"7006b68f-caf9-44a9-a6df-26e7b594b931\") " Jan 31 09:04:49 crc kubenswrapper[4732]: I0131 09:04:49.671709 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7006b68f-caf9-44a9-a6df-26e7b594b931-kube-api-access-m2f4q" (OuterVolumeSpecName: "kube-api-access-m2f4q") pod "7006b68f-caf9-44a9-a6df-26e7b594b931" (UID: "7006b68f-caf9-44a9-a6df-26e7b594b931"). InnerVolumeSpecName "kube-api-access-m2f4q". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:04:49 crc kubenswrapper[4732]: I0131 09:04:49.683650 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7006b68f-caf9-44a9-a6df-26e7b594b931-utilities" (OuterVolumeSpecName: "utilities") pod "7006b68f-caf9-44a9-a6df-26e7b594b931" (UID: "7006b68f-caf9-44a9-a6df-26e7b594b931"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:04:49 crc kubenswrapper[4732]: I0131 09:04:49.719121 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7006b68f-caf9-44a9-a6df-26e7b594b931-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7006b68f-caf9-44a9-a6df-26e7b594b931" (UID: "7006b68f-caf9-44a9-a6df-26e7b594b931"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:04:49 crc kubenswrapper[4732]: I0131 09:04:49.768494 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nm96d\" (UniqueName: \"kubernetes.io/projected/60fab354-a742-4d49-88d9-22843a857ea5-kube-api-access-nm96d\") pod \"60fab354-a742-4d49-88d9-22843a857ea5\" (UID: \"60fab354-a742-4d49-88d9-22843a857ea5\") " Jan 31 09:04:49 crc kubenswrapper[4732]: I0131 09:04:49.768569 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60fab354-a742-4d49-88d9-22843a857ea5-catalog-content\") pod \"60fab354-a742-4d49-88d9-22843a857ea5\" (UID: \"60fab354-a742-4d49-88d9-22843a857ea5\") " Jan 31 09:04:49 crc kubenswrapper[4732]: I0131 09:04:49.768693 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60fab354-a742-4d49-88d9-22843a857ea5-utilities\") pod \"60fab354-a742-4d49-88d9-22843a857ea5\" (UID: \"60fab354-a742-4d49-88d9-22843a857ea5\") " Jan 31 09:04:49 crc kubenswrapper[4732]: I0131 09:04:49.770877 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60fab354-a742-4d49-88d9-22843a857ea5-utilities" (OuterVolumeSpecName: "utilities") pod "60fab354-a742-4d49-88d9-22843a857ea5" (UID: "60fab354-a742-4d49-88d9-22843a857ea5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:04:49 crc kubenswrapper[4732]: I0131 09:04:49.771360 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7006b68f-caf9-44a9-a6df-26e7b594b931-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 09:04:49 crc kubenswrapper[4732]: I0131 09:04:49.771375 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7006b68f-caf9-44a9-a6df-26e7b594b931-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 09:04:49 crc kubenswrapper[4732]: I0131 09:04:49.771391 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m2f4q\" (UniqueName: \"kubernetes.io/projected/7006b68f-caf9-44a9-a6df-26e7b594b931-kube-api-access-m2f4q\") on node \"crc\" DevicePath \"\"" Jan 31 09:04:49 crc kubenswrapper[4732]: I0131 09:04:49.771409 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60fab354-a742-4d49-88d9-22843a857ea5-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 09:04:49 crc kubenswrapper[4732]: I0131 09:04:49.775477 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60fab354-a742-4d49-88d9-22843a857ea5-kube-api-access-nm96d" (OuterVolumeSpecName: "kube-api-access-nm96d") pod "60fab354-a742-4d49-88d9-22843a857ea5" (UID: "60fab354-a742-4d49-88d9-22843a857ea5"). InnerVolumeSpecName "kube-api-access-nm96d". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:04:49 crc kubenswrapper[4732]: I0131 09:04:49.806431 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60fab354-a742-4d49-88d9-22843a857ea5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "60fab354-a742-4d49-88d9-22843a857ea5" (UID: "60fab354-a742-4d49-88d9-22843a857ea5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:04:49 crc kubenswrapper[4732]: I0131 09:04:49.873113 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nm96d\" (UniqueName: \"kubernetes.io/projected/60fab354-a742-4d49-88d9-22843a857ea5-kube-api-access-nm96d\") on node \"crc\" DevicePath \"\"" Jan 31 09:04:49 crc kubenswrapper[4732]: I0131 09:04:49.873156 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60fab354-a742-4d49-88d9-22843a857ea5-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 09:04:49 crc kubenswrapper[4732]: I0131 09:04:49.972678 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zflvq" Jan 31 09:04:50 crc kubenswrapper[4732]: I0131 09:04:50.076219 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/317b5076-0f62-45e5-9db0-8d03103c990e-utilities\") pod \"317b5076-0f62-45e5-9db0-8d03103c990e\" (UID: \"317b5076-0f62-45e5-9db0-8d03103c990e\") " Jan 31 09:04:50 crc kubenswrapper[4732]: I0131 09:04:50.076331 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p95zf\" (UniqueName: \"kubernetes.io/projected/317b5076-0f62-45e5-9db0-8d03103c990e-kube-api-access-p95zf\") pod \"317b5076-0f62-45e5-9db0-8d03103c990e\" (UID: \"317b5076-0f62-45e5-9db0-8d03103c990e\") " Jan 31 09:04:50 crc kubenswrapper[4732]: I0131 09:04:50.076419 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/317b5076-0f62-45e5-9db0-8d03103c990e-catalog-content\") pod \"317b5076-0f62-45e5-9db0-8d03103c990e\" (UID: \"317b5076-0f62-45e5-9db0-8d03103c990e\") " Jan 31 09:04:50 crc kubenswrapper[4732]: I0131 09:04:50.077675 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/317b5076-0f62-45e5-9db0-8d03103c990e-utilities" (OuterVolumeSpecName: "utilities") pod "317b5076-0f62-45e5-9db0-8d03103c990e" (UID: "317b5076-0f62-45e5-9db0-8d03103c990e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:04:50 crc kubenswrapper[4732]: I0131 09:04:50.080250 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/317b5076-0f62-45e5-9db0-8d03103c990e-kube-api-access-p95zf" (OuterVolumeSpecName: "kube-api-access-p95zf") pod "317b5076-0f62-45e5-9db0-8d03103c990e" (UID: "317b5076-0f62-45e5-9db0-8d03103c990e"). InnerVolumeSpecName "kube-api-access-p95zf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:04:50 crc kubenswrapper[4732]: I0131 09:04:50.134385 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/317b5076-0f62-45e5-9db0-8d03103c990e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "317b5076-0f62-45e5-9db0-8d03103c990e" (UID: "317b5076-0f62-45e5-9db0-8d03103c990e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:04:50 crc kubenswrapper[4732]: I0131 09:04:50.177607 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p95zf\" (UniqueName: \"kubernetes.io/projected/317b5076-0f62-45e5-9db0-8d03103c990e-kube-api-access-p95zf\") on node \"crc\" DevicePath \"\"" Jan 31 09:04:50 crc kubenswrapper[4732]: I0131 09:04:50.177854 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/317b5076-0f62-45e5-9db0-8d03103c990e-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 09:04:50 crc kubenswrapper[4732]: I0131 09:04:50.177950 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/317b5076-0f62-45e5-9db0-8d03103c990e-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 09:04:50 crc kubenswrapper[4732]: I0131 09:04:50.249290 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zflvq" event={"ID":"317b5076-0f62-45e5-9db0-8d03103c990e","Type":"ContainerDied","Data":"e3e3c2c80d6bcc6eb2c4ab51462ea9a6f6a150eb54a71c100214dd3061f85429"} Jan 31 09:04:50 crc kubenswrapper[4732]: I0131 09:04:50.249354 4732 scope.go:117] "RemoveContainer" containerID="55d6734cf3e7c05d22db17a72a71b78b94e9eb31ed346ef88787ef99a88abf99" Jan 31 09:04:50 crc kubenswrapper[4732]: I0131 09:04:50.249353 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zflvq" Jan 31 09:04:50 crc kubenswrapper[4732]: I0131 09:04:50.252525 4732 generic.go:334] "Generic (PLEG): container finished" podID="60fab354-a742-4d49-88d9-22843a857ea5" containerID="2c08da5c8ccbc31693fa4ac757f74d0642d9c7b6beaf27556c5171f34d657f82" exitCode=0 Jan 31 09:04:50 crc kubenswrapper[4732]: I0131 09:04:50.252604 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2jzzz" event={"ID":"60fab354-a742-4d49-88d9-22843a857ea5","Type":"ContainerDied","Data":"2c08da5c8ccbc31693fa4ac757f74d0642d9c7b6beaf27556c5171f34d657f82"} Jan 31 09:04:50 crc kubenswrapper[4732]: I0131 09:04:50.252638 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2jzzz" event={"ID":"60fab354-a742-4d49-88d9-22843a857ea5","Type":"ContainerDied","Data":"9ea503aa2d078ea5c77d132dad08c175b5d8c3c762d663e871a3d782440f1d6e"} Jan 31 09:04:50 crc kubenswrapper[4732]: I0131 09:04:50.252753 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2jzzz" Jan 31 09:04:50 crc kubenswrapper[4732]: I0131 09:04:50.257476 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6z6vm" event={"ID":"7006b68f-caf9-44a9-a6df-26e7b594b931","Type":"ContainerDied","Data":"57ca1e232ee4486e6e4d54b1df45d6e963d8c6e1b497c54651b39471c8475a9c"} Jan 31 09:04:50 crc kubenswrapper[4732]: I0131 09:04:50.257569 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6z6vm" Jan 31 09:04:50 crc kubenswrapper[4732]: I0131 09:04:50.274771 4732 scope.go:117] "RemoveContainer" containerID="a394a4a7aba70397e2142712f48785a38bdafa854de238e63e52f068af8200df" Jan 31 09:04:50 crc kubenswrapper[4732]: I0131 09:04:50.286368 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2jzzz"] Jan 31 09:04:50 crc kubenswrapper[4732]: I0131 09:04:50.288656 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2jzzz"] Jan 31 09:04:50 crc kubenswrapper[4732]: I0131 09:04:50.299155 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zflvq"] Jan 31 09:04:50 crc kubenswrapper[4732]: I0131 09:04:50.302690 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-zflvq"] Jan 31 09:04:50 crc kubenswrapper[4732]: I0131 09:04:50.311715 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6z6vm"] Jan 31 09:04:50 crc kubenswrapper[4732]: I0131 09:04:50.314902 4732 scope.go:117] "RemoveContainer" containerID="d955c74da8ffd285d20400dc34dfd51736c439bd9e7f63e99e5270665cbbadb8" Jan 31 09:04:50 crc kubenswrapper[4732]: I0131 09:04:50.316781 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6z6vm"] Jan 31 09:04:50 crc kubenswrapper[4732]: I0131 09:04:50.327571 4732 scope.go:117] "RemoveContainer" containerID="2c08da5c8ccbc31693fa4ac757f74d0642d9c7b6beaf27556c5171f34d657f82" Jan 31 09:04:50 crc kubenswrapper[4732]: I0131 09:04:50.343881 4732 scope.go:117] "RemoveContainer" containerID="dcb61f969a4ae5bab5da17004391ab9facb17348431592ecfb379299b5c6a248" Jan 31 09:04:50 crc kubenswrapper[4732]: I0131 09:04:50.358764 4732 scope.go:117] "RemoveContainer" containerID="5530625f265d27e331890e3d6915df6f40e55b5333079f815cda7a875fb373a3" Jan 31 09:04:50 crc kubenswrapper[4732]: I0131 09:04:50.380692 4732 scope.go:117] "RemoveContainer" containerID="2c08da5c8ccbc31693fa4ac757f74d0642d9c7b6beaf27556c5171f34d657f82" Jan 31 09:04:50 crc kubenswrapper[4732]: E0131 09:04:50.381283 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c08da5c8ccbc31693fa4ac757f74d0642d9c7b6beaf27556c5171f34d657f82\": container with ID starting with 2c08da5c8ccbc31693fa4ac757f74d0642d9c7b6beaf27556c5171f34d657f82 not found: ID does not exist" containerID="2c08da5c8ccbc31693fa4ac757f74d0642d9c7b6beaf27556c5171f34d657f82" Jan 31 09:04:50 crc kubenswrapper[4732]: I0131 09:04:50.381319 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c08da5c8ccbc31693fa4ac757f74d0642d9c7b6beaf27556c5171f34d657f82"} err="failed to get container status \"2c08da5c8ccbc31693fa4ac757f74d0642d9c7b6beaf27556c5171f34d657f82\": rpc error: code = NotFound desc = could not find container \"2c08da5c8ccbc31693fa4ac757f74d0642d9c7b6beaf27556c5171f34d657f82\": container with ID starting with 2c08da5c8ccbc31693fa4ac757f74d0642d9c7b6beaf27556c5171f34d657f82 not found: ID does not exist" Jan 31 09:04:50 crc kubenswrapper[4732]: I0131 09:04:50.381372 4732 scope.go:117] "RemoveContainer" containerID="dcb61f969a4ae5bab5da17004391ab9facb17348431592ecfb379299b5c6a248" Jan 31 09:04:50 crc kubenswrapper[4732]: E0131 09:04:50.381998 4732 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"dcb61f969a4ae5bab5da17004391ab9facb17348431592ecfb379299b5c6a248\": container with ID starting with dcb61f969a4ae5bab5da17004391ab9facb17348431592ecfb379299b5c6a248 not found: ID does not exist" containerID="dcb61f969a4ae5bab5da17004391ab9facb17348431592ecfb379299b5c6a248" Jan 31 09:04:50 crc kubenswrapper[4732]: I0131 09:04:50.382042 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcb61f969a4ae5bab5da17004391ab9facb17348431592ecfb379299b5c6a248"} err="failed to get container status \"dcb61f969a4ae5bab5da17004391ab9facb17348431592ecfb379299b5c6a248\": rpc error: code = NotFound desc = could not find container \"dcb61f969a4ae5bab5da17004391ab9facb17348431592ecfb379299b5c6a248\": container with ID starting with dcb61f969a4ae5bab5da17004391ab9facb17348431592ecfb379299b5c6a248 not found: ID does not exist" Jan 31 09:04:50 crc kubenswrapper[4732]: I0131 09:04:50.382079 4732 scope.go:117] "RemoveContainer" containerID="5530625f265d27e331890e3d6915df6f40e55b5333079f815cda7a875fb373a3" Jan 31 09:04:50 crc kubenswrapper[4732]: E0131 09:04:50.382487 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5530625f265d27e331890e3d6915df6f40e55b5333079f815cda7a875fb373a3\": container with ID starting with 5530625f265d27e331890e3d6915df6f40e55b5333079f815cda7a875fb373a3 not found: ID does not exist" containerID="5530625f265d27e331890e3d6915df6f40e55b5333079f815cda7a875fb373a3" Jan 31 09:04:50 crc kubenswrapper[4732]: I0131 09:04:50.382517 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5530625f265d27e331890e3d6915df6f40e55b5333079f815cda7a875fb373a3"} err="failed to get container status \"5530625f265d27e331890e3d6915df6f40e55b5333079f815cda7a875fb373a3\": rpc error: code = NotFound desc = could not find container \"5530625f265d27e331890e3d6915df6f40e55b5333079f815cda7a875fb373a3\": container with ID starting with 5530625f265d27e331890e3d6915df6f40e55b5333079f815cda7a875fb373a3 not found: ID does not exist" Jan 31 09:04:50 crc kubenswrapper[4732]: I0131 09:04:50.382538 4732 scope.go:117] "RemoveContainer" containerID="0f914774fa96054fd035bd65f6de0e6eceaa4af864e8f50ebeec42c0731c97cc" Jan 31 09:04:50 crc kubenswrapper[4732]: I0131 09:04:50.405072 4732 scope.go:117] "RemoveContainer" containerID="57ef0bf4e5dc7d6762819c0d28bec5f496f13d673b9961467d54456931c326d3" Jan 31 09:04:50 crc kubenswrapper[4732]: I0131 09:04:50.420888 4732 scope.go:117] "RemoveContainer" containerID="b38eca76de0c7fcb760be9dbb97b202ed5f6069cdb395b15ae3017074d198d5e" Jan 31 09:04:50 crc kubenswrapper[4732]: I0131 09:04:50.548655 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="317b5076-0f62-45e5-9db0-8d03103c990e" path="/var/lib/kubelet/pods/317b5076-0f62-45e5-9db0-8d03103c990e/volumes" Jan 31 09:04:50 crc kubenswrapper[4732]: I0131 09:04:50.549368 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60fab354-a742-4d49-88d9-22843a857ea5" path="/var/lib/kubelet/pods/60fab354-a742-4d49-88d9-22843a857ea5/volumes" Jan 31 09:04:50 crc kubenswrapper[4732]: I0131 09:04:50.550012 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7006b68f-caf9-44a9-a6df-26e7b594b931" path="/var/lib/kubelet/pods/7006b68f-caf9-44a9-a6df-26e7b594b931/volumes" Jan 31 09:04:51 crc kubenswrapper[4732]: I0131 09:04:51.191064 4732 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zm4tc"] Jan 31 09:04:51 crc kubenswrapper[4732]: I0131 09:04:51.191982 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zm4tc" podUID="21393f97-49f1-4f27-a24c-93f88fe6596b" containerName="registry-server" containerID="cri-o://e62da55b95d81d1296cd0d4dc32e38fc2f1cdae4524faa40e7adaa74ec4b6e42" gracePeriod=2 Jan 31 09:04:51 crc kubenswrapper[4732]: I0131 09:04:51.588527 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zm4tc" Jan 31 09:04:51 crc kubenswrapper[4732]: I0131 09:04:51.697839 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21393f97-49f1-4f27-a24c-93f88fe6596b-utilities\") pod \"21393f97-49f1-4f27-a24c-93f88fe6596b\" (UID: \"21393f97-49f1-4f27-a24c-93f88fe6596b\") " Jan 31 09:04:51 crc kubenswrapper[4732]: I0131 09:04:51.697884 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21393f97-49f1-4f27-a24c-93f88fe6596b-catalog-content\") pod \"21393f97-49f1-4f27-a24c-93f88fe6596b\" (UID: \"21393f97-49f1-4f27-a24c-93f88fe6596b\") " Jan 31 09:04:51 crc kubenswrapper[4732]: I0131 09:04:51.697938 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rzkj6\" (UniqueName: \"kubernetes.io/projected/21393f97-49f1-4f27-a24c-93f88fe6596b-kube-api-access-rzkj6\") pod \"21393f97-49f1-4f27-a24c-93f88fe6596b\" (UID: \"21393f97-49f1-4f27-a24c-93f88fe6596b\") " Jan 31 09:04:51 crc kubenswrapper[4732]: I0131 09:04:51.699110 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21393f97-49f1-4f27-a24c-93f88fe6596b-utilities" (OuterVolumeSpecName: "utilities") pod "21393f97-49f1-4f27-a24c-93f88fe6596b" (UID: "21393f97-49f1-4f27-a24c-93f88fe6596b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:04:51 crc kubenswrapper[4732]: I0131 09:04:51.703289 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21393f97-49f1-4f27-a24c-93f88fe6596b-kube-api-access-rzkj6" (OuterVolumeSpecName: "kube-api-access-rzkj6") pod "21393f97-49f1-4f27-a24c-93f88fe6596b" (UID: "21393f97-49f1-4f27-a24c-93f88fe6596b"). InnerVolumeSpecName "kube-api-access-rzkj6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:04:51 crc kubenswrapper[4732]: I0131 09:04:51.799341 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21393f97-49f1-4f27-a24c-93f88fe6596b-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 09:04:51 crc kubenswrapper[4732]: I0131 09:04:51.799385 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rzkj6\" (UniqueName: \"kubernetes.io/projected/21393f97-49f1-4f27-a24c-93f88fe6596b-kube-api-access-rzkj6\") on node \"crc\" DevicePath \"\"" Jan 31 09:04:51 crc kubenswrapper[4732]: I0131 09:04:51.827616 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21393f97-49f1-4f27-a24c-93f88fe6596b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "21393f97-49f1-4f27-a24c-93f88fe6596b" (UID: "21393f97-49f1-4f27-a24c-93f88fe6596b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:04:51 crc kubenswrapper[4732]: I0131 09:04:51.900191 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21393f97-49f1-4f27-a24c-93f88fe6596b-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 09:04:52 crc kubenswrapper[4732]: I0131 09:04:52.272404 4732 generic.go:334] "Generic (PLEG): container finished" podID="21393f97-49f1-4f27-a24c-93f88fe6596b" containerID="e62da55b95d81d1296cd0d4dc32e38fc2f1cdae4524faa40e7adaa74ec4b6e42" exitCode=0 Jan 31 09:04:52 crc kubenswrapper[4732]: I0131 09:04:52.272484 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zm4tc" Jan 31 09:04:52 crc kubenswrapper[4732]: I0131 09:04:52.272519 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zm4tc" event={"ID":"21393f97-49f1-4f27-a24c-93f88fe6596b","Type":"ContainerDied","Data":"e62da55b95d81d1296cd0d4dc32e38fc2f1cdae4524faa40e7adaa74ec4b6e42"} Jan 31 09:04:52 crc kubenswrapper[4732]: I0131 09:04:52.272914 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zm4tc" event={"ID":"21393f97-49f1-4f27-a24c-93f88fe6596b","Type":"ContainerDied","Data":"72ec2ce1dae672abaa2d0accf204fcab615644a223c21af740bfdd14a5e3a998"} Jan 31 09:04:52 crc kubenswrapper[4732]: I0131 09:04:52.272934 4732 scope.go:117] "RemoveContainer" containerID="e62da55b95d81d1296cd0d4dc32e38fc2f1cdae4524faa40e7adaa74ec4b6e42" Jan 31 09:04:52 crc kubenswrapper[4732]: I0131 09:04:52.289060 4732 scope.go:117] "RemoveContainer" containerID="056f06710fefcf74e3a8f3ec56d61c726350593a141e241038ff15c0bdb571fa" Jan 31 09:04:52 crc kubenswrapper[4732]: I0131 09:04:52.301455 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zm4tc"] Jan 31 09:04:52 crc kubenswrapper[4732]: I0131 09:04:52.305009 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zm4tc"] Jan 31 09:04:52 crc kubenswrapper[4732]: I0131 09:04:52.329165 4732 scope.go:117] "RemoveContainer" containerID="b61b2dcac543f0ef563e0d943499efa08ba5e1c076661ef3c5b44487b1cd0867" Jan 31 09:04:52 crc kubenswrapper[4732]: I0131 09:04:52.344726 4732 scope.go:117] "RemoveContainer" containerID="e62da55b95d81d1296cd0d4dc32e38fc2f1cdae4524faa40e7adaa74ec4b6e42" Jan 31 09:04:52 crc kubenswrapper[4732]: E0131 09:04:52.345213 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e62da55b95d81d1296cd0d4dc32e38fc2f1cdae4524faa40e7adaa74ec4b6e42\": container with ID starting with e62da55b95d81d1296cd0d4dc32e38fc2f1cdae4524faa40e7adaa74ec4b6e42 not found: ID does not exist" containerID="e62da55b95d81d1296cd0d4dc32e38fc2f1cdae4524faa40e7adaa74ec4b6e42" Jan 31 09:04:52 crc kubenswrapper[4732]: I0131 09:04:52.345287 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e62da55b95d81d1296cd0d4dc32e38fc2f1cdae4524faa40e7adaa74ec4b6e42"} err="failed to get container status \"e62da55b95d81d1296cd0d4dc32e38fc2f1cdae4524faa40e7adaa74ec4b6e42\": rpc error: code = NotFound desc = could not find container \"e62da55b95d81d1296cd0d4dc32e38fc2f1cdae4524faa40e7adaa74ec4b6e42\": container with ID starting with e62da55b95d81d1296cd0d4dc32e38fc2f1cdae4524faa40e7adaa74ec4b6e42 not found: ID does not exist" Jan 31 09:04:52 crc 
Jan 31 09:04:52 crc kubenswrapper[4732]: I0131 09:04:52.345318 4732 scope.go:117] "RemoveContainer" containerID="056f06710fefcf74e3a8f3ec56d61c726350593a141e241038ff15c0bdb571fa"
Jan 31 09:04:52 crc kubenswrapper[4732]: E0131 09:04:52.345691 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"056f06710fefcf74e3a8f3ec56d61c726350593a141e241038ff15c0bdb571fa\": container with ID starting with 056f06710fefcf74e3a8f3ec56d61c726350593a141e241038ff15c0bdb571fa not found: ID does not exist" containerID="056f06710fefcf74e3a8f3ec56d61c726350593a141e241038ff15c0bdb571fa"
Jan 31 09:04:52 crc kubenswrapper[4732]: I0131 09:04:52.345722 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"056f06710fefcf74e3a8f3ec56d61c726350593a141e241038ff15c0bdb571fa"} err="failed to get container status \"056f06710fefcf74e3a8f3ec56d61c726350593a141e241038ff15c0bdb571fa\": rpc error: code = NotFound desc = could not find container \"056f06710fefcf74e3a8f3ec56d61c726350593a141e241038ff15c0bdb571fa\": container with ID starting with 056f06710fefcf74e3a8f3ec56d61c726350593a141e241038ff15c0bdb571fa not found: ID does not exist"
Jan 31 09:04:52 crc kubenswrapper[4732]: I0131 09:04:52.345744 4732 scope.go:117] "RemoveContainer" containerID="b61b2dcac543f0ef563e0d943499efa08ba5e1c076661ef3c5b44487b1cd0867"
Jan 31 09:04:52 crc kubenswrapper[4732]: E0131 09:04:52.346032 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b61b2dcac543f0ef563e0d943499efa08ba5e1c076661ef3c5b44487b1cd0867\": container with ID starting with b61b2dcac543f0ef563e0d943499efa08ba5e1c076661ef3c5b44487b1cd0867 not found: ID does not exist" containerID="b61b2dcac543f0ef563e0d943499efa08ba5e1c076661ef3c5b44487b1cd0867"
Jan 31 09:04:52 crc kubenswrapper[4732]: I0131 09:04:52.346064 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b61b2dcac543f0ef563e0d943499efa08ba5e1c076661ef3c5b44487b1cd0867"} err="failed to get container status \"b61b2dcac543f0ef563e0d943499efa08ba5e1c076661ef3c5b44487b1cd0867\": rpc error: code = NotFound desc = could not find container \"b61b2dcac543f0ef563e0d943499efa08ba5e1c076661ef3c5b44487b1cd0867\": container with ID starting with b61b2dcac543f0ef563e0d943499efa08ba5e1c076661ef3c5b44487b1cd0867 not found: ID does not exist"
Jan 31 09:04:52 crc kubenswrapper[4732]: I0131 09:04:52.558191 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21393f97-49f1-4f27-a24c-93f88fe6596b" path="/var/lib/kubelet/pods/21393f97-49f1-4f27-a24c-93f88fe6596b/volumes"
Jan 31 09:04:55 crc kubenswrapper[4732]: I0131 09:04:55.098002 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-c8t6l"]
Jan 31 09:04:55 crc kubenswrapper[4732]: I0131 09:04:55.445864 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gb54f"]
Jan 31 09:04:55 crc kubenswrapper[4732]: I0131 09:04:55.446259 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-gb54f" podUID="111ca852-fddd-4fb1-8d5d-331fd5921a71" containerName="registry-server" containerID="cri-o://ea8e81d27f0fb9fac5964e929f2d106d5646b1bf0175c1c60abc1799d1ad0f4a" gracePeriod=30
pods=["openshift-marketplace/community-operators-rtg8l"] Jan 31 09:04:55 crc kubenswrapper[4732]: I0131 09:04:55.463534 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rtg8l" podUID="320c2656-6f30-4922-835e-8c27a82800b1" containerName="registry-server" containerID="cri-o://c9d4c116354d5c6376aaf4e43c684f7a010f6b0330ee1591e2d15233005a8f2d" gracePeriod=30 Jan 31 09:04:55 crc kubenswrapper[4732]: I0131 09:04:55.479757 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ljds4"] Jan 31 09:04:55 crc kubenswrapper[4732]: I0131 09:04:55.480041 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-ljds4" podUID="f4d0ed50-aa9b-4a62-b340-882ddf73f008" containerName="marketplace-operator" containerID="cri-o://393baf9619f9cf20095d6f558e1acd40e97479a63b2c71e28acd340a9b8c0183" gracePeriod=30 Jan 31 09:04:55 crc kubenswrapper[4732]: I0131 09:04:55.491081 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-d7ngt"] Jan 31 09:04:55 crc kubenswrapper[4732]: I0131 09:04:55.491375 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-d7ngt" podUID="3d6ffd83-fb99-48e0-a34a-fd365f971ef1" containerName="registry-server" containerID="cri-o://32ebfc35a37c1e9b6779e2a2b5a09f537a17be1ab00055bd8fd9c0a3bc8002bc" gracePeriod=30 Jan 31 09:04:55 crc kubenswrapper[4732]: I0131 09:04:55.497863 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vdcdv"] Jan 31 09:04:55 crc kubenswrapper[4732]: I0131 09:04:55.498114 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-vdcdv" podUID="b03cae03-72c1-4b13-8031-33381e6df48a" containerName="registry-server" containerID="cri-o://3f07a87f998adaa1b0b96893986a963301b7919b28a28c13a83cb15126a9cbc6" gracePeriod=30 Jan 31 09:04:55 crc kubenswrapper[4732]: I0131 09:04:55.501988 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8bchs"] Jan 31 09:04:55 crc kubenswrapper[4732]: E0131 09:04:55.502275 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7006b68f-caf9-44a9-a6df-26e7b594b931" containerName="extract-utilities" Jan 31 09:04:55 crc kubenswrapper[4732]: I0131 09:04:55.502297 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="7006b68f-caf9-44a9-a6df-26e7b594b931" containerName="extract-utilities" Jan 31 09:04:55 crc kubenswrapper[4732]: E0131 09:04:55.502311 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21393f97-49f1-4f27-a24c-93f88fe6596b" containerName="extract-utilities" Jan 31 09:04:55 crc kubenswrapper[4732]: I0131 09:04:55.502320 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="21393f97-49f1-4f27-a24c-93f88fe6596b" containerName="extract-utilities" Jan 31 09:04:55 crc kubenswrapper[4732]: E0131 09:04:55.502331 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21393f97-49f1-4f27-a24c-93f88fe6596b" containerName="extract-content" Jan 31 09:04:55 crc kubenswrapper[4732]: I0131 09:04:55.502340 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="21393f97-49f1-4f27-a24c-93f88fe6596b" containerName="extract-content" Jan 31 09:04:55 crc kubenswrapper[4732]: E0131 09:04:55.502349 4732 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="317b5076-0f62-45e5-9db0-8d03103c990e" containerName="extract-utilities" Jan 31 09:04:55 crc kubenswrapper[4732]: I0131 09:04:55.502358 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="317b5076-0f62-45e5-9db0-8d03103c990e" containerName="extract-utilities" Jan 31 09:04:55 crc kubenswrapper[4732]: E0131 09:04:55.502372 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60fab354-a742-4d49-88d9-22843a857ea5" containerName="extract-utilities" Jan 31 09:04:55 crc kubenswrapper[4732]: I0131 09:04:55.502381 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="60fab354-a742-4d49-88d9-22843a857ea5" containerName="extract-utilities" Jan 31 09:04:55 crc kubenswrapper[4732]: E0131 09:04:55.502396 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7497adea-5f95-433f-b644-9aa3eae85937" containerName="pruner" Jan 31 09:04:55 crc kubenswrapper[4732]: I0131 09:04:55.502404 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="7497adea-5f95-433f-b644-9aa3eae85937" containerName="pruner" Jan 31 09:04:55 crc kubenswrapper[4732]: E0131 09:04:55.502416 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7006b68f-caf9-44a9-a6df-26e7b594b931" containerName="registry-server" Jan 31 09:04:55 crc kubenswrapper[4732]: I0131 09:04:55.502423 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="7006b68f-caf9-44a9-a6df-26e7b594b931" containerName="registry-server" Jan 31 09:04:55 crc kubenswrapper[4732]: E0131 09:04:55.502434 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7006b68f-caf9-44a9-a6df-26e7b594b931" containerName="extract-content" Jan 31 09:04:55 crc kubenswrapper[4732]: I0131 09:04:55.502442 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="7006b68f-caf9-44a9-a6df-26e7b594b931" containerName="extract-content" Jan 31 09:04:55 crc kubenswrapper[4732]: E0131 09:04:55.502451 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="317b5076-0f62-45e5-9db0-8d03103c990e" containerName="extract-content" Jan 31 09:04:55 crc kubenswrapper[4732]: I0131 09:04:55.502460 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="317b5076-0f62-45e5-9db0-8d03103c990e" containerName="extract-content" Jan 31 09:04:55 crc kubenswrapper[4732]: E0131 09:04:55.502477 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="317b5076-0f62-45e5-9db0-8d03103c990e" containerName="registry-server" Jan 31 09:04:55 crc kubenswrapper[4732]: I0131 09:04:55.502484 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="317b5076-0f62-45e5-9db0-8d03103c990e" containerName="registry-server" Jan 31 09:04:55 crc kubenswrapper[4732]: E0131 09:04:55.502494 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21393f97-49f1-4f27-a24c-93f88fe6596b" containerName="registry-server" Jan 31 09:04:55 crc kubenswrapper[4732]: I0131 09:04:55.502501 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="21393f97-49f1-4f27-a24c-93f88fe6596b" containerName="registry-server" Jan 31 09:04:55 crc kubenswrapper[4732]: E0131 09:04:55.502514 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60fab354-a742-4d49-88d9-22843a857ea5" containerName="extract-content" Jan 31 09:04:55 crc kubenswrapper[4732]: I0131 09:04:55.502522 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="60fab354-a742-4d49-88d9-22843a857ea5" containerName="extract-content" Jan 31 09:04:55 crc kubenswrapper[4732]: E0131 09:04:55.502531 4732 
Jan 31 09:04:55 crc kubenswrapper[4732]: E0131 09:04:55.502531 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60fab354-a742-4d49-88d9-22843a857ea5" containerName="registry-server"
Jan 31 09:04:55 crc kubenswrapper[4732]: I0131 09:04:55.502538 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="60fab354-a742-4d49-88d9-22843a857ea5" containerName="registry-server"
Jan 31 09:04:55 crc kubenswrapper[4732]: I0131 09:04:55.502683 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="7497adea-5f95-433f-b644-9aa3eae85937" containerName="pruner"
Jan 31 09:04:55 crc kubenswrapper[4732]: I0131 09:04:55.502701 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="7006b68f-caf9-44a9-a6df-26e7b594b931" containerName="registry-server"
Jan 31 09:04:55 crc kubenswrapper[4732]: I0131 09:04:55.502714 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="60fab354-a742-4d49-88d9-22843a857ea5" containerName="registry-server"
Jan 31 09:04:55 crc kubenswrapper[4732]: I0131 09:04:55.502726 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="317b5076-0f62-45e5-9db0-8d03103c990e" containerName="registry-server"
Jan 31 09:04:55 crc kubenswrapper[4732]: I0131 09:04:55.502739 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="21393f97-49f1-4f27-a24c-93f88fe6596b" containerName="registry-server"
Jan 31 09:04:55 crc kubenswrapper[4732]: I0131 09:04:55.503306 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8bchs"
Jan 31 09:04:55 crc kubenswrapper[4732]: I0131 09:04:55.503433 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8bchs"]
Jan 31 09:04:55 crc kubenswrapper[4732]: I0131 09:04:55.646398 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhl9h\" (UniqueName: \"kubernetes.io/projected/d0dbfc52-f4e9-462a-a253-2bb950c04e7b-kube-api-access-vhl9h\") pod \"marketplace-operator-79b997595-8bchs\" (UID: \"d0dbfc52-f4e9-462a-a253-2bb950c04e7b\") " pod="openshift-marketplace/marketplace-operator-79b997595-8bchs"
Jan 31 09:04:55 crc kubenswrapper[4732]: I0131 09:04:55.646765 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d0dbfc52-f4e9-462a-a253-2bb950c04e7b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8bchs\" (UID: \"d0dbfc52-f4e9-462a-a253-2bb950c04e7b\") " pod="openshift-marketplace/marketplace-operator-79b997595-8bchs"
Jan 31 09:04:55 crc kubenswrapper[4732]: I0131 09:04:55.646806 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d0dbfc52-f4e9-462a-a253-2bb950c04e7b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8bchs\" (UID: \"d0dbfc52-f4e9-462a-a253-2bb950c04e7b\") " pod="openshift-marketplace/marketplace-operator-79b997595-8bchs"
Jan 31 09:04:55 crc kubenswrapper[4732]: I0131 09:04:55.747573 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhl9h\" (UniqueName: \"kubernetes.io/projected/d0dbfc52-f4e9-462a-a253-2bb950c04e7b-kube-api-access-vhl9h\") pod \"marketplace-operator-79b997595-8bchs\" (UID: \"d0dbfc52-f4e9-462a-a253-2bb950c04e7b\") " pod="openshift-marketplace/marketplace-operator-79b997595-8bchs"
Jan 31 09:04:55 crc kubenswrapper[4732]: I0131 09:04:55.747623 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d0dbfc52-f4e9-462a-a253-2bb950c04e7b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8bchs\" (UID: \"d0dbfc52-f4e9-462a-a253-2bb950c04e7b\") " pod="openshift-marketplace/marketplace-operator-79b997595-8bchs"
Jan 31 09:04:55 crc kubenswrapper[4732]: I0131 09:04:55.747654 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d0dbfc52-f4e9-462a-a253-2bb950c04e7b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8bchs\" (UID: \"d0dbfc52-f4e9-462a-a253-2bb950c04e7b\") " pod="openshift-marketplace/marketplace-operator-79b997595-8bchs"
Jan 31 09:04:55 crc kubenswrapper[4732]: I0131 09:04:55.749364 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d0dbfc52-f4e9-462a-a253-2bb950c04e7b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8bchs\" (UID: \"d0dbfc52-f4e9-462a-a253-2bb950c04e7b\") " pod="openshift-marketplace/marketplace-operator-79b997595-8bchs"
Jan 31 09:04:55 crc kubenswrapper[4732]: I0131 09:04:55.756455 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d0dbfc52-f4e9-462a-a253-2bb950c04e7b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8bchs\" (UID: \"d0dbfc52-f4e9-462a-a253-2bb950c04e7b\") " pod="openshift-marketplace/marketplace-operator-79b997595-8bchs"
Jan 31 09:04:55 crc kubenswrapper[4732]: I0131 09:04:55.766778 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhl9h\" (UniqueName: \"kubernetes.io/projected/d0dbfc52-f4e9-462a-a253-2bb950c04e7b-kube-api-access-vhl9h\") pod \"marketplace-operator-79b997595-8bchs\" (UID: \"d0dbfc52-f4e9-462a-a253-2bb950c04e7b\") " pod="openshift-marketplace/marketplace-operator-79b997595-8bchs"
Jan 31 09:04:55 crc kubenswrapper[4732]: E0131 09:04:55.824789 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 32ebfc35a37c1e9b6779e2a2b5a09f537a17be1ab00055bd8fd9c0a3bc8002bc is running failed: container process not found" containerID="32ebfc35a37c1e9b6779e2a2b5a09f537a17be1ab00055bd8fd9c0a3bc8002bc" cmd=["grpc_health_probe","-addr=:50051"]
Jan 31 09:04:55 crc kubenswrapper[4732]: E0131 09:04:55.826125 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 32ebfc35a37c1e9b6779e2a2b5a09f537a17be1ab00055bd8fd9c0a3bc8002bc is running failed: container process not found" containerID="32ebfc35a37c1e9b6779e2a2b5a09f537a17be1ab00055bd8fd9c0a3bc8002bc" cmd=["grpc_health_probe","-addr=:50051"]
Jan 31 09:04:55 crc kubenswrapper[4732]: E0131 09:04:55.826872 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 32ebfc35a37c1e9b6779e2a2b5a09f537a17be1ab00055bd8fd9c0a3bc8002bc is running failed: container process not found" containerID="32ebfc35a37c1e9b6779e2a2b5a09f537a17be1ab00055bd8fd9c0a3bc8002bc" cmd=["grpc_health_probe","-addr=:50051"]
Jan 31 09:04:55 crc kubenswrapper[4732]: E0131 09:04:55.827071 4732 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 32ebfc35a37c1e9b6779e2a2b5a09f537a17be1ab00055bd8fd9c0a3bc8002bc is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-d7ngt" podUID="3d6ffd83-fb99-48e0-a34a-fd365f971ef1" containerName="registry-server"
Jan 31 09:04:55 crc kubenswrapper[4732]: I0131 09:04:55.894441 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8bchs"
Jan 31 09:04:55 crc kubenswrapper[4732]: I0131 09:04:55.922042 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gb54f"
Jan 31 09:04:55 crc kubenswrapper[4732]: I0131 09:04:55.936783 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rtg8l"
Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.027022 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vdcdv"
Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.031582 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d7ngt"
Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.050478 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-654wm\" (UniqueName: \"kubernetes.io/projected/320c2656-6f30-4922-835e-8c27a82800b1-kube-api-access-654wm\") pod \"320c2656-6f30-4922-835e-8c27a82800b1\" (UID: \"320c2656-6f30-4922-835e-8c27a82800b1\") "
Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.050974 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/320c2656-6f30-4922-835e-8c27a82800b1-utilities\") pod \"320c2656-6f30-4922-835e-8c27a82800b1\" (UID: \"320c2656-6f30-4922-835e-8c27a82800b1\") "
Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.051020 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27rqc\" (UniqueName: \"kubernetes.io/projected/111ca852-fddd-4fb1-8d5d-331fd5921a71-kube-api-access-27rqc\") pod \"111ca852-fddd-4fb1-8d5d-331fd5921a71\" (UID: \"111ca852-fddd-4fb1-8d5d-331fd5921a71\") "
Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.051136 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/111ca852-fddd-4fb1-8d5d-331fd5921a71-utilities\") pod \"111ca852-fddd-4fb1-8d5d-331fd5921a71\" (UID: \"111ca852-fddd-4fb1-8d5d-331fd5921a71\") "
Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.051299 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/111ca852-fddd-4fb1-8d5d-331fd5921a71-catalog-content\") pod \"111ca852-fddd-4fb1-8d5d-331fd5921a71\" (UID: \"111ca852-fddd-4fb1-8d5d-331fd5921a71\") "
Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.051339 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/320c2656-6f30-4922-835e-8c27a82800b1-catalog-content\") pod \"320c2656-6f30-4922-835e-8c27a82800b1\" (UID: \"320c2656-6f30-4922-835e-8c27a82800b1\") "
Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.058142 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/111ca852-fddd-4fb1-8d5d-331fd5921a71-utilities" (OuterVolumeSpecName: "utilities") pod "111ca852-fddd-4fb1-8d5d-331fd5921a71" (UID: "111ca852-fddd-4fb1-8d5d-331fd5921a71"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.063427 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/320c2656-6f30-4922-835e-8c27a82800b1-utilities" (OuterVolumeSpecName: "utilities") pod "320c2656-6f30-4922-835e-8c27a82800b1" (UID: "320c2656-6f30-4922-835e-8c27a82800b1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.068049 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-ljds4"
Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.071324 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/320c2656-6f30-4922-835e-8c27a82800b1-kube-api-access-654wm" (OuterVolumeSpecName: "kube-api-access-654wm") pod "320c2656-6f30-4922-835e-8c27a82800b1" (UID: "320c2656-6f30-4922-835e-8c27a82800b1"). InnerVolumeSpecName "kube-api-access-654wm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.081856 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/111ca852-fddd-4fb1-8d5d-331fd5921a71-kube-api-access-27rqc" (OuterVolumeSpecName: "kube-api-access-27rqc") pod "111ca852-fddd-4fb1-8d5d-331fd5921a71" (UID: "111ca852-fddd-4fb1-8d5d-331fd5921a71"). InnerVolumeSpecName "kube-api-access-27rqc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.138902 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/111ca852-fddd-4fb1-8d5d-331fd5921a71-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "111ca852-fddd-4fb1-8d5d-331fd5921a71" (UID: "111ca852-fddd-4fb1-8d5d-331fd5921a71"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.155280 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d6ffd83-fb99-48e0-a34a-fd365f971ef1-utilities\") pod \"3d6ffd83-fb99-48e0-a34a-fd365f971ef1\" (UID: \"3d6ffd83-fb99-48e0-a34a-fd365f971ef1\") "
Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.155425 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f4d0ed50-aa9b-4a62-b340-882ddf73f008-marketplace-trusted-ca\") pod \"f4d0ed50-aa9b-4a62-b340-882ddf73f008\" (UID: \"f4d0ed50-aa9b-4a62-b340-882ddf73f008\") "
Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.155450 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-67nvg\" (UniqueName: \"kubernetes.io/projected/f4d0ed50-aa9b-4a62-b340-882ddf73f008-kube-api-access-67nvg\") pod \"f4d0ed50-aa9b-4a62-b340-882ddf73f008\" (UID: \"f4d0ed50-aa9b-4a62-b340-882ddf73f008\") "
Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.155467 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2dzc9\" (UniqueName: \"kubernetes.io/projected/b03cae03-72c1-4b13-8031-33381e6df48a-kube-api-access-2dzc9\") pod \"b03cae03-72c1-4b13-8031-33381e6df48a\" (UID: \"b03cae03-72c1-4b13-8031-33381e6df48a\") "
Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.155559 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f4d0ed50-aa9b-4a62-b340-882ddf73f008-marketplace-operator-metrics\") pod \"f4d0ed50-aa9b-4a62-b340-882ddf73f008\" (UID: \"f4d0ed50-aa9b-4a62-b340-882ddf73f008\") "
Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.155585 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b03cae03-72c1-4b13-8031-33381e6df48a-utilities\") pod \"b03cae03-72c1-4b13-8031-33381e6df48a\" (UID: \"b03cae03-72c1-4b13-8031-33381e6df48a\") "
Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.155602 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b03cae03-72c1-4b13-8031-33381e6df48a-catalog-content\") pod \"b03cae03-72c1-4b13-8031-33381e6df48a\" (UID: \"b03cae03-72c1-4b13-8031-33381e6df48a\") "
Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.155651 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88rk8\" (UniqueName: \"kubernetes.io/projected/3d6ffd83-fb99-48e0-a34a-fd365f971ef1-kube-api-access-88rk8\") pod \"3d6ffd83-fb99-48e0-a34a-fd365f971ef1\" (UID: \"3d6ffd83-fb99-48e0-a34a-fd365f971ef1\") "
Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.155684 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d6ffd83-fb99-48e0-a34a-fd365f971ef1-catalog-content\") pod \"3d6ffd83-fb99-48e0-a34a-fd365f971ef1\" (UID: \"3d6ffd83-fb99-48e0-a34a-fd365f971ef1\") "
DevicePath \"\"" Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.155898 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/111ca852-fddd-4fb1-8d5d-331fd5921a71-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.155909 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-654wm\" (UniqueName: \"kubernetes.io/projected/320c2656-6f30-4922-835e-8c27a82800b1-kube-api-access-654wm\") on node \"crc\" DevicePath \"\"" Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.155918 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/320c2656-6f30-4922-835e-8c27a82800b1-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.155926 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-27rqc\" (UniqueName: \"kubernetes.io/projected/111ca852-fddd-4fb1-8d5d-331fd5921a71-kube-api-access-27rqc\") on node \"crc\" DevicePath \"\"" Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.156226 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d6ffd83-fb99-48e0-a34a-fd365f971ef1-utilities" (OuterVolumeSpecName: "utilities") pod "3d6ffd83-fb99-48e0-a34a-fd365f971ef1" (UID: "3d6ffd83-fb99-48e0-a34a-fd365f971ef1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.156315 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4d0ed50-aa9b-4a62-b340-882ddf73f008-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "f4d0ed50-aa9b-4a62-b340-882ddf73f008" (UID: "f4d0ed50-aa9b-4a62-b340-882ddf73f008"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.156924 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b03cae03-72c1-4b13-8031-33381e6df48a-utilities" (OuterVolumeSpecName: "utilities") pod "b03cae03-72c1-4b13-8031-33381e6df48a" (UID: "b03cae03-72c1-4b13-8031-33381e6df48a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.159765 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b03cae03-72c1-4b13-8031-33381e6df48a-kube-api-access-2dzc9" (OuterVolumeSpecName: "kube-api-access-2dzc9") pod "b03cae03-72c1-4b13-8031-33381e6df48a" (UID: "b03cae03-72c1-4b13-8031-33381e6df48a"). InnerVolumeSpecName "kube-api-access-2dzc9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.161544 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d6ffd83-fb99-48e0-a34a-fd365f971ef1-kube-api-access-88rk8" (OuterVolumeSpecName: "kube-api-access-88rk8") pod "3d6ffd83-fb99-48e0-a34a-fd365f971ef1" (UID: "3d6ffd83-fb99-48e0-a34a-fd365f971ef1"). InnerVolumeSpecName "kube-api-access-88rk8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.161968 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4d0ed50-aa9b-4a62-b340-882ddf73f008-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "f4d0ed50-aa9b-4a62-b340-882ddf73f008" (UID: "f4d0ed50-aa9b-4a62-b340-882ddf73f008"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.165415 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4d0ed50-aa9b-4a62-b340-882ddf73f008-kube-api-access-67nvg" (OuterVolumeSpecName: "kube-api-access-67nvg") pod "f4d0ed50-aa9b-4a62-b340-882ddf73f008" (UID: "f4d0ed50-aa9b-4a62-b340-882ddf73f008"). InnerVolumeSpecName "kube-api-access-67nvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.186089 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d6ffd83-fb99-48e0-a34a-fd365f971ef1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3d6ffd83-fb99-48e0-a34a-fd365f971ef1" (UID: "3d6ffd83-fb99-48e0-a34a-fd365f971ef1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.258051 4732 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f4d0ed50-aa9b-4a62-b340-882ddf73f008-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.258103 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b03cae03-72c1-4b13-8031-33381e6df48a-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.258116 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d6ffd83-fb99-48e0-a34a-fd365f971ef1-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.258127 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-88rk8\" (UniqueName: \"kubernetes.io/projected/3d6ffd83-fb99-48e0-a34a-fd365f971ef1-kube-api-access-88rk8\") on node \"crc\" DevicePath \"\"" Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.258136 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d6ffd83-fb99-48e0-a34a-fd365f971ef1-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.258150 4732 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f4d0ed50-aa9b-4a62-b340-882ddf73f008-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.258164 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-67nvg\" (UniqueName: \"kubernetes.io/projected/f4d0ed50-aa9b-4a62-b340-882ddf73f008-kube-api-access-67nvg\") on node \"crc\" DevicePath \"\"" Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.258176 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2dzc9\" (UniqueName: 
\"kubernetes.io/projected/b03cae03-72c1-4b13-8031-33381e6df48a-kube-api-access-2dzc9\") on node \"crc\" DevicePath \"\"" Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.307206 4732 generic.go:334] "Generic (PLEG): container finished" podID="111ca852-fddd-4fb1-8d5d-331fd5921a71" containerID="ea8e81d27f0fb9fac5964e929f2d106d5646b1bf0175c1c60abc1799d1ad0f4a" exitCode=0 Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.307299 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gb54f" event={"ID":"111ca852-fddd-4fb1-8d5d-331fd5921a71","Type":"ContainerDied","Data":"ea8e81d27f0fb9fac5964e929f2d106d5646b1bf0175c1c60abc1799d1ad0f4a"} Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.307332 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gb54f" event={"ID":"111ca852-fddd-4fb1-8d5d-331fd5921a71","Type":"ContainerDied","Data":"2852e87c3d90e55a145bff8290e54ac7389fb8f75ebcdc35c688bd52463d5985"} Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.307358 4732 scope.go:117] "RemoveContainer" containerID="ea8e81d27f0fb9fac5964e929f2d106d5646b1bf0175c1c60abc1799d1ad0f4a" Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.307503 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gb54f" Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.310630 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/320c2656-6f30-4922-835e-8c27a82800b1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "320c2656-6f30-4922-835e-8c27a82800b1" (UID: "320c2656-6f30-4922-835e-8c27a82800b1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.315069 4732 generic.go:334] "Generic (PLEG): container finished" podID="3d6ffd83-fb99-48e0-a34a-fd365f971ef1" containerID="32ebfc35a37c1e9b6779e2a2b5a09f537a17be1ab00055bd8fd9c0a3bc8002bc" exitCode=0 Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.315168 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d7ngt" event={"ID":"3d6ffd83-fb99-48e0-a34a-fd365f971ef1","Type":"ContainerDied","Data":"32ebfc35a37c1e9b6779e2a2b5a09f537a17be1ab00055bd8fd9c0a3bc8002bc"} Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.315117 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d7ngt" Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.315207 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d7ngt" event={"ID":"3d6ffd83-fb99-48e0-a34a-fd365f971ef1","Type":"ContainerDied","Data":"628f0999201f984caa436318589dd8628cafb6a41db0673c62163c5cc780a5ff"} Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.319591 4732 generic.go:334] "Generic (PLEG): container finished" podID="b03cae03-72c1-4b13-8031-33381e6df48a" containerID="3f07a87f998adaa1b0b96893986a963301b7919b28a28c13a83cb15126a9cbc6" exitCode=0 Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.319679 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vdcdv" event={"ID":"b03cae03-72c1-4b13-8031-33381e6df48a","Type":"ContainerDied","Data":"3f07a87f998adaa1b0b96893986a963301b7919b28a28c13a83cb15126a9cbc6"} Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.319701 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vdcdv" event={"ID":"b03cae03-72c1-4b13-8031-33381e6df48a","Type":"ContainerDied","Data":"b2573936f6c20f59b9e6d6e1b605a68abfd082c81e607c5222d95cd6ee9a3427"} Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.319703 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vdcdv" Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.322137 4732 generic.go:334] "Generic (PLEG): container finished" podID="320c2656-6f30-4922-835e-8c27a82800b1" containerID="c9d4c116354d5c6376aaf4e43c684f7a010f6b0330ee1591e2d15233005a8f2d" exitCode=0 Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.322207 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rtg8l" event={"ID":"320c2656-6f30-4922-835e-8c27a82800b1","Type":"ContainerDied","Data":"c9d4c116354d5c6376aaf4e43c684f7a010f6b0330ee1591e2d15233005a8f2d"} Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.322237 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rtg8l" event={"ID":"320c2656-6f30-4922-835e-8c27a82800b1","Type":"ContainerDied","Data":"7ba9bda66cde21334a1fb904223442bb2f107c035dd197b8f6160f7ac322e79d"} Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.322299 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rtg8l" Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.327451 4732 generic.go:334] "Generic (PLEG): container finished" podID="f4d0ed50-aa9b-4a62-b340-882ddf73f008" containerID="393baf9619f9cf20095d6f558e1acd40e97479a63b2c71e28acd340a9b8c0183" exitCode=0 Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.327483 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-ljds4" event={"ID":"f4d0ed50-aa9b-4a62-b340-882ddf73f008","Type":"ContainerDied","Data":"393baf9619f9cf20095d6f558e1acd40e97479a63b2c71e28acd340a9b8c0183"} Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.327502 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-ljds4" event={"ID":"f4d0ed50-aa9b-4a62-b340-882ddf73f008","Type":"ContainerDied","Data":"94900506e78a6c467e226df2c51bd0deeed7b70de0ce789b914b8672f3e90a32"} Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.327592 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-ljds4" Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.328209 4732 scope.go:117] "RemoveContainer" containerID="3a1d4a96a5189510003839cb3107e60917aaf8cd548467e8cbeac4847c512fbd" Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.359089 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/320c2656-6f30-4922-835e-8c27a82800b1-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.359436 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-d7ngt"] Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.361846 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-d7ngt"] Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.373882 4732 scope.go:117] "RemoveContainer" containerID="b12c70aaf2deee35af9c59e7cdfeb63c5438b4e4668e33b49b029529e03925eb" Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.374807 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rtg8l"] Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.378328 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rtg8l"] Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.392655 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b03cae03-72c1-4b13-8031-33381e6df48a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b03cae03-72c1-4b13-8031-33381e6df48a" (UID: "b03cae03-72c1-4b13-8031-33381e6df48a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.394029 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gb54f"] Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.397538 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-gb54f"] Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.399980 4732 scope.go:117] "RemoveContainer" containerID="ea8e81d27f0fb9fac5964e929f2d106d5646b1bf0175c1c60abc1799d1ad0f4a" Jan 31 09:04:56 crc kubenswrapper[4732]: E0131 09:04:56.400436 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea8e81d27f0fb9fac5964e929f2d106d5646b1bf0175c1c60abc1799d1ad0f4a\": container with ID starting with ea8e81d27f0fb9fac5964e929f2d106d5646b1bf0175c1c60abc1799d1ad0f4a not found: ID does not exist" containerID="ea8e81d27f0fb9fac5964e929f2d106d5646b1bf0175c1c60abc1799d1ad0f4a" Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.400476 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea8e81d27f0fb9fac5964e929f2d106d5646b1bf0175c1c60abc1799d1ad0f4a"} err="failed to get container status \"ea8e81d27f0fb9fac5964e929f2d106d5646b1bf0175c1c60abc1799d1ad0f4a\": rpc error: code = NotFound desc = could not find container \"ea8e81d27f0fb9fac5964e929f2d106d5646b1bf0175c1c60abc1799d1ad0f4a\": container with ID starting with ea8e81d27f0fb9fac5964e929f2d106d5646b1bf0175c1c60abc1799d1ad0f4a not found: ID does not exist" Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.400529 4732 scope.go:117] "RemoveContainer" containerID="3a1d4a96a5189510003839cb3107e60917aaf8cd548467e8cbeac4847c512fbd" Jan 31 09:04:56 crc kubenswrapper[4732]: E0131 09:04:56.402242 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a1d4a96a5189510003839cb3107e60917aaf8cd548467e8cbeac4847c512fbd\": container with ID starting with 3a1d4a96a5189510003839cb3107e60917aaf8cd548467e8cbeac4847c512fbd not found: ID does not exist" containerID="3a1d4a96a5189510003839cb3107e60917aaf8cd548467e8cbeac4847c512fbd" Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.402308 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a1d4a96a5189510003839cb3107e60917aaf8cd548467e8cbeac4847c512fbd"} err="failed to get container status \"3a1d4a96a5189510003839cb3107e60917aaf8cd548467e8cbeac4847c512fbd\": rpc error: code = NotFound desc = could not find container \"3a1d4a96a5189510003839cb3107e60917aaf8cd548467e8cbeac4847c512fbd\": container with ID starting with 3a1d4a96a5189510003839cb3107e60917aaf8cd548467e8cbeac4847c512fbd not found: ID does not exist" Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.402346 4732 scope.go:117] "RemoveContainer" containerID="b12c70aaf2deee35af9c59e7cdfeb63c5438b4e4668e33b49b029529e03925eb" Jan 31 09:04:56 crc kubenswrapper[4732]: E0131 09:04:56.404145 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b12c70aaf2deee35af9c59e7cdfeb63c5438b4e4668e33b49b029529e03925eb\": container with ID starting with b12c70aaf2deee35af9c59e7cdfeb63c5438b4e4668e33b49b029529e03925eb not found: ID does not exist" containerID="b12c70aaf2deee35af9c59e7cdfeb63c5438b4e4668e33b49b029529e03925eb" Jan 31 09:04:56 crc 
kubenswrapper[4732]: I0131 09:04:56.404181 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b12c70aaf2deee35af9c59e7cdfeb63c5438b4e4668e33b49b029529e03925eb"} err="failed to get container status \"b12c70aaf2deee35af9c59e7cdfeb63c5438b4e4668e33b49b029529e03925eb\": rpc error: code = NotFound desc = could not find container \"b12c70aaf2deee35af9c59e7cdfeb63c5438b4e4668e33b49b029529e03925eb\": container with ID starting with b12c70aaf2deee35af9c59e7cdfeb63c5438b4e4668e33b49b029529e03925eb not found: ID does not exist" Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.404211 4732 scope.go:117] "RemoveContainer" containerID="32ebfc35a37c1e9b6779e2a2b5a09f537a17be1ab00055bd8fd9c0a3bc8002bc" Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.408746 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ljds4"] Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.411275 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ljds4"] Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.419557 4732 scope.go:117] "RemoveContainer" containerID="ca7d5f833a7fd7f21418e6fb1ccdd79977cfae417daaca1c00afcd2310c6650b" Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.432498 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8bchs"] Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.438680 4732 scope.go:117] "RemoveContainer" containerID="c749f49cfbc861f45b12320940264abaaf152474c82e669652462baab6c5bea1" Jan 31 09:04:56 crc kubenswrapper[4732]: W0131 09:04:56.445321 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0dbfc52_f4e9_462a_a253_2bb950c04e7b.slice/crio-a38616795b18c0402ddac1b409f6ae8b51cd664284f98e405e9a18d7c40ad6fa WatchSource:0}: Error finding container a38616795b18c0402ddac1b409f6ae8b51cd664284f98e405e9a18d7c40ad6fa: Status 404 returned error can't find the container with id a38616795b18c0402ddac1b409f6ae8b51cd664284f98e405e9a18d7c40ad6fa Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.459895 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b03cae03-72c1-4b13-8031-33381e6df48a-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.462053 4732 scope.go:117] "RemoveContainer" containerID="32ebfc35a37c1e9b6779e2a2b5a09f537a17be1ab00055bd8fd9c0a3bc8002bc" Jan 31 09:04:56 crc kubenswrapper[4732]: E0131 09:04:56.462401 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32ebfc35a37c1e9b6779e2a2b5a09f537a17be1ab00055bd8fd9c0a3bc8002bc\": container with ID starting with 32ebfc35a37c1e9b6779e2a2b5a09f537a17be1ab00055bd8fd9c0a3bc8002bc not found: ID does not exist" containerID="32ebfc35a37c1e9b6779e2a2b5a09f537a17be1ab00055bd8fd9c0a3bc8002bc" Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.462430 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32ebfc35a37c1e9b6779e2a2b5a09f537a17be1ab00055bd8fd9c0a3bc8002bc"} err="failed to get container status \"32ebfc35a37c1e9b6779e2a2b5a09f537a17be1ab00055bd8fd9c0a3bc8002bc\": rpc error: code = NotFound desc = could not find container 
\"32ebfc35a37c1e9b6779e2a2b5a09f537a17be1ab00055bd8fd9c0a3bc8002bc\": container with ID starting with 32ebfc35a37c1e9b6779e2a2b5a09f537a17be1ab00055bd8fd9c0a3bc8002bc not found: ID does not exist" Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.462450 4732 scope.go:117] "RemoveContainer" containerID="ca7d5f833a7fd7f21418e6fb1ccdd79977cfae417daaca1c00afcd2310c6650b" Jan 31 09:04:56 crc kubenswrapper[4732]: E0131 09:04:56.463028 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca7d5f833a7fd7f21418e6fb1ccdd79977cfae417daaca1c00afcd2310c6650b\": container with ID starting with ca7d5f833a7fd7f21418e6fb1ccdd79977cfae417daaca1c00afcd2310c6650b not found: ID does not exist" containerID="ca7d5f833a7fd7f21418e6fb1ccdd79977cfae417daaca1c00afcd2310c6650b" Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.463089 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca7d5f833a7fd7f21418e6fb1ccdd79977cfae417daaca1c00afcd2310c6650b"} err="failed to get container status \"ca7d5f833a7fd7f21418e6fb1ccdd79977cfae417daaca1c00afcd2310c6650b\": rpc error: code = NotFound desc = could not find container \"ca7d5f833a7fd7f21418e6fb1ccdd79977cfae417daaca1c00afcd2310c6650b\": container with ID starting with ca7d5f833a7fd7f21418e6fb1ccdd79977cfae417daaca1c00afcd2310c6650b not found: ID does not exist" Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.463119 4732 scope.go:117] "RemoveContainer" containerID="c749f49cfbc861f45b12320940264abaaf152474c82e669652462baab6c5bea1" Jan 31 09:04:56 crc kubenswrapper[4732]: E0131 09:04:56.463384 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c749f49cfbc861f45b12320940264abaaf152474c82e669652462baab6c5bea1\": container with ID starting with c749f49cfbc861f45b12320940264abaaf152474c82e669652462baab6c5bea1 not found: ID does not exist" containerID="c749f49cfbc861f45b12320940264abaaf152474c82e669652462baab6c5bea1" Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.463413 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c749f49cfbc861f45b12320940264abaaf152474c82e669652462baab6c5bea1"} err="failed to get container status \"c749f49cfbc861f45b12320940264abaaf152474c82e669652462baab6c5bea1\": rpc error: code = NotFound desc = could not find container \"c749f49cfbc861f45b12320940264abaaf152474c82e669652462baab6c5bea1\": container with ID starting with c749f49cfbc861f45b12320940264abaaf152474c82e669652462baab6c5bea1 not found: ID does not exist" Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.463427 4732 scope.go:117] "RemoveContainer" containerID="3f07a87f998adaa1b0b96893986a963301b7919b28a28c13a83cb15126a9cbc6" Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.478843 4732 scope.go:117] "RemoveContainer" containerID="cb6fa0e2f74568ca7be1a8e4ce25991aeb89ab0933a72f514639c1cb8e80b123" Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.494928 4732 scope.go:117] "RemoveContainer" containerID="4468cd510de754308c18a9a7af365ee982a0fdce8768e86c8b2291d7c729cfc1" Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.529313 4732 scope.go:117] "RemoveContainer" containerID="3f07a87f998adaa1b0b96893986a963301b7919b28a28c13a83cb15126a9cbc6" Jan 31 09:04:56 crc kubenswrapper[4732]: E0131 09:04:56.529793 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = 
Jan 31 09:04:56 crc kubenswrapper[4732]: E0131 09:04:56.529793 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f07a87f998adaa1b0b96893986a963301b7919b28a28c13a83cb15126a9cbc6\": container with ID starting with 3f07a87f998adaa1b0b96893986a963301b7919b28a28c13a83cb15126a9cbc6 not found: ID does not exist" containerID="3f07a87f998adaa1b0b96893986a963301b7919b28a28c13a83cb15126a9cbc6"
Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.529833 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f07a87f998adaa1b0b96893986a963301b7919b28a28c13a83cb15126a9cbc6"} err="failed to get container status \"3f07a87f998adaa1b0b96893986a963301b7919b28a28c13a83cb15126a9cbc6\": rpc error: code = NotFound desc = could not find container \"3f07a87f998adaa1b0b96893986a963301b7919b28a28c13a83cb15126a9cbc6\": container with ID starting with 3f07a87f998adaa1b0b96893986a963301b7919b28a28c13a83cb15126a9cbc6 not found: ID does not exist"
Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.529867 4732 scope.go:117] "RemoveContainer" containerID="cb6fa0e2f74568ca7be1a8e4ce25991aeb89ab0933a72f514639c1cb8e80b123"
Jan 31 09:04:56 crc kubenswrapper[4732]: E0131 09:04:56.530133 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb6fa0e2f74568ca7be1a8e4ce25991aeb89ab0933a72f514639c1cb8e80b123\": container with ID starting with cb6fa0e2f74568ca7be1a8e4ce25991aeb89ab0933a72f514639c1cb8e80b123 not found: ID does not exist" containerID="cb6fa0e2f74568ca7be1a8e4ce25991aeb89ab0933a72f514639c1cb8e80b123"
Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.530175 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb6fa0e2f74568ca7be1a8e4ce25991aeb89ab0933a72f514639c1cb8e80b123"} err="failed to get container status \"cb6fa0e2f74568ca7be1a8e4ce25991aeb89ab0933a72f514639c1cb8e80b123\": rpc error: code = NotFound desc = could not find container \"cb6fa0e2f74568ca7be1a8e4ce25991aeb89ab0933a72f514639c1cb8e80b123\": container with ID starting with cb6fa0e2f74568ca7be1a8e4ce25991aeb89ab0933a72f514639c1cb8e80b123 not found: ID does not exist"
Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.530198 4732 scope.go:117] "RemoveContainer" containerID="4468cd510de754308c18a9a7af365ee982a0fdce8768e86c8b2291d7c729cfc1"
Jan 31 09:04:56 crc kubenswrapper[4732]: E0131 09:04:56.530629 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4468cd510de754308c18a9a7af365ee982a0fdce8768e86c8b2291d7c729cfc1\": container with ID starting with 4468cd510de754308c18a9a7af365ee982a0fdce8768e86c8b2291d7c729cfc1 not found: ID does not exist" containerID="4468cd510de754308c18a9a7af365ee982a0fdce8768e86c8b2291d7c729cfc1"
Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.530654 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4468cd510de754308c18a9a7af365ee982a0fdce8768e86c8b2291d7c729cfc1"} err="failed to get container status \"4468cd510de754308c18a9a7af365ee982a0fdce8768e86c8b2291d7c729cfc1\": rpc error: code = NotFound desc = could not find container \"4468cd510de754308c18a9a7af365ee982a0fdce8768e86c8b2291d7c729cfc1\": container with ID starting with 4468cd510de754308c18a9a7af365ee982a0fdce8768e86c8b2291d7c729cfc1 not found: ID does not exist"
Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.530707 4732 scope.go:117] "RemoveContainer" containerID="c9d4c116354d5c6376aaf4e43c684f7a010f6b0330ee1591e2d15233005a8f2d"
Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.549965 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="111ca852-fddd-4fb1-8d5d-331fd5921a71" path="/var/lib/kubelet/pods/111ca852-fddd-4fb1-8d5d-331fd5921a71/volumes"
Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.551171 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="320c2656-6f30-4922-835e-8c27a82800b1" path="/var/lib/kubelet/pods/320c2656-6f30-4922-835e-8c27a82800b1/volumes"
Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.552180 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d6ffd83-fb99-48e0-a34a-fd365f971ef1" path="/var/lib/kubelet/pods/3d6ffd83-fb99-48e0-a34a-fd365f971ef1/volumes"
Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.554061 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4d0ed50-aa9b-4a62-b340-882ddf73f008" path="/var/lib/kubelet/pods/f4d0ed50-aa9b-4a62-b340-882ddf73f008/volumes"
Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.556918 4732 scope.go:117] "RemoveContainer" containerID="7300efb33be6a4fcedc56ac9376d4a558057b335ad7ac3766bbdc2dea4d36c86"
Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.583034 4732 scope.go:117] "RemoveContainer" containerID="9a69e31450d3f2f57f031d90e49b124d53417614f5829c208f10b5b95ffda670"
Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.659970 4732 scope.go:117] "RemoveContainer" containerID="c9d4c116354d5c6376aaf4e43c684f7a010f6b0330ee1591e2d15233005a8f2d"
Jan 31 09:04:56 crc kubenswrapper[4732]: E0131 09:04:56.662604 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9d4c116354d5c6376aaf4e43c684f7a010f6b0330ee1591e2d15233005a8f2d\": container with ID starting with c9d4c116354d5c6376aaf4e43c684f7a010f6b0330ee1591e2d15233005a8f2d not found: ID does not exist" containerID="c9d4c116354d5c6376aaf4e43c684f7a010f6b0330ee1591e2d15233005a8f2d"
Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.662648 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9d4c116354d5c6376aaf4e43c684f7a010f6b0330ee1591e2d15233005a8f2d"} err="failed to get container status \"c9d4c116354d5c6376aaf4e43c684f7a010f6b0330ee1591e2d15233005a8f2d\": rpc error: code = NotFound desc = could not find container \"c9d4c116354d5c6376aaf4e43c684f7a010f6b0330ee1591e2d15233005a8f2d\": container with ID starting with c9d4c116354d5c6376aaf4e43c684f7a010f6b0330ee1591e2d15233005a8f2d not found: ID does not exist"
Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.662694 4732 scope.go:117] "RemoveContainer" containerID="7300efb33be6a4fcedc56ac9376d4a558057b335ad7ac3766bbdc2dea4d36c86"
Jan 31 09:04:56 crc kubenswrapper[4732]: E0131 09:04:56.666100 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7300efb33be6a4fcedc56ac9376d4a558057b335ad7ac3766bbdc2dea4d36c86\": container with ID starting with 7300efb33be6a4fcedc56ac9376d4a558057b335ad7ac3766bbdc2dea4d36c86 not found: ID does not exist" containerID="7300efb33be6a4fcedc56ac9376d4a558057b335ad7ac3766bbdc2dea4d36c86"
Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.666147 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7300efb33be6a4fcedc56ac9376d4a558057b335ad7ac3766bbdc2dea4d36c86"} err="failed to get container status \"7300efb33be6a4fcedc56ac9376d4a558057b335ad7ac3766bbdc2dea4d36c86\": rpc error: code = NotFound desc = could not find container \"7300efb33be6a4fcedc56ac9376d4a558057b335ad7ac3766bbdc2dea4d36c86\": container with ID starting with 7300efb33be6a4fcedc56ac9376d4a558057b335ad7ac3766bbdc2dea4d36c86 not found: ID does not exist"
Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.666180 4732 scope.go:117] "RemoveContainer" containerID="9a69e31450d3f2f57f031d90e49b124d53417614f5829c208f10b5b95ffda670"
Jan 31 09:04:56 crc kubenswrapper[4732]: E0131 09:04:56.666456 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a69e31450d3f2f57f031d90e49b124d53417614f5829c208f10b5b95ffda670\": container with ID starting with 9a69e31450d3f2f57f031d90e49b124d53417614f5829c208f10b5b95ffda670 not found: ID does not exist" containerID="9a69e31450d3f2f57f031d90e49b124d53417614f5829c208f10b5b95ffda670"
Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.666483 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a69e31450d3f2f57f031d90e49b124d53417614f5829c208f10b5b95ffda670"} err="failed to get container status \"9a69e31450d3f2f57f031d90e49b124d53417614f5829c208f10b5b95ffda670\": rpc error: code = NotFound desc = could not find container \"9a69e31450d3f2f57f031d90e49b124d53417614f5829c208f10b5b95ffda670\": container with ID starting with 9a69e31450d3f2f57f031d90e49b124d53417614f5829c208f10b5b95ffda670 not found: ID does not exist"
Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.666499 4732 scope.go:117] "RemoveContainer" containerID="393baf9619f9cf20095d6f558e1acd40e97479a63b2c71e28acd340a9b8c0183"
Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.681643 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vdcdv"]
Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.683596 4732 scope.go:117] "RemoveContainer" containerID="393baf9619f9cf20095d6f558e1acd40e97479a63b2c71e28acd340a9b8c0183"
Jan 31 09:04:56 crc kubenswrapper[4732]: E0131 09:04:56.686865 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"393baf9619f9cf20095d6f558e1acd40e97479a63b2c71e28acd340a9b8c0183\": container with ID starting with 393baf9619f9cf20095d6f558e1acd40e97479a63b2c71e28acd340a9b8c0183 not found: ID does not exist" containerID="393baf9619f9cf20095d6f558e1acd40e97479a63b2c71e28acd340a9b8c0183"
Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.686906 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"393baf9619f9cf20095d6f558e1acd40e97479a63b2c71e28acd340a9b8c0183"} err="failed to get container status \"393baf9619f9cf20095d6f558e1acd40e97479a63b2c71e28acd340a9b8c0183\": rpc error: code = NotFound desc = could not find container \"393baf9619f9cf20095d6f558e1acd40e97479a63b2c71e28acd340a9b8c0183\": container with ID starting with 393baf9619f9cf20095d6f558e1acd40e97479a63b2c71e28acd340a9b8c0183 not found: ID does not exist"
Jan 31 09:04:56 crc kubenswrapper[4732]: I0131 09:04:56.689485 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-vdcdv"]
event={"ID":"d0dbfc52-f4e9-462a-a253-2bb950c04e7b","Type":"ContainerStarted","Data":"d4b70b882cd4306a7b243faa137df216be0949001c889fd2898ec0ccea609c0a"} Jan 31 09:04:57 crc kubenswrapper[4732]: I0131 09:04:57.335983 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-8bchs" Jan 31 09:04:57 crc kubenswrapper[4732]: I0131 09:04:57.336095 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8bchs" event={"ID":"d0dbfc52-f4e9-462a-a253-2bb950c04e7b","Type":"ContainerStarted","Data":"a38616795b18c0402ddac1b409f6ae8b51cd664284f98e405e9a18d7c40ad6fa"} Jan 31 09:04:57 crc kubenswrapper[4732]: I0131 09:04:57.341767 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-8bchs" Jan 31 09:04:57 crc kubenswrapper[4732]: I0131 09:04:57.355171 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-8bchs" podStartSLOduration=2.355145179 podStartE2EDuration="2.355145179s" podCreationTimestamp="2026-01-31 09:04:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:04:57.350298365 +0000 UTC m=+235.656174569" watchObservedRunningTime="2026-01-31 09:04:57.355145179 +0000 UTC m=+235.661021383" Jan 31 09:04:57 crc kubenswrapper[4732]: I0131 09:04:57.593238 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-krjtb"] Jan 31 09:04:57 crc kubenswrapper[4732]: E0131 09:04:57.593496 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b03cae03-72c1-4b13-8031-33381e6df48a" containerName="extract-content" Jan 31 09:04:57 crc kubenswrapper[4732]: I0131 09:04:57.593510 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="b03cae03-72c1-4b13-8031-33381e6df48a" containerName="extract-content" Jan 31 09:04:57 crc kubenswrapper[4732]: E0131 09:04:57.593521 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="111ca852-fddd-4fb1-8d5d-331fd5921a71" containerName="registry-server" Jan 31 09:04:57 crc kubenswrapper[4732]: I0131 09:04:57.593527 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="111ca852-fddd-4fb1-8d5d-331fd5921a71" containerName="registry-server" Jan 31 09:04:57 crc kubenswrapper[4732]: E0131 09:04:57.593535 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="320c2656-6f30-4922-835e-8c27a82800b1" containerName="extract-utilities" Jan 31 09:04:57 crc kubenswrapper[4732]: I0131 09:04:57.593541 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="320c2656-6f30-4922-835e-8c27a82800b1" containerName="extract-utilities" Jan 31 09:04:57 crc kubenswrapper[4732]: E0131 09:04:57.593551 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="111ca852-fddd-4fb1-8d5d-331fd5921a71" containerName="extract-utilities" Jan 31 09:04:57 crc kubenswrapper[4732]: I0131 09:04:57.593556 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="111ca852-fddd-4fb1-8d5d-331fd5921a71" containerName="extract-utilities" Jan 31 09:04:57 crc kubenswrapper[4732]: E0131 09:04:57.593565 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="320c2656-6f30-4922-835e-8c27a82800b1" containerName="registry-server" Jan 31 09:04:57 crc kubenswrapper[4732]: I0131 09:04:57.593571 4732 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="320c2656-6f30-4922-835e-8c27a82800b1" containerName="registry-server" Jan 31 09:04:57 crc kubenswrapper[4732]: E0131 09:04:57.593583 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="320c2656-6f30-4922-835e-8c27a82800b1" containerName="extract-content" Jan 31 09:04:57 crc kubenswrapper[4732]: I0131 09:04:57.593591 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="320c2656-6f30-4922-835e-8c27a82800b1" containerName="extract-content" Jan 31 09:04:57 crc kubenswrapper[4732]: E0131 09:04:57.593601 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d6ffd83-fb99-48e0-a34a-fd365f971ef1" containerName="extract-utilities" Jan 31 09:04:57 crc kubenswrapper[4732]: I0131 09:04:57.593608 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d6ffd83-fb99-48e0-a34a-fd365f971ef1" containerName="extract-utilities" Jan 31 09:04:57 crc kubenswrapper[4732]: E0131 09:04:57.593620 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="111ca852-fddd-4fb1-8d5d-331fd5921a71" containerName="extract-content" Jan 31 09:04:57 crc kubenswrapper[4732]: I0131 09:04:57.593627 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="111ca852-fddd-4fb1-8d5d-331fd5921a71" containerName="extract-content" Jan 31 09:04:57 crc kubenswrapper[4732]: E0131 09:04:57.593639 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4d0ed50-aa9b-4a62-b340-882ddf73f008" containerName="marketplace-operator" Jan 31 09:04:57 crc kubenswrapper[4732]: I0131 09:04:57.593647 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4d0ed50-aa9b-4a62-b340-882ddf73f008" containerName="marketplace-operator" Jan 31 09:04:57 crc kubenswrapper[4732]: E0131 09:04:57.593655 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d6ffd83-fb99-48e0-a34a-fd365f971ef1" containerName="registry-server" Jan 31 09:04:57 crc kubenswrapper[4732]: I0131 09:04:57.593679 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d6ffd83-fb99-48e0-a34a-fd365f971ef1" containerName="registry-server" Jan 31 09:04:57 crc kubenswrapper[4732]: E0131 09:04:57.593686 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b03cae03-72c1-4b13-8031-33381e6df48a" containerName="extract-utilities" Jan 31 09:04:57 crc kubenswrapper[4732]: I0131 09:04:57.593692 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="b03cae03-72c1-4b13-8031-33381e6df48a" containerName="extract-utilities" Jan 31 09:04:57 crc kubenswrapper[4732]: E0131 09:04:57.593700 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b03cae03-72c1-4b13-8031-33381e6df48a" containerName="registry-server" Jan 31 09:04:57 crc kubenswrapper[4732]: I0131 09:04:57.593706 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="b03cae03-72c1-4b13-8031-33381e6df48a" containerName="registry-server" Jan 31 09:04:57 crc kubenswrapper[4732]: E0131 09:04:57.593714 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d6ffd83-fb99-48e0-a34a-fd365f971ef1" containerName="extract-content" Jan 31 09:04:57 crc kubenswrapper[4732]: I0131 09:04:57.593720 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d6ffd83-fb99-48e0-a34a-fd365f971ef1" containerName="extract-content" Jan 31 09:04:57 crc kubenswrapper[4732]: I0131 09:04:57.593817 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="111ca852-fddd-4fb1-8d5d-331fd5921a71" containerName="registry-server" Jan 31 09:04:57 crc kubenswrapper[4732]: I0131 09:04:57.593830 4732 
memory_manager.go:354] "RemoveStaleState removing state" podUID="320c2656-6f30-4922-835e-8c27a82800b1" containerName="registry-server" Jan 31 09:04:57 crc kubenswrapper[4732]: I0131 09:04:57.593850 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d6ffd83-fb99-48e0-a34a-fd365f971ef1" containerName="registry-server" Jan 31 09:04:57 crc kubenswrapper[4732]: I0131 09:04:57.593860 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="b03cae03-72c1-4b13-8031-33381e6df48a" containerName="registry-server" Jan 31 09:04:57 crc kubenswrapper[4732]: I0131 09:04:57.593870 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4d0ed50-aa9b-4a62-b340-882ddf73f008" containerName="marketplace-operator" Jan 31 09:04:57 crc kubenswrapper[4732]: I0131 09:04:57.594750 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-krjtb" Jan 31 09:04:57 crc kubenswrapper[4732]: I0131 09:04:57.598131 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 31 09:04:57 crc kubenswrapper[4732]: I0131 09:04:57.607128 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-krjtb"] Jan 31 09:04:57 crc kubenswrapper[4732]: I0131 09:04:57.674602 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17e07aee-c4b1-4011-8442-c6dcfc4f415c-utilities\") pod \"redhat-marketplace-krjtb\" (UID: \"17e07aee-c4b1-4011-8442-c6dcfc4f415c\") " pod="openshift-marketplace/redhat-marketplace-krjtb" Jan 31 09:04:57 crc kubenswrapper[4732]: I0131 09:04:57.674674 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17e07aee-c4b1-4011-8442-c6dcfc4f415c-catalog-content\") pod \"redhat-marketplace-krjtb\" (UID: \"17e07aee-c4b1-4011-8442-c6dcfc4f415c\") " pod="openshift-marketplace/redhat-marketplace-krjtb" Jan 31 09:04:57 crc kubenswrapper[4732]: I0131 09:04:57.674704 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9m8t\" (UniqueName: \"kubernetes.io/projected/17e07aee-c4b1-4011-8442-c6dcfc4f415c-kube-api-access-l9m8t\") pod \"redhat-marketplace-krjtb\" (UID: \"17e07aee-c4b1-4011-8442-c6dcfc4f415c\") " pod="openshift-marketplace/redhat-marketplace-krjtb" Jan 31 09:04:57 crc kubenswrapper[4732]: I0131 09:04:57.776367 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17e07aee-c4b1-4011-8442-c6dcfc4f415c-utilities\") pod \"redhat-marketplace-krjtb\" (UID: \"17e07aee-c4b1-4011-8442-c6dcfc4f415c\") " pod="openshift-marketplace/redhat-marketplace-krjtb" Jan 31 09:04:57 crc kubenswrapper[4732]: I0131 09:04:57.776415 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17e07aee-c4b1-4011-8442-c6dcfc4f415c-catalog-content\") pod \"redhat-marketplace-krjtb\" (UID: \"17e07aee-c4b1-4011-8442-c6dcfc4f415c\") " pod="openshift-marketplace/redhat-marketplace-krjtb" Jan 31 09:04:57 crc kubenswrapper[4732]: I0131 09:04:57.776442 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9m8t\" (UniqueName: 
\"kubernetes.io/projected/17e07aee-c4b1-4011-8442-c6dcfc4f415c-kube-api-access-l9m8t\") pod \"redhat-marketplace-krjtb\" (UID: \"17e07aee-c4b1-4011-8442-c6dcfc4f415c\") " pod="openshift-marketplace/redhat-marketplace-krjtb" Jan 31 09:04:57 crc kubenswrapper[4732]: I0131 09:04:57.776943 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17e07aee-c4b1-4011-8442-c6dcfc4f415c-utilities\") pod \"redhat-marketplace-krjtb\" (UID: \"17e07aee-c4b1-4011-8442-c6dcfc4f415c\") " pod="openshift-marketplace/redhat-marketplace-krjtb" Jan 31 09:04:57 crc kubenswrapper[4732]: I0131 09:04:57.777179 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17e07aee-c4b1-4011-8442-c6dcfc4f415c-catalog-content\") pod \"redhat-marketplace-krjtb\" (UID: \"17e07aee-c4b1-4011-8442-c6dcfc4f415c\") " pod="openshift-marketplace/redhat-marketplace-krjtb" Jan 31 09:04:57 crc kubenswrapper[4732]: I0131 09:04:57.796557 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9m8t\" (UniqueName: \"kubernetes.io/projected/17e07aee-c4b1-4011-8442-c6dcfc4f415c-kube-api-access-l9m8t\") pod \"redhat-marketplace-krjtb\" (UID: \"17e07aee-c4b1-4011-8442-c6dcfc4f415c\") " pod="openshift-marketplace/redhat-marketplace-krjtb" Jan 31 09:04:57 crc kubenswrapper[4732]: I0131 09:04:57.907851 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-krjtb" Jan 31 09:04:58 crc kubenswrapper[4732]: I0131 09:04:58.191490 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2h57x"] Jan 31 09:04:58 crc kubenswrapper[4732]: I0131 09:04:58.192598 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2h57x" Jan 31 09:04:58 crc kubenswrapper[4732]: I0131 09:04:58.194337 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 31 09:04:58 crc kubenswrapper[4732]: I0131 09:04:58.204835 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2h57x"] Jan 31 09:04:58 crc kubenswrapper[4732]: I0131 09:04:58.283426 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9039963e-96e4-4b4d-abdd-79f0429da944-catalog-content\") pod \"certified-operators-2h57x\" (UID: \"9039963e-96e4-4b4d-abdd-79f0429da944\") " pod="openshift-marketplace/certified-operators-2h57x" Jan 31 09:04:58 crc kubenswrapper[4732]: I0131 09:04:58.283673 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9039963e-96e4-4b4d-abdd-79f0429da944-utilities\") pod \"certified-operators-2h57x\" (UID: \"9039963e-96e4-4b4d-abdd-79f0429da944\") " pod="openshift-marketplace/certified-operators-2h57x" Jan 31 09:04:58 crc kubenswrapper[4732]: I0131 09:04:58.283757 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9v4k9\" (UniqueName: \"kubernetes.io/projected/9039963e-96e4-4b4d-abdd-79f0429da944-kube-api-access-9v4k9\") pod \"certified-operators-2h57x\" (UID: \"9039963e-96e4-4b4d-abdd-79f0429da944\") " pod="openshift-marketplace/certified-operators-2h57x" Jan 31 09:04:58 crc kubenswrapper[4732]: I0131 09:04:58.317579 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-krjtb"] Jan 31 09:04:58 crc kubenswrapper[4732]: W0131 09:04:58.322856 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17e07aee_c4b1_4011_8442_c6dcfc4f415c.slice/crio-43b9d1425fde7694bc4eec8c3cd8bed52a252a7fe324c00d8813428ba196e5ec WatchSource:0}: Error finding container 43b9d1425fde7694bc4eec8c3cd8bed52a252a7fe324c00d8813428ba196e5ec: Status 404 returned error can't find the container with id 43b9d1425fde7694bc4eec8c3cd8bed52a252a7fe324c00d8813428ba196e5ec Jan 31 09:04:59 crc kubenswrapper[4732]: I0131 09:04:59.007448 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9039963e-96e4-4b4d-abdd-79f0429da944-catalog-content\") pod \"certified-operators-2h57x\" (UID: \"9039963e-96e4-4b4d-abdd-79f0429da944\") " pod="openshift-marketplace/certified-operators-2h57x" Jan 31 09:04:59 crc kubenswrapper[4732]: I0131 09:04:59.007544 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9039963e-96e4-4b4d-abdd-79f0429da944-utilities\") pod \"certified-operators-2h57x\" (UID: \"9039963e-96e4-4b4d-abdd-79f0429da944\") " pod="openshift-marketplace/certified-operators-2h57x" Jan 31 09:04:59 crc kubenswrapper[4732]: I0131 09:04:59.007597 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9v4k9\" (UniqueName: \"kubernetes.io/projected/9039963e-96e4-4b4d-abdd-79f0429da944-kube-api-access-9v4k9\") pod \"certified-operators-2h57x\" (UID: \"9039963e-96e4-4b4d-abdd-79f0429da944\") " 
pod="openshift-marketplace/certified-operators-2h57x" Jan 31 09:04:59 crc kubenswrapper[4732]: I0131 09:04:59.008628 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9039963e-96e4-4b4d-abdd-79f0429da944-catalog-content\") pod \"certified-operators-2h57x\" (UID: \"9039963e-96e4-4b4d-abdd-79f0429da944\") " pod="openshift-marketplace/certified-operators-2h57x" Jan 31 09:04:59 crc kubenswrapper[4732]: I0131 09:04:59.008874 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9039963e-96e4-4b4d-abdd-79f0429da944-utilities\") pod \"certified-operators-2h57x\" (UID: \"9039963e-96e4-4b4d-abdd-79f0429da944\") " pod="openshift-marketplace/certified-operators-2h57x" Jan 31 09:04:59 crc kubenswrapper[4732]: I0131 09:04:59.017592 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b03cae03-72c1-4b13-8031-33381e6df48a" path="/var/lib/kubelet/pods/b03cae03-72c1-4b13-8031-33381e6df48a/volumes" Jan 31 09:04:59 crc kubenswrapper[4732]: I0131 09:04:59.018372 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-krjtb" event={"ID":"17e07aee-c4b1-4011-8442-c6dcfc4f415c","Type":"ContainerStarted","Data":"43b9d1425fde7694bc4eec8c3cd8bed52a252a7fe324c00d8813428ba196e5ec"} Jan 31 09:04:59 crc kubenswrapper[4732]: I0131 09:04:59.035515 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9v4k9\" (UniqueName: \"kubernetes.io/projected/9039963e-96e4-4b4d-abdd-79f0429da944-kube-api-access-9v4k9\") pod \"certified-operators-2h57x\" (UID: \"9039963e-96e4-4b4d-abdd-79f0429da944\") " pod="openshift-marketplace/certified-operators-2h57x" Jan 31 09:04:59 crc kubenswrapper[4732]: I0131 09:04:59.111010 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2h57x" Jan 31 09:04:59 crc kubenswrapper[4732]: I0131 09:04:59.512256 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2h57x"] Jan 31 09:04:59 crc kubenswrapper[4732]: I0131 09:04:59.990342 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4pkzq"] Jan 31 09:04:59 crc kubenswrapper[4732]: I0131 09:04:59.991438 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4pkzq" Jan 31 09:04:59 crc kubenswrapper[4732]: I0131 09:04:59.994003 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 31 09:05:00 crc kubenswrapper[4732]: I0131 09:05:00.001777 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4pkzq"] Jan 31 09:05:00 crc kubenswrapper[4732]: I0131 09:05:00.044773 4732 generic.go:334] "Generic (PLEG): container finished" podID="17e07aee-c4b1-4011-8442-c6dcfc4f415c" containerID="df94e93d1915945c57db95a4648474124da0667129113eae9e51acdd65857bc4" exitCode=0 Jan 31 09:05:00 crc kubenswrapper[4732]: I0131 09:05:00.044873 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-krjtb" event={"ID":"17e07aee-c4b1-4011-8442-c6dcfc4f415c","Type":"ContainerDied","Data":"df94e93d1915945c57db95a4648474124da0667129113eae9e51acdd65857bc4"} Jan 31 09:05:00 crc kubenswrapper[4732]: I0131 09:05:00.049418 4732 generic.go:334] "Generic (PLEG): container finished" podID="9039963e-96e4-4b4d-abdd-79f0429da944" containerID="f4798e8742e6ebd4b4d72427d15eb2585f404bd738e00648ac21d6ea1c06fa8c" exitCode=0 Jan 31 09:05:00 crc kubenswrapper[4732]: I0131 09:05:00.049565 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2h57x" event={"ID":"9039963e-96e4-4b4d-abdd-79f0429da944","Type":"ContainerDied","Data":"f4798e8742e6ebd4b4d72427d15eb2585f404bd738e00648ac21d6ea1c06fa8c"} Jan 31 09:05:00 crc kubenswrapper[4732]: I0131 09:05:00.049846 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2h57x" event={"ID":"9039963e-96e4-4b4d-abdd-79f0429da944","Type":"ContainerStarted","Data":"f6925f718ca946b7446cf11c11588d4363a885a606a6a0c54f739e78130ded85"} Jan 31 09:05:00 crc kubenswrapper[4732]: I0131 09:05:00.131927 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2sgb5\" (UniqueName: \"kubernetes.io/projected/a39f958d-6d9b-4a4a-9ec9-cfb1f96b6f45-kube-api-access-2sgb5\") pod \"redhat-operators-4pkzq\" (UID: \"a39f958d-6d9b-4a4a-9ec9-cfb1f96b6f45\") " pod="openshift-marketplace/redhat-operators-4pkzq" Jan 31 09:05:00 crc kubenswrapper[4732]: I0131 09:05:00.132046 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a39f958d-6d9b-4a4a-9ec9-cfb1f96b6f45-catalog-content\") pod \"redhat-operators-4pkzq\" (UID: \"a39f958d-6d9b-4a4a-9ec9-cfb1f96b6f45\") " pod="openshift-marketplace/redhat-operators-4pkzq" Jan 31 09:05:00 crc kubenswrapper[4732]: I0131 09:05:00.132178 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a39f958d-6d9b-4a4a-9ec9-cfb1f96b6f45-utilities\") pod \"redhat-operators-4pkzq\" (UID: \"a39f958d-6d9b-4a4a-9ec9-cfb1f96b6f45\") " pod="openshift-marketplace/redhat-operators-4pkzq" Jan 31 09:05:00 crc kubenswrapper[4732]: I0131 09:05:00.233885 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a39f958d-6d9b-4a4a-9ec9-cfb1f96b6f45-catalog-content\") pod \"redhat-operators-4pkzq\" (UID: \"a39f958d-6d9b-4a4a-9ec9-cfb1f96b6f45\") " pod="openshift-marketplace/redhat-operators-4pkzq" Jan 31 09:05:00 crc 
kubenswrapper[4732]: I0131 09:05:00.234313 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a39f958d-6d9b-4a4a-9ec9-cfb1f96b6f45-utilities\") pod \"redhat-operators-4pkzq\" (UID: \"a39f958d-6d9b-4a4a-9ec9-cfb1f96b6f45\") " pod="openshift-marketplace/redhat-operators-4pkzq" Jan 31 09:05:00 crc kubenswrapper[4732]: I0131 09:05:00.234501 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2sgb5\" (UniqueName: \"kubernetes.io/projected/a39f958d-6d9b-4a4a-9ec9-cfb1f96b6f45-kube-api-access-2sgb5\") pod \"redhat-operators-4pkzq\" (UID: \"a39f958d-6d9b-4a4a-9ec9-cfb1f96b6f45\") " pod="openshift-marketplace/redhat-operators-4pkzq" Jan 31 09:05:00 crc kubenswrapper[4732]: I0131 09:05:00.234763 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a39f958d-6d9b-4a4a-9ec9-cfb1f96b6f45-utilities\") pod \"redhat-operators-4pkzq\" (UID: \"a39f958d-6d9b-4a4a-9ec9-cfb1f96b6f45\") " pod="openshift-marketplace/redhat-operators-4pkzq" Jan 31 09:05:00 crc kubenswrapper[4732]: I0131 09:05:00.234434 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a39f958d-6d9b-4a4a-9ec9-cfb1f96b6f45-catalog-content\") pod \"redhat-operators-4pkzq\" (UID: \"a39f958d-6d9b-4a4a-9ec9-cfb1f96b6f45\") " pod="openshift-marketplace/redhat-operators-4pkzq" Jan 31 09:05:00 crc kubenswrapper[4732]: I0131 09:05:00.272847 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2sgb5\" (UniqueName: \"kubernetes.io/projected/a39f958d-6d9b-4a4a-9ec9-cfb1f96b6f45-kube-api-access-2sgb5\") pod \"redhat-operators-4pkzq\" (UID: \"a39f958d-6d9b-4a4a-9ec9-cfb1f96b6f45\") " pod="openshift-marketplace/redhat-operators-4pkzq" Jan 31 09:05:00 crc kubenswrapper[4732]: I0131 09:05:00.310274 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4pkzq" Jan 31 09:05:00 crc kubenswrapper[4732]: I0131 09:05:00.597808 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-d7jxg"] Jan 31 09:05:00 crc kubenswrapper[4732]: I0131 09:05:00.599749 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-d7jxg" Jan 31 09:05:00 crc kubenswrapper[4732]: I0131 09:05:00.600325 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-d7jxg"] Jan 31 09:05:00 crc kubenswrapper[4732]: I0131 09:05:00.603840 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 31 09:05:00 crc kubenswrapper[4732]: I0131 09:05:00.710230 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4pkzq"] Jan 31 09:05:00 crc kubenswrapper[4732]: W0131 09:05:00.724604 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda39f958d_6d9b_4a4a_9ec9_cfb1f96b6f45.slice/crio-18f8e635fe93be4696f889e2fc45f7c0593a284c1d2aa8aad38fa3d009850a26 WatchSource:0}: Error finding container 18f8e635fe93be4696f889e2fc45f7c0593a284c1d2aa8aad38fa3d009850a26: Status 404 returned error can't find the container with id 18f8e635fe93be4696f889e2fc45f7c0593a284c1d2aa8aad38fa3d009850a26 Jan 31 09:05:00 crc kubenswrapper[4732]: I0131 09:05:00.741920 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6pzr\" (UniqueName: \"kubernetes.io/projected/a7533049-a0d8-4488-bed6-2a9b28212061-kube-api-access-h6pzr\") pod \"community-operators-d7jxg\" (UID: \"a7533049-a0d8-4488-bed6-2a9b28212061\") " pod="openshift-marketplace/community-operators-d7jxg" Jan 31 09:05:00 crc kubenswrapper[4732]: I0131 09:05:00.742013 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7533049-a0d8-4488-bed6-2a9b28212061-catalog-content\") pod \"community-operators-d7jxg\" (UID: \"a7533049-a0d8-4488-bed6-2a9b28212061\") " pod="openshift-marketplace/community-operators-d7jxg" Jan 31 09:05:00 crc kubenswrapper[4732]: I0131 09:05:00.742078 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7533049-a0d8-4488-bed6-2a9b28212061-utilities\") pod \"community-operators-d7jxg\" (UID: \"a7533049-a0d8-4488-bed6-2a9b28212061\") " pod="openshift-marketplace/community-operators-d7jxg" Jan 31 09:05:00 crc kubenswrapper[4732]: I0131 09:05:00.843921 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7533049-a0d8-4488-bed6-2a9b28212061-catalog-content\") pod \"community-operators-d7jxg\" (UID: \"a7533049-a0d8-4488-bed6-2a9b28212061\") " pod="openshift-marketplace/community-operators-d7jxg" Jan 31 09:05:00 crc kubenswrapper[4732]: I0131 09:05:00.844019 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7533049-a0d8-4488-bed6-2a9b28212061-utilities\") pod \"community-operators-d7jxg\" (UID: \"a7533049-a0d8-4488-bed6-2a9b28212061\") " pod="openshift-marketplace/community-operators-d7jxg" Jan 31 09:05:00 crc kubenswrapper[4732]: I0131 09:05:00.844071 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6pzr\" (UniqueName: \"kubernetes.io/projected/a7533049-a0d8-4488-bed6-2a9b28212061-kube-api-access-h6pzr\") pod \"community-operators-d7jxg\" (UID: \"a7533049-a0d8-4488-bed6-2a9b28212061\") " 
pod="openshift-marketplace/community-operators-d7jxg" Jan 31 09:05:00 crc kubenswrapper[4732]: I0131 09:05:00.844429 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7533049-a0d8-4488-bed6-2a9b28212061-catalog-content\") pod \"community-operators-d7jxg\" (UID: \"a7533049-a0d8-4488-bed6-2a9b28212061\") " pod="openshift-marketplace/community-operators-d7jxg" Jan 31 09:05:00 crc kubenswrapper[4732]: I0131 09:05:00.844525 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7533049-a0d8-4488-bed6-2a9b28212061-utilities\") pod \"community-operators-d7jxg\" (UID: \"a7533049-a0d8-4488-bed6-2a9b28212061\") " pod="openshift-marketplace/community-operators-d7jxg" Jan 31 09:05:00 crc kubenswrapper[4732]: I0131 09:05:00.865988 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6pzr\" (UniqueName: \"kubernetes.io/projected/a7533049-a0d8-4488-bed6-2a9b28212061-kube-api-access-h6pzr\") pod \"community-operators-d7jxg\" (UID: \"a7533049-a0d8-4488-bed6-2a9b28212061\") " pod="openshift-marketplace/community-operators-d7jxg" Jan 31 09:05:00 crc kubenswrapper[4732]: I0131 09:05:00.921633 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d7jxg" Jan 31 09:05:01 crc kubenswrapper[4732]: I0131 09:05:01.061297 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2h57x" event={"ID":"9039963e-96e4-4b4d-abdd-79f0429da944","Type":"ContainerStarted","Data":"0f4992419822b81cadf13d0c8b0224684037767274dfe5b0249022e62b9c4ebd"} Jan 31 09:05:01 crc kubenswrapper[4732]: I0131 09:05:01.068284 4732 generic.go:334] "Generic (PLEG): container finished" podID="a39f958d-6d9b-4a4a-9ec9-cfb1f96b6f45" containerID="b5b10fba37c644587760b47c06070201a3597c2ce593265462cc4c0d593c0ca0" exitCode=0 Jan 31 09:05:01 crc kubenswrapper[4732]: I0131 09:05:01.068378 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4pkzq" event={"ID":"a39f958d-6d9b-4a4a-9ec9-cfb1f96b6f45","Type":"ContainerDied","Data":"b5b10fba37c644587760b47c06070201a3597c2ce593265462cc4c0d593c0ca0"} Jan 31 09:05:01 crc kubenswrapper[4732]: I0131 09:05:01.068415 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4pkzq" event={"ID":"a39f958d-6d9b-4a4a-9ec9-cfb1f96b6f45","Type":"ContainerStarted","Data":"18f8e635fe93be4696f889e2fc45f7c0593a284c1d2aa8aad38fa3d009850a26"} Jan 31 09:05:01 crc kubenswrapper[4732]: I0131 09:05:01.076995 4732 generic.go:334] "Generic (PLEG): container finished" podID="17e07aee-c4b1-4011-8442-c6dcfc4f415c" containerID="3628d84d76141cff468153e0bfd020a893eb5565565cd42e084eaf8dd83b3ea1" exitCode=0 Jan 31 09:05:01 crc kubenswrapper[4732]: I0131 09:05:01.077043 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-krjtb" event={"ID":"17e07aee-c4b1-4011-8442-c6dcfc4f415c","Type":"ContainerDied","Data":"3628d84d76141cff468153e0bfd020a893eb5565565cd42e084eaf8dd83b3ea1"} Jan 31 09:05:01 crc kubenswrapper[4732]: I0131 09:05:01.336619 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-d7jxg"] Jan 31 09:05:01 crc kubenswrapper[4732]: W0131 09:05:01.351502 4732 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7533049_a0d8_4488_bed6_2a9b28212061.slice/crio-4f1de4847787c76739eb84dcdae2234327fee04c61499def057ec8246fbf505c WatchSource:0}: Error finding container 4f1de4847787c76739eb84dcdae2234327fee04c61499def057ec8246fbf505c: Status 404 returned error can't find the container with id 4f1de4847787c76739eb84dcdae2234327fee04c61499def057ec8246fbf505c Jan 31 09:05:02 crc kubenswrapper[4732]: I0131 09:05:02.084630 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4pkzq" event={"ID":"a39f958d-6d9b-4a4a-9ec9-cfb1f96b6f45","Type":"ContainerStarted","Data":"79b1dd46333c264727adfb34d060a4bcca67f80ac41273dff417c69f0824a053"} Jan 31 09:05:02 crc kubenswrapper[4732]: I0131 09:05:02.087010 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-krjtb" event={"ID":"17e07aee-c4b1-4011-8442-c6dcfc4f415c","Type":"ContainerStarted","Data":"62dae642b0a846e49e35df8bd8ab6d5ea17e6f6b3c1acf1b5fa3cb86ea5bcd76"} Jan 31 09:05:02 crc kubenswrapper[4732]: I0131 09:05:02.088300 4732 generic.go:334] "Generic (PLEG): container finished" podID="a7533049-a0d8-4488-bed6-2a9b28212061" containerID="caf869cbc215b2afbdd847a43f8f6c677e5acec31907d104d763b015257023a9" exitCode=0 Jan 31 09:05:02 crc kubenswrapper[4732]: I0131 09:05:02.088352 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d7jxg" event={"ID":"a7533049-a0d8-4488-bed6-2a9b28212061","Type":"ContainerDied","Data":"caf869cbc215b2afbdd847a43f8f6c677e5acec31907d104d763b015257023a9"} Jan 31 09:05:02 crc kubenswrapper[4732]: I0131 09:05:02.088418 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d7jxg" event={"ID":"a7533049-a0d8-4488-bed6-2a9b28212061","Type":"ContainerStarted","Data":"4f1de4847787c76739eb84dcdae2234327fee04c61499def057ec8246fbf505c"} Jan 31 09:05:02 crc kubenswrapper[4732]: I0131 09:05:02.090918 4732 generic.go:334] "Generic (PLEG): container finished" podID="9039963e-96e4-4b4d-abdd-79f0429da944" containerID="0f4992419822b81cadf13d0c8b0224684037767274dfe5b0249022e62b9c4ebd" exitCode=0 Jan 31 09:05:02 crc kubenswrapper[4732]: I0131 09:05:02.090961 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2h57x" event={"ID":"9039963e-96e4-4b4d-abdd-79f0429da944","Type":"ContainerDied","Data":"0f4992419822b81cadf13d0c8b0224684037767274dfe5b0249022e62b9c4ebd"} Jan 31 09:05:02 crc kubenswrapper[4732]: I0131 09:05:02.130733 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-krjtb" podStartSLOduration=3.535717187 podStartE2EDuration="5.13071423s" podCreationTimestamp="2026-01-31 09:04:57 +0000 UTC" firstStartedPulling="2026-01-31 09:05:00.048259304 +0000 UTC m=+238.354135518" lastFinishedPulling="2026-01-31 09:05:01.643256357 +0000 UTC m=+239.949132561" observedRunningTime="2026-01-31 09:05:02.126450826 +0000 UTC m=+240.432327040" watchObservedRunningTime="2026-01-31 09:05:02.13071423 +0000 UTC m=+240.436590434" Jan 31 09:05:02 crc kubenswrapper[4732]: I0131 09:05:02.615265 4732 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 31 09:05:02 crc kubenswrapper[4732]: I0131 09:05:02.616023 4732 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 31 09:05:02 crc kubenswrapper[4732]: 
I0131 09:05:02.616070 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 31 09:05:02 crc kubenswrapper[4732]: I0131 09:05:02.616420 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://2b5cbd45e357205f8bfc648396c0f684d6788343f687504feb45618d9082e5cd" gracePeriod=15 Jan 31 09:05:02 crc kubenswrapper[4732]: I0131 09:05:02.616457 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://89ce42a35726dd1fc6aea0eb2e5baa490de04566cd6d6298185260e896a8def7" gracePeriod=15 Jan 31 09:05:02 crc kubenswrapper[4732]: I0131 09:05:02.616490 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://52b8ba07937855cb1fd0e1db567f90e08848fca583b7174153918e2d6356d722" gracePeriod=15 Jan 31 09:05:02 crc kubenswrapper[4732]: I0131 09:05:02.616567 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://83c2048db4772c1a5f60b21340f773ab5c651045d4dac9edc15e91a19d15dce9" gracePeriod=15 Jan 31 09:05:02 crc kubenswrapper[4732]: I0131 09:05:02.616490 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://c276b6fdf2c47bfa38c4cf312bf6b893388c4291d306738fd6c31738cbd7483b" gracePeriod=15 Jan 31 09:05:02 crc kubenswrapper[4732]: I0131 09:05:02.617227 4732 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 31 09:05:02 crc kubenswrapper[4732]: E0131 09:05:02.617363 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 31 09:05:02 crc kubenswrapper[4732]: I0131 09:05:02.617380 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 31 09:05:02 crc kubenswrapper[4732]: E0131 09:05:02.617392 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Jan 31 09:05:02 crc kubenswrapper[4732]: I0131 09:05:02.617399 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Jan 31 09:05:02 crc kubenswrapper[4732]: E0131 09:05:02.617411 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 31 09:05:02 crc kubenswrapper[4732]: I0131 09:05:02.617417 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 31 09:05:02 crc kubenswrapper[4732]: E0131 09:05:02.617428 4732 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 31 09:05:02 crc kubenswrapper[4732]: I0131 09:05:02.617436 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 31 09:05:02 crc kubenswrapper[4732]: E0131 09:05:02.617447 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 31 09:05:02 crc kubenswrapper[4732]: I0131 09:05:02.617454 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 31 09:05:02 crc kubenswrapper[4732]: E0131 09:05:02.617463 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 31 09:05:02 crc kubenswrapper[4732]: I0131 09:05:02.617469 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 31 09:05:02 crc kubenswrapper[4732]: E0131 09:05:02.617480 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 31 09:05:02 crc kubenswrapper[4732]: I0131 09:05:02.617487 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 31 09:05:02 crc kubenswrapper[4732]: I0131 09:05:02.617598 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 31 09:05:02 crc kubenswrapper[4732]: I0131 09:05:02.617609 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 31 09:05:02 crc kubenswrapper[4732]: I0131 09:05:02.617617 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 31 09:05:02 crc kubenswrapper[4732]: I0131 09:05:02.617626 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 31 09:05:02 crc kubenswrapper[4732]: I0131 09:05:02.617637 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 31 09:05:02 crc kubenswrapper[4732]: I0131 09:05:02.617648 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 31 09:05:02 crc kubenswrapper[4732]: E0131 09:05:02.617815 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 31 09:05:02 crc kubenswrapper[4732]: I0131 09:05:02.617825 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 31 09:05:02 crc kubenswrapper[4732]: I0131 09:05:02.617937 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 31 09:05:02 crc kubenswrapper[4732]: E0131 09:05:02.719048 4732 kubelet.go:1929] "Failed creating a 
mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.129.56.231:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 31 09:05:02 crc kubenswrapper[4732]: E0131 09:05:02.747421 4732 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/events\": dial tcp 38.129.56.231:6443: connect: connection refused" event="&Event{ObjectMeta:{certified-operators-2h57x.188fc57928be0c23 openshift-marketplace 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-marketplace,Name:certified-operators-2h57x,UID:9039963e-96e4-4b4d-abdd-79f0429da944,APIVersion:v1,ResourceVersion:29558,FieldPath:spec.containers{registry-server},},Reason:Created,Message:Created container registry-server,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-31 09:05:02.745152547 +0000 UTC m=+241.051028751,LastTimestamp:2026-01-31 09:05:02.745152547 +0000 UTC m=+241.051028751,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 31 09:05:02 crc kubenswrapper[4732]: I0131 09:05:02.783504 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 31 09:05:02 crc kubenswrapper[4732]: I0131 09:05:02.783561 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 31 09:05:02 crc kubenswrapper[4732]: I0131 09:05:02.783596 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:05:02 crc kubenswrapper[4732]: I0131 09:05:02.783626 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 31 09:05:02 crc kubenswrapper[4732]: I0131 09:05:02.784566 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:05:02 crc kubenswrapper[4732]: I0131 09:05:02.784634 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 31 09:05:02 crc kubenswrapper[4732]: I0131 09:05:02.784701 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:05:02 crc kubenswrapper[4732]: I0131 09:05:02.784833 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 31 09:05:02 crc kubenswrapper[4732]: I0131 09:05:02.886447 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 31 09:05:02 crc kubenswrapper[4732]: I0131 09:05:02.886521 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 31 09:05:02 crc kubenswrapper[4732]: I0131 09:05:02.886541 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 31 09:05:02 crc kubenswrapper[4732]: I0131 09:05:02.886561 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:05:02 crc kubenswrapper[4732]: I0131 09:05:02.886580 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 31 09:05:02 crc kubenswrapper[4732]: I0131 09:05:02.886591 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 31 09:05:02 crc kubenswrapper[4732]: I0131 09:05:02.886650 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 31 09:05:02 crc kubenswrapper[4732]: I0131 09:05:02.886693 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:05:02 crc kubenswrapper[4732]: I0131 09:05:02.886649 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:05:02 crc kubenswrapper[4732]: I0131 09:05:02.886724 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 31 09:05:02 crc kubenswrapper[4732]: I0131 09:05:02.886610 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:05:02 crc kubenswrapper[4732]: I0131 09:05:02.886726 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 31 09:05:02 crc kubenswrapper[4732]: I0131 09:05:02.886754 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 31 09:05:02 crc kubenswrapper[4732]: I0131 09:05:02.886780 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:05:02 crc kubenswrapper[4732]: I0131 09:05:02.886832 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:05:02 crc kubenswrapper[4732]: I0131 09:05:02.886862 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: 
\"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 31 09:05:03 crc kubenswrapper[4732]: I0131 09:05:03.021182 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 31 09:05:03 crc kubenswrapper[4732]: W0131 09:05:03.042190 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-4448d0e6cbfe51dbafe95df2b0b0c2bf5f10a0cced86db8ec754e8e959a9424a WatchSource:0}: Error finding container 4448d0e6cbfe51dbafe95df2b0b0c2bf5f10a0cced86db8ec754e8e959a9424a: Status 404 returned error can't find the container with id 4448d0e6cbfe51dbafe95df2b0b0c2bf5f10a0cced86db8ec754e8e959a9424a Jan 31 09:05:03 crc kubenswrapper[4732]: I0131 09:05:03.100147 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"4448d0e6cbfe51dbafe95df2b0b0c2bf5f10a0cced86db8ec754e8e959a9424a"} Jan 31 09:05:03 crc kubenswrapper[4732]: I0131 09:05:03.102691 4732 generic.go:334] "Generic (PLEG): container finished" podID="e90ec082-a189-4726-8049-2151ddf77961" containerID="23b1edb2efd137d80917cfc98d36d9f6d054406b735c02e22b781cbf0e7d7c9e" exitCode=0 Jan 31 09:05:03 crc kubenswrapper[4732]: I0131 09:05:03.102747 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"e90ec082-a189-4726-8049-2151ddf77961","Type":"ContainerDied","Data":"23b1edb2efd137d80917cfc98d36d9f6d054406b735c02e22b781cbf0e7d7c9e"} Jan 31 09:05:03 crc kubenswrapper[4732]: I0131 09:05:03.103539 4732 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:03 crc kubenswrapper[4732]: I0131 09:05:03.103763 4732 status_manager.go:851] "Failed to get status for pod" podUID="e90ec082-a189-4726-8049-2151ddf77961" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:03 crc kubenswrapper[4732]: I0131 09:05:03.105456 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2h57x" event={"ID":"9039963e-96e4-4b4d-abdd-79f0429da944","Type":"ContainerStarted","Data":"7e18dcd588ce02a5c46a844fa65ca0543996b00459817dc2f4b5a1bc8ce6068b"} Jan 31 09:05:03 crc kubenswrapper[4732]: I0131 09:05:03.106421 4732 status_manager.go:851] "Failed to get status for pod" podUID="9039963e-96e4-4b4d-abdd-79f0429da944" pod="openshift-marketplace/certified-operators-2h57x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-2h57x\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:03 crc kubenswrapper[4732]: I0131 09:05:03.106955 4732 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:03 crc kubenswrapper[4732]: I0131 09:05:03.107220 4732 status_manager.go:851] "Failed to get status for pod" podUID="e90ec082-a189-4726-8049-2151ddf77961" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:03 crc kubenswrapper[4732]: I0131 09:05:03.108067 4732 generic.go:334] "Generic (PLEG): container finished" podID="a39f958d-6d9b-4a4a-9ec9-cfb1f96b6f45" containerID="79b1dd46333c264727adfb34d060a4bcca67f80ac41273dff417c69f0824a053" exitCode=0 Jan 31 09:05:03 crc kubenswrapper[4732]: I0131 09:05:03.108124 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4pkzq" event={"ID":"a39f958d-6d9b-4a4a-9ec9-cfb1f96b6f45","Type":"ContainerDied","Data":"79b1dd46333c264727adfb34d060a4bcca67f80ac41273dff417c69f0824a053"} Jan 31 09:05:03 crc kubenswrapper[4732]: I0131 09:05:03.108963 4732 status_manager.go:851] "Failed to get status for pod" podUID="a39f958d-6d9b-4a4a-9ec9-cfb1f96b6f45" pod="openshift-marketplace/redhat-operators-4pkzq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-4pkzq\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:03 crc kubenswrapper[4732]: I0131 09:05:03.109241 4732 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:03 crc kubenswrapper[4732]: I0131 09:05:03.109485 4732 status_manager.go:851] "Failed to get status for pod" podUID="e90ec082-a189-4726-8049-2151ddf77961" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:03 crc kubenswrapper[4732]: I0131 09:05:03.109732 4732 status_manager.go:851] "Failed to get status for pod" podUID="9039963e-96e4-4b4d-abdd-79f0429da944" pod="openshift-marketplace/certified-operators-2h57x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-2h57x\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:03 crc kubenswrapper[4732]: I0131 09:05:03.112239 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Jan 31 09:05:03 crc kubenswrapper[4732]: I0131 09:05:03.114347 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 31 09:05:03 crc kubenswrapper[4732]: I0131 09:05:03.116048 4732 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c276b6fdf2c47bfa38c4cf312bf6b893388c4291d306738fd6c31738cbd7483b" exitCode=0 Jan 31 09:05:03 crc kubenswrapper[4732]: I0131 09:05:03.116072 4732 generic.go:334] "Generic (PLEG): container finished" 
podID="f4b27818a5e8e43d0dc095d08835c792" containerID="83c2048db4772c1a5f60b21340f773ab5c651045d4dac9edc15e91a19d15dce9" exitCode=0 Jan 31 09:05:03 crc kubenswrapper[4732]: I0131 09:05:03.116082 4732 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="52b8ba07937855cb1fd0e1db567f90e08848fca583b7174153918e2d6356d722" exitCode=0 Jan 31 09:05:03 crc kubenswrapper[4732]: I0131 09:05:03.116091 4732 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="89ce42a35726dd1fc6aea0eb2e5baa490de04566cd6d6298185260e896a8def7" exitCode=2 Jan 31 09:05:03 crc kubenswrapper[4732]: I0131 09:05:03.116165 4732 scope.go:117] "RemoveContainer" containerID="c05af9bd69451273c25ddddf6b11cdd7b8f9b6e3e16d18f1c4a962321905696e" Jan 31 09:05:03 crc kubenswrapper[4732]: I0131 09:05:03.118259 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d7jxg" event={"ID":"a7533049-a0d8-4488-bed6-2a9b28212061","Type":"ContainerStarted","Data":"711e2d4662cca5ec3b889c4e8addee987bb0a90f328fc5a8da0fd85d02eab978"} Jan 31 09:05:03 crc kubenswrapper[4732]: I0131 09:05:03.118625 4732 status_manager.go:851] "Failed to get status for pod" podUID="9039963e-96e4-4b4d-abdd-79f0429da944" pod="openshift-marketplace/certified-operators-2h57x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-2h57x\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:03 crc kubenswrapper[4732]: I0131 09:05:03.118990 4732 status_manager.go:851] "Failed to get status for pod" podUID="a39f958d-6d9b-4a4a-9ec9-cfb1f96b6f45" pod="openshift-marketplace/redhat-operators-4pkzq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-4pkzq\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:03 crc kubenswrapper[4732]: I0131 09:05:03.119463 4732 status_manager.go:851] "Failed to get status for pod" podUID="a7533049-a0d8-4488-bed6-2a9b28212061" pod="openshift-marketplace/community-operators-d7jxg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-d7jxg\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:03 crc kubenswrapper[4732]: I0131 09:05:03.119812 4732 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:03 crc kubenswrapper[4732]: I0131 09:05:03.120308 4732 status_manager.go:851] "Failed to get status for pod" podUID="e90ec082-a189-4726-8049-2151ddf77961" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:04 crc kubenswrapper[4732]: I0131 09:05:04.015989 4732 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:6443/readyz\": dial tcp 192.168.126.11:6443: connect: connection refused" start-of-body= Jan 31 09:05:04 crc kubenswrapper[4732]: I0131 09:05:04.016560 4732 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.126.11:6443/readyz\": dial tcp 192.168.126.11:6443: connect: connection refused" Jan 31 09:05:04 crc kubenswrapper[4732]: I0131 09:05:04.125590 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 31 09:05:04 crc kubenswrapper[4732]: I0131 09:05:04.127563 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"de7e77b83a8e93a3b4873676b3b8bcdbab14d5437e495e34e69b839aa521fac3"} Jan 31 09:05:04 crc kubenswrapper[4732]: I0131 09:05:04.128105 4732 status_manager.go:851] "Failed to get status for pod" podUID="9039963e-96e4-4b4d-abdd-79f0429da944" pod="openshift-marketplace/certified-operators-2h57x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-2h57x\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:04 crc kubenswrapper[4732]: E0131 09:05:04.128130 4732 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.129.56.231:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 31 09:05:04 crc kubenswrapper[4732]: I0131 09:05:04.128368 4732 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:04 crc kubenswrapper[4732]: I0131 09:05:04.128681 4732 status_manager.go:851] "Failed to get status for pod" podUID="a39f958d-6d9b-4a4a-9ec9-cfb1f96b6f45" pod="openshift-marketplace/redhat-operators-4pkzq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-4pkzq\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:04 crc kubenswrapper[4732]: I0131 09:05:04.128951 4732 status_manager.go:851] "Failed to get status for pod" podUID="a7533049-a0d8-4488-bed6-2a9b28212061" pod="openshift-marketplace/community-operators-d7jxg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-d7jxg\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:04 crc kubenswrapper[4732]: I0131 09:05:04.129188 4732 status_manager.go:851] "Failed to get status for pod" podUID="e90ec082-a189-4726-8049-2151ddf77961" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:04 crc kubenswrapper[4732]: I0131 09:05:04.129745 4732 generic.go:334] "Generic (PLEG): container finished" podID="a7533049-a0d8-4488-bed6-2a9b28212061" containerID="711e2d4662cca5ec3b889c4e8addee987bb0a90f328fc5a8da0fd85d02eab978" exitCode=0 Jan 31 09:05:04 crc kubenswrapper[4732]: I0131 09:05:04.129823 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-d7jxg" event={"ID":"a7533049-a0d8-4488-bed6-2a9b28212061","Type":"ContainerDied","Data":"711e2d4662cca5ec3b889c4e8addee987bb0a90f328fc5a8da0fd85d02eab978"} Jan 31 09:05:04 crc kubenswrapper[4732]: I0131 09:05:04.130240 4732 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:04 crc kubenswrapper[4732]: I0131 09:05:04.130481 4732 status_manager.go:851] "Failed to get status for pod" podUID="a39f958d-6d9b-4a4a-9ec9-cfb1f96b6f45" pod="openshift-marketplace/redhat-operators-4pkzq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-4pkzq\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:04 crc kubenswrapper[4732]: I0131 09:05:04.131468 4732 status_manager.go:851] "Failed to get status for pod" podUID="a7533049-a0d8-4488-bed6-2a9b28212061" pod="openshift-marketplace/community-operators-d7jxg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-d7jxg\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:04 crc kubenswrapper[4732]: I0131 09:05:04.131724 4732 status_manager.go:851] "Failed to get status for pod" podUID="e90ec082-a189-4726-8049-2151ddf77961" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:04 crc kubenswrapper[4732]: I0131 09:05:04.131987 4732 status_manager.go:851] "Failed to get status for pod" podUID="9039963e-96e4-4b4d-abdd-79f0429da944" pod="openshift-marketplace/certified-operators-2h57x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-2h57x\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:04 crc kubenswrapper[4732]: I0131 09:05:04.132143 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4pkzq" event={"ID":"a39f958d-6d9b-4a4a-9ec9-cfb1f96b6f45","Type":"ContainerStarted","Data":"d9a3c754c27f8d01db64275a210cc130ed2e117a9a1d0dca50f48a1ad0c4749e"} Jan 31 09:05:04 crc kubenswrapper[4732]: I0131 09:05:04.132820 4732 status_manager.go:851] "Failed to get status for pod" podUID="a7533049-a0d8-4488-bed6-2a9b28212061" pod="openshift-marketplace/community-operators-d7jxg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-d7jxg\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:04 crc kubenswrapper[4732]: I0131 09:05:04.133092 4732 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:04 crc kubenswrapper[4732]: I0131 09:05:04.133912 4732 status_manager.go:851] "Failed to get status for pod" podUID="a39f958d-6d9b-4a4a-9ec9-cfb1f96b6f45" pod="openshift-marketplace/redhat-operators-4pkzq" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-4pkzq\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:04 crc kubenswrapper[4732]: I0131 09:05:04.134577 4732 status_manager.go:851] "Failed to get status for pod" podUID="e90ec082-a189-4726-8049-2151ddf77961" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:04 crc kubenswrapper[4732]: I0131 09:05:04.134855 4732 status_manager.go:851] "Failed to get status for pod" podUID="9039963e-96e4-4b4d-abdd-79f0429da944" pod="openshift-marketplace/certified-operators-2h57x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-2h57x\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:04 crc kubenswrapper[4732]: I0131 09:05:04.393120 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 31 09:05:04 crc kubenswrapper[4732]: I0131 09:05:04.394394 4732 status_manager.go:851] "Failed to get status for pod" podUID="a7533049-a0d8-4488-bed6-2a9b28212061" pod="openshift-marketplace/community-operators-d7jxg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-d7jxg\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:04 crc kubenswrapper[4732]: I0131 09:05:04.394903 4732 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:04 crc kubenswrapper[4732]: I0131 09:05:04.395402 4732 status_manager.go:851] "Failed to get status for pod" podUID="a39f958d-6d9b-4a4a-9ec9-cfb1f96b6f45" pod="openshift-marketplace/redhat-operators-4pkzq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-4pkzq\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:04 crc kubenswrapper[4732]: I0131 09:05:04.395796 4732 status_manager.go:851] "Failed to get status for pod" podUID="e90ec082-a189-4726-8049-2151ddf77961" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:04 crc kubenswrapper[4732]: I0131 09:05:04.396054 4732 status_manager.go:851] "Failed to get status for pod" podUID="9039963e-96e4-4b4d-abdd-79f0429da944" pod="openshift-marketplace/certified-operators-2h57x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-2h57x\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:04 crc kubenswrapper[4732]: I0131 09:05:04.505884 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e90ec082-a189-4726-8049-2151ddf77961-var-lock\") pod \"e90ec082-a189-4726-8049-2151ddf77961\" (UID: \"e90ec082-a189-4726-8049-2151ddf77961\") " Jan 31 09:05:04 crc kubenswrapper[4732]: I0131 09:05:04.506029 4732 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e90ec082-a189-4726-8049-2151ddf77961-kubelet-dir\") pod \"e90ec082-a189-4726-8049-2151ddf77961\" (UID: \"e90ec082-a189-4726-8049-2151ddf77961\") " Jan 31 09:05:04 crc kubenswrapper[4732]: I0131 09:05:04.506060 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e90ec082-a189-4726-8049-2151ddf77961-kube-api-access\") pod \"e90ec082-a189-4726-8049-2151ddf77961\" (UID: \"e90ec082-a189-4726-8049-2151ddf77961\") " Jan 31 09:05:04 crc kubenswrapper[4732]: I0131 09:05:04.506019 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e90ec082-a189-4726-8049-2151ddf77961-var-lock" (OuterVolumeSpecName: "var-lock") pod "e90ec082-a189-4726-8049-2151ddf77961" (UID: "e90ec082-a189-4726-8049-2151ddf77961"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:05:04 crc kubenswrapper[4732]: I0131 09:05:04.506083 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e90ec082-a189-4726-8049-2151ddf77961-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "e90ec082-a189-4726-8049-2151ddf77961" (UID: "e90ec082-a189-4726-8049-2151ddf77961"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:05:04 crc kubenswrapper[4732]: I0131 09:05:04.511710 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e90ec082-a189-4726-8049-2151ddf77961-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e90ec082-a189-4726-8049-2151ddf77961" (UID: "e90ec082-a189-4726-8049-2151ddf77961"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:05:04 crc kubenswrapper[4732]: I0131 09:05:04.607239 4732 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e90ec082-a189-4726-8049-2151ddf77961-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:04 crc kubenswrapper[4732]: I0131 09:05:04.607271 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e90ec082-a189-4726-8049-2151ddf77961-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:04 crc kubenswrapper[4732]: I0131 09:05:04.607284 4732 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e90ec082-a189-4726-8049-2151ddf77961-var-lock\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:04 crc kubenswrapper[4732]: E0131 09:05:04.625954 4732 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:04 crc kubenswrapper[4732]: E0131 09:05:04.626460 4732 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:04 crc kubenswrapper[4732]: E0131 09:05:04.626876 4732 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:04 crc kubenswrapper[4732]: E0131 09:05:04.627088 4732 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:04 crc kubenswrapper[4732]: E0131 09:05:04.627302 4732 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:04 crc kubenswrapper[4732]: I0131 09:05:04.627330 4732 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Jan 31 09:05:04 crc kubenswrapper[4732]: E0131 09:05:04.627522 4732 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.231:6443: connect: connection refused" interval="200ms" Jan 31 09:05:04 crc kubenswrapper[4732]: E0131 09:05:04.828362 4732 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.231:6443: connect: connection refused" interval="400ms" Jan 31 09:05:05 crc kubenswrapper[4732]: I0131 09:05:05.027349 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 31 09:05:05 crc kubenswrapper[4732]: I0131 09:05:05.028327 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:05:05 crc kubenswrapper[4732]: I0131 09:05:05.028919 4732 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:05 crc kubenswrapper[4732]: I0131 09:05:05.029435 4732 status_manager.go:851] "Failed to get status for pod" podUID="a39f958d-6d9b-4a4a-9ec9-cfb1f96b6f45" pod="openshift-marketplace/redhat-operators-4pkzq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-4pkzq\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:05 crc kubenswrapper[4732]: I0131 09:05:05.029915 4732 status_manager.go:851] "Failed to get status for pod" podUID="a7533049-a0d8-4488-bed6-2a9b28212061" pod="openshift-marketplace/community-operators-d7jxg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-d7jxg\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:05 crc kubenswrapper[4732]: I0131 09:05:05.030169 4732 status_manager.go:851] "Failed to get status for pod" podUID="e90ec082-a189-4726-8049-2151ddf77961" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:05 crc kubenswrapper[4732]: I0131 09:05:05.030448 4732 status_manager.go:851] "Failed to get status for pod" podUID="9039963e-96e4-4b4d-abdd-79f0429da944" pod="openshift-marketplace/certified-operators-2h57x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-2h57x\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:05 crc kubenswrapper[4732]: I0131 09:05:05.129314 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 31 09:05:05 crc kubenswrapper[4732]: I0131 09:05:05.129461 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 31 09:05:05 crc kubenswrapper[4732]: I0131 09:05:05.129473 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:05:05 crc kubenswrapper[4732]: I0131 09:05:05.129512 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 31 09:05:05 crc kubenswrapper[4732]: I0131 09:05:05.129547 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:05:05 crc kubenswrapper[4732]: I0131 09:05:05.129639 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:05:05 crc kubenswrapper[4732]: I0131 09:05:05.129951 4732 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:05 crc kubenswrapper[4732]: I0131 09:05:05.129969 4732 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:05 crc kubenswrapper[4732]: I0131 09:05:05.129980 4732 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:05 crc kubenswrapper[4732]: I0131 09:05:05.143985 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"e90ec082-a189-4726-8049-2151ddf77961","Type":"ContainerDied","Data":"68aff39b648c97e8da52037b14095371b6089843216d06b0916075106acf4b04"} Jan 31 09:05:05 crc kubenswrapper[4732]: I0131 09:05:05.144022 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68aff39b648c97e8da52037b14095371b6089843216d06b0916075106acf4b04" Jan 31 09:05:05 crc kubenswrapper[4732]: I0131 09:05:05.144087 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 31 09:05:05 crc kubenswrapper[4732]: I0131 09:05:05.147653 4732 status_manager.go:851] "Failed to get status for pod" podUID="9039963e-96e4-4b4d-abdd-79f0429da944" pod="openshift-marketplace/certified-operators-2h57x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-2h57x\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:05 crc kubenswrapper[4732]: I0131 09:05:05.148031 4732 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:05 crc kubenswrapper[4732]: I0131 09:05:05.148445 4732 status_manager.go:851] "Failed to get status for pod" podUID="a39f958d-6d9b-4a4a-9ec9-cfb1f96b6f45" pod="openshift-marketplace/redhat-operators-4pkzq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-4pkzq\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:05 crc kubenswrapper[4732]: I0131 09:05:05.148691 4732 status_manager.go:851] "Failed to get status for pod" podUID="a7533049-a0d8-4488-bed6-2a9b28212061" pod="openshift-marketplace/community-operators-d7jxg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-d7jxg\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:05 crc kubenswrapper[4732]: I0131 09:05:05.148932 4732 status_manager.go:851] "Failed to get status for pod" podUID="e90ec082-a189-4726-8049-2151ddf77961" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:05 crc kubenswrapper[4732]: I0131 09:05:05.149108 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 31 09:05:05 crc kubenswrapper[4732]: I0131 09:05:05.149961 4732 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="2b5cbd45e357205f8bfc648396c0f684d6788343f687504feb45618d9082e5cd" exitCode=0 Jan 31 09:05:05 crc kubenswrapper[4732]: I0131 09:05:05.150072 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:05:05 crc kubenswrapper[4732]: I0131 09:05:05.150865 4732 scope.go:117] "RemoveContainer" containerID="c276b6fdf2c47bfa38c4cf312bf6b893388c4291d306738fd6c31738cbd7483b" Jan 31 09:05:05 crc kubenswrapper[4732]: I0131 09:05:05.163131 4732 status_manager.go:851] "Failed to get status for pod" podUID="9039963e-96e4-4b4d-abdd-79f0429da944" pod="openshift-marketplace/certified-operators-2h57x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-2h57x\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:05 crc kubenswrapper[4732]: I0131 09:05:05.163600 4732 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:05 crc kubenswrapper[4732]: I0131 09:05:05.164057 4732 status_manager.go:851] "Failed to get status for pod" podUID="a39f958d-6d9b-4a4a-9ec9-cfb1f96b6f45" pod="openshift-marketplace/redhat-operators-4pkzq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-4pkzq\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:05 crc kubenswrapper[4732]: I0131 09:05:05.164361 4732 status_manager.go:851] "Failed to get status for pod" podUID="a7533049-a0d8-4488-bed6-2a9b28212061" pod="openshift-marketplace/community-operators-d7jxg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-d7jxg\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:05 crc kubenswrapper[4732]: I0131 09:05:05.164656 4732 status_manager.go:851] "Failed to get status for pod" podUID="e90ec082-a189-4726-8049-2151ddf77961" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:05 crc kubenswrapper[4732]: I0131 09:05:05.164851 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d7jxg" event={"ID":"a7533049-a0d8-4488-bed6-2a9b28212061","Type":"ContainerStarted","Data":"ed84a78e28087f4c77e45e5411b14a6396a5280235d5cb0b510fa269633f7cd6"} Jan 31 09:05:05 crc kubenswrapper[4732]: E0131 09:05:05.165413 4732 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.129.56.231:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 31 09:05:05 crc kubenswrapper[4732]: I0131 09:05:05.166184 4732 scope.go:117] "RemoveContainer" containerID="83c2048db4772c1a5f60b21340f773ab5c651045d4dac9edc15e91a19d15dce9" Jan 31 09:05:05 crc kubenswrapper[4732]: I0131 09:05:05.182948 4732 scope.go:117] "RemoveContainer" containerID="52b8ba07937855cb1fd0e1db567f90e08848fca583b7174153918e2d6356d722" Jan 31 09:05:05 crc kubenswrapper[4732]: I0131 09:05:05.196159 4732 scope.go:117] "RemoveContainer" containerID="89ce42a35726dd1fc6aea0eb2e5baa490de04566cd6d6298185260e896a8def7" Jan 31 09:05:05 crc kubenswrapper[4732]: I0131 09:05:05.208583 4732 scope.go:117] "RemoveContainer" 
containerID="2b5cbd45e357205f8bfc648396c0f684d6788343f687504feb45618d9082e5cd" Jan 31 09:05:05 crc kubenswrapper[4732]: I0131 09:05:05.227397 4732 scope.go:117] "RemoveContainer" containerID="31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f" Jan 31 09:05:05 crc kubenswrapper[4732]: E0131 09:05:05.229956 4732 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.231:6443: connect: connection refused" interval="800ms" Jan 31 09:05:05 crc kubenswrapper[4732]: I0131 09:05:05.244074 4732 scope.go:117] "RemoveContainer" containerID="c276b6fdf2c47bfa38c4cf312bf6b893388c4291d306738fd6c31738cbd7483b" Jan 31 09:05:05 crc kubenswrapper[4732]: E0131 09:05:05.244495 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c276b6fdf2c47bfa38c4cf312bf6b893388c4291d306738fd6c31738cbd7483b\": container with ID starting with c276b6fdf2c47bfa38c4cf312bf6b893388c4291d306738fd6c31738cbd7483b not found: ID does not exist" containerID="c276b6fdf2c47bfa38c4cf312bf6b893388c4291d306738fd6c31738cbd7483b" Jan 31 09:05:05 crc kubenswrapper[4732]: I0131 09:05:05.244547 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c276b6fdf2c47bfa38c4cf312bf6b893388c4291d306738fd6c31738cbd7483b"} err="failed to get container status \"c276b6fdf2c47bfa38c4cf312bf6b893388c4291d306738fd6c31738cbd7483b\": rpc error: code = NotFound desc = could not find container \"c276b6fdf2c47bfa38c4cf312bf6b893388c4291d306738fd6c31738cbd7483b\": container with ID starting with c276b6fdf2c47bfa38c4cf312bf6b893388c4291d306738fd6c31738cbd7483b not found: ID does not exist" Jan 31 09:05:05 crc kubenswrapper[4732]: I0131 09:05:05.244582 4732 scope.go:117] "RemoveContainer" containerID="83c2048db4772c1a5f60b21340f773ab5c651045d4dac9edc15e91a19d15dce9" Jan 31 09:05:05 crc kubenswrapper[4732]: E0131 09:05:05.244920 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83c2048db4772c1a5f60b21340f773ab5c651045d4dac9edc15e91a19d15dce9\": container with ID starting with 83c2048db4772c1a5f60b21340f773ab5c651045d4dac9edc15e91a19d15dce9 not found: ID does not exist" containerID="83c2048db4772c1a5f60b21340f773ab5c651045d4dac9edc15e91a19d15dce9" Jan 31 09:05:05 crc kubenswrapper[4732]: I0131 09:05:05.244966 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83c2048db4772c1a5f60b21340f773ab5c651045d4dac9edc15e91a19d15dce9"} err="failed to get container status \"83c2048db4772c1a5f60b21340f773ab5c651045d4dac9edc15e91a19d15dce9\": rpc error: code = NotFound desc = could not find container \"83c2048db4772c1a5f60b21340f773ab5c651045d4dac9edc15e91a19d15dce9\": container with ID starting with 83c2048db4772c1a5f60b21340f773ab5c651045d4dac9edc15e91a19d15dce9 not found: ID does not exist" Jan 31 09:05:05 crc kubenswrapper[4732]: I0131 09:05:05.244993 4732 scope.go:117] "RemoveContainer" containerID="52b8ba07937855cb1fd0e1db567f90e08848fca583b7174153918e2d6356d722" Jan 31 09:05:05 crc kubenswrapper[4732]: E0131 09:05:05.245240 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52b8ba07937855cb1fd0e1db567f90e08848fca583b7174153918e2d6356d722\": container with ID starting with 
52b8ba07937855cb1fd0e1db567f90e08848fca583b7174153918e2d6356d722 not found: ID does not exist" containerID="52b8ba07937855cb1fd0e1db567f90e08848fca583b7174153918e2d6356d722" Jan 31 09:05:05 crc kubenswrapper[4732]: I0131 09:05:05.245268 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52b8ba07937855cb1fd0e1db567f90e08848fca583b7174153918e2d6356d722"} err="failed to get container status \"52b8ba07937855cb1fd0e1db567f90e08848fca583b7174153918e2d6356d722\": rpc error: code = NotFound desc = could not find container \"52b8ba07937855cb1fd0e1db567f90e08848fca583b7174153918e2d6356d722\": container with ID starting with 52b8ba07937855cb1fd0e1db567f90e08848fca583b7174153918e2d6356d722 not found: ID does not exist" Jan 31 09:05:05 crc kubenswrapper[4732]: I0131 09:05:05.245288 4732 scope.go:117] "RemoveContainer" containerID="89ce42a35726dd1fc6aea0eb2e5baa490de04566cd6d6298185260e896a8def7" Jan 31 09:05:05 crc kubenswrapper[4732]: E0131 09:05:05.245471 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89ce42a35726dd1fc6aea0eb2e5baa490de04566cd6d6298185260e896a8def7\": container with ID starting with 89ce42a35726dd1fc6aea0eb2e5baa490de04566cd6d6298185260e896a8def7 not found: ID does not exist" containerID="89ce42a35726dd1fc6aea0eb2e5baa490de04566cd6d6298185260e896a8def7" Jan 31 09:05:05 crc kubenswrapper[4732]: I0131 09:05:05.245500 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89ce42a35726dd1fc6aea0eb2e5baa490de04566cd6d6298185260e896a8def7"} err="failed to get container status \"89ce42a35726dd1fc6aea0eb2e5baa490de04566cd6d6298185260e896a8def7\": rpc error: code = NotFound desc = could not find container \"89ce42a35726dd1fc6aea0eb2e5baa490de04566cd6d6298185260e896a8def7\": container with ID starting with 89ce42a35726dd1fc6aea0eb2e5baa490de04566cd6d6298185260e896a8def7 not found: ID does not exist" Jan 31 09:05:05 crc kubenswrapper[4732]: I0131 09:05:05.245515 4732 scope.go:117] "RemoveContainer" containerID="2b5cbd45e357205f8bfc648396c0f684d6788343f687504feb45618d9082e5cd" Jan 31 09:05:05 crc kubenswrapper[4732]: E0131 09:05:05.245706 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b5cbd45e357205f8bfc648396c0f684d6788343f687504feb45618d9082e5cd\": container with ID starting with 2b5cbd45e357205f8bfc648396c0f684d6788343f687504feb45618d9082e5cd not found: ID does not exist" containerID="2b5cbd45e357205f8bfc648396c0f684d6788343f687504feb45618d9082e5cd" Jan 31 09:05:05 crc kubenswrapper[4732]: I0131 09:05:05.245726 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b5cbd45e357205f8bfc648396c0f684d6788343f687504feb45618d9082e5cd"} err="failed to get container status \"2b5cbd45e357205f8bfc648396c0f684d6788343f687504feb45618d9082e5cd\": rpc error: code = NotFound desc = could not find container \"2b5cbd45e357205f8bfc648396c0f684d6788343f687504feb45618d9082e5cd\": container with ID starting with 2b5cbd45e357205f8bfc648396c0f684d6788343f687504feb45618d9082e5cd not found: ID does not exist" Jan 31 09:05:05 crc kubenswrapper[4732]: I0131 09:05:05.245741 4732 scope.go:117] "RemoveContainer" containerID="31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f" Jan 31 09:05:05 crc kubenswrapper[4732]: E0131 09:05:05.245939 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc 
error: code = NotFound desc = could not find container \"31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f\": container with ID starting with 31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f not found: ID does not exist" containerID="31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f" Jan 31 09:05:05 crc kubenswrapper[4732]: I0131 09:05:05.245962 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f"} err="failed to get container status \"31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f\": rpc error: code = NotFound desc = could not find container \"31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f\": container with ID starting with 31c705a62170aac7bfc2c3528b20d98f75257896646d26d74f7a6cafb5be7a2f not found: ID does not exist" Jan 31 09:05:06 crc kubenswrapper[4732]: E0131 09:05:06.030622 4732 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.231:6443: connect: connection refused" interval="1.6s" Jan 31 09:05:06 crc kubenswrapper[4732]: I0131 09:05:06.169789 4732 status_manager.go:851] "Failed to get status for pod" podUID="9039963e-96e4-4b4d-abdd-79f0429da944" pod="openshift-marketplace/certified-operators-2h57x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-2h57x\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:06 crc kubenswrapper[4732]: I0131 09:05:06.170125 4732 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:06 crc kubenswrapper[4732]: I0131 09:05:06.170475 4732 status_manager.go:851] "Failed to get status for pod" podUID="a39f958d-6d9b-4a4a-9ec9-cfb1f96b6f45" pod="openshift-marketplace/redhat-operators-4pkzq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-4pkzq\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:06 crc kubenswrapper[4732]: I0131 09:05:06.170646 4732 status_manager.go:851] "Failed to get status for pod" podUID="a7533049-a0d8-4488-bed6-2a9b28212061" pod="openshift-marketplace/community-operators-d7jxg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-d7jxg\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:06 crc kubenswrapper[4732]: I0131 09:05:06.170823 4732 status_manager.go:851] "Failed to get status for pod" podUID="e90ec082-a189-4726-8049-2151ddf77961" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:06 crc kubenswrapper[4732]: I0131 09:05:06.549045 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes"
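Each "RemoveContainer" in the run above is answered by a NotFound error from the runtime, and the resulting "DeleteContainer returned error" entry is logged at info level rather than treated as a failure: if the container is already gone, the goal of the deletion is met. A minimal Go sketch of that idempotent-delete pattern, assuming a hypothetical removeContainer helper and a sentinel errNotFound standing in for the runtime's NotFound RPC code (not kubelet source):

    // idempotent_delete_sketch.go -- illustrative pattern, not kubelet code.
    package main

    import (
        "errors"
        "fmt"
    )

    // errNotFound stands in for the "could not find container ... ID does
    // not exist" NotFound responses seen in the entries above.
    var errNotFound = errors.New("container not found")

    // removeContainer treats NotFound as success: the desired state (the
    // container is gone) already holds, so the error is logged and dropped
    // instead of aborting the cleanup.
    func removeContainer(id string, del func(string) error) error {
        if err := del(id); err != nil {
            if errors.Is(err, errNotFound) {
                fmt.Printf("DeleteContainer returned error for %s: %v (ignored)\n", id, err)
                return nil
            }
            return err
        }
        return nil
    }

    func main() {
        // Simulated runtime that no longer knows the container.
        del := func(string) error { return errNotFound }
        if err := removeContainer("example-container-id", del); err != nil {
            fmt.Println("unexpected:", err)
        }
    }

This is why the sequence reads as harmless noise during the static-pod rollout: the containers were already reaped, and the second delete pass simply confirms it.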
Jan 31 09:05:07 crc kubenswrapper[4732]: E0131 09:05:07.552317 4732 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/events\": dial tcp 38.129.56.231:6443: connect: connection refused" event="&Event{ObjectMeta:{certified-operators-2h57x.188fc57928be0c23 openshift-marketplace 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-marketplace,Name:certified-operators-2h57x,UID:9039963e-96e4-4b4d-abdd-79f0429da944,APIVersion:v1,ResourceVersion:29558,FieldPath:spec.containers{registry-server},},Reason:Created,Message:Created container registry-server,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-31 09:05:02.745152547 +0000 UTC m=+241.051028751,LastTimestamp:2026-01-31 09:05:02.745152547 +0000 UTC m=+241.051028751,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 31 09:05:07 crc kubenswrapper[4732]: E0131 09:05:07.632089 4732 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.231:6443: connect: connection refused" interval="3.2s" Jan 31 09:05:07 crc kubenswrapper[4732]: I0131 09:05:07.908048 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-krjtb" Jan 31 09:05:07 crc kubenswrapper[4732]: I0131 09:05:07.908408 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-krjtb" Jan 31 09:05:07 crc kubenswrapper[4732]: I0131 09:05:07.955315 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-krjtb" Jan 31 09:05:07 crc kubenswrapper[4732]: I0131 09:05:07.955922 4732 status_manager.go:851] "Failed to get status for pod" podUID="17e07aee-c4b1-4011-8442-c6dcfc4f415c" pod="openshift-marketplace/redhat-marketplace-krjtb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-krjtb\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:07 crc kubenswrapper[4732]: I0131 09:05:07.956109 4732 status_manager.go:851] "Failed to get status for pod" podUID="a39f958d-6d9b-4a4a-9ec9-cfb1f96b6f45" pod="openshift-marketplace/redhat-operators-4pkzq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-4pkzq\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:07 crc kubenswrapper[4732]: I0131 09:05:07.956276 4732 status_manager.go:851] "Failed to get status for pod" podUID="a7533049-a0d8-4488-bed6-2a9b28212061" pod="openshift-marketplace/community-operators-d7jxg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-d7jxg\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:07 crc kubenswrapper[4732]: I0131 09:05:07.956413 4732 status_manager.go:851] "Failed to get status for pod" podUID="e90ec082-a189-4726-8049-2151ddf77961" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:07 crc kubenswrapper[4732]: I0131 09:05:07.956543 4732 status_manager.go:851] "Failed to get status for pod" 
podUID="9039963e-96e4-4b4d-abdd-79f0429da944" pod="openshift-marketplace/certified-operators-2h57x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-2h57x\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:08 crc kubenswrapper[4732]: I0131 09:05:08.248841 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-krjtb" Jan 31 09:05:08 crc kubenswrapper[4732]: I0131 09:05:08.249701 4732 status_manager.go:851] "Failed to get status for pod" podUID="17e07aee-c4b1-4011-8442-c6dcfc4f415c" pod="openshift-marketplace/redhat-marketplace-krjtb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-krjtb\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:08 crc kubenswrapper[4732]: I0131 09:05:08.250081 4732 status_manager.go:851] "Failed to get status for pod" podUID="a39f958d-6d9b-4a4a-9ec9-cfb1f96b6f45" pod="openshift-marketplace/redhat-operators-4pkzq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-4pkzq\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:08 crc kubenswrapper[4732]: I0131 09:05:08.250383 4732 status_manager.go:851] "Failed to get status for pod" podUID="a7533049-a0d8-4488-bed6-2a9b28212061" pod="openshift-marketplace/community-operators-d7jxg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-d7jxg\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:08 crc kubenswrapper[4732]: I0131 09:05:08.250749 4732 status_manager.go:851] "Failed to get status for pod" podUID="e90ec082-a189-4726-8049-2151ddf77961" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:08 crc kubenswrapper[4732]: I0131 09:05:08.251133 4732 status_manager.go:851] "Failed to get status for pod" podUID="9039963e-96e4-4b4d-abdd-79f0429da944" pod="openshift-marketplace/certified-operators-2h57x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-2h57x\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:09 crc kubenswrapper[4732]: I0131 09:05:09.111697 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2h57x" Jan 31 09:05:09 crc kubenswrapper[4732]: I0131 09:05:09.111755 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2h57x" Jan 31 09:05:09 crc kubenswrapper[4732]: I0131 09:05:09.177362 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2h57x" Jan 31 09:05:09 crc kubenswrapper[4732]: I0131 09:05:09.177918 4732 status_manager.go:851] "Failed to get status for pod" podUID="17e07aee-c4b1-4011-8442-c6dcfc4f415c" pod="openshift-marketplace/redhat-marketplace-krjtb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-krjtb\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:09 crc kubenswrapper[4732]: I0131 09:05:09.178197 4732 status_manager.go:851] "Failed to get status for pod" 
podUID="a39f958d-6d9b-4a4a-9ec9-cfb1f96b6f45" pod="openshift-marketplace/redhat-operators-4pkzq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-4pkzq\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:09 crc kubenswrapper[4732]: I0131 09:05:09.178453 4732 status_manager.go:851] "Failed to get status for pod" podUID="a7533049-a0d8-4488-bed6-2a9b28212061" pod="openshift-marketplace/community-operators-d7jxg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-d7jxg\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:09 crc kubenswrapper[4732]: I0131 09:05:09.178710 4732 status_manager.go:851] "Failed to get status for pod" podUID="e90ec082-a189-4726-8049-2151ddf77961" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:09 crc kubenswrapper[4732]: I0131 09:05:09.178986 4732 status_manager.go:851] "Failed to get status for pod" podUID="9039963e-96e4-4b4d-abdd-79f0429da944" pod="openshift-marketplace/certified-operators-2h57x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-2h57x\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:09 crc kubenswrapper[4732]: I0131 09:05:09.225686 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2h57x" Jan 31 09:05:09 crc kubenswrapper[4732]: I0131 09:05:09.226242 4732 status_manager.go:851] "Failed to get status for pod" podUID="17e07aee-c4b1-4011-8442-c6dcfc4f415c" pod="openshift-marketplace/redhat-marketplace-krjtb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-krjtb\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:09 crc kubenswrapper[4732]: I0131 09:05:09.226565 4732 status_manager.go:851] "Failed to get status for pod" podUID="a39f958d-6d9b-4a4a-9ec9-cfb1f96b6f45" pod="openshift-marketplace/redhat-operators-4pkzq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-4pkzq\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:09 crc kubenswrapper[4732]: I0131 09:05:09.226813 4732 status_manager.go:851] "Failed to get status for pod" podUID="a7533049-a0d8-4488-bed6-2a9b28212061" pod="openshift-marketplace/community-operators-d7jxg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-d7jxg\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:09 crc kubenswrapper[4732]: I0131 09:05:09.227006 4732 status_manager.go:851] "Failed to get status for pod" podUID="e90ec082-a189-4726-8049-2151ddf77961" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:09 crc kubenswrapper[4732]: I0131 09:05:09.227199 4732 status_manager.go:851] "Failed to get status for pod" podUID="9039963e-96e4-4b4d-abdd-79f0429da944" pod="openshift-marketplace/certified-operators-2h57x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-2h57x\": 
dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:10 crc kubenswrapper[4732]: I0131 09:05:10.311244 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4pkzq" Jan 31 09:05:10 crc kubenswrapper[4732]: I0131 09:05:10.311590 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4pkzq" Jan 31 09:05:10 crc kubenswrapper[4732]: I0131 09:05:10.351286 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4pkzq" Jan 31 09:05:10 crc kubenswrapper[4732]: I0131 09:05:10.352107 4732 status_manager.go:851] "Failed to get status for pod" podUID="9039963e-96e4-4b4d-abdd-79f0429da944" pod="openshift-marketplace/certified-operators-2h57x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-2h57x\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:10 crc kubenswrapper[4732]: I0131 09:05:10.352523 4732 status_manager.go:851] "Failed to get status for pod" podUID="17e07aee-c4b1-4011-8442-c6dcfc4f415c" pod="openshift-marketplace/redhat-marketplace-krjtb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-krjtb\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:10 crc kubenswrapper[4732]: I0131 09:05:10.352844 4732 status_manager.go:851] "Failed to get status for pod" podUID="a39f958d-6d9b-4a4a-9ec9-cfb1f96b6f45" pod="openshift-marketplace/redhat-operators-4pkzq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-4pkzq\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:10 crc kubenswrapper[4732]: I0131 09:05:10.353176 4732 status_manager.go:851] "Failed to get status for pod" podUID="a7533049-a0d8-4488-bed6-2a9b28212061" pod="openshift-marketplace/community-operators-d7jxg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-d7jxg\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:10 crc kubenswrapper[4732]: I0131 09:05:10.353463 4732 status_manager.go:851] "Failed to get status for pod" podUID="e90ec082-a189-4726-8049-2151ddf77961" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:10 crc kubenswrapper[4732]: E0131 09:05:10.832946 4732 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.231:6443: connect: connection refused" interval="6.4s" Jan 31 09:05:10 crc kubenswrapper[4732]: I0131 09:05:10.922863 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-d7jxg" Jan 31 09:05:10 crc kubenswrapper[4732]: I0131 09:05:10.922924 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-d7jxg" Jan 31 09:05:10 crc kubenswrapper[4732]: I0131 09:05:10.969760 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-d7jxg" Jan 31 09:05:10 crc kubenswrapper[4732]: I0131 09:05:10.970427 4732 
status_manager.go:851] "Failed to get status for pod" podUID="17e07aee-c4b1-4011-8442-c6dcfc4f415c" pod="openshift-marketplace/redhat-marketplace-krjtb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-krjtb\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:10 crc kubenswrapper[4732]: I0131 09:05:10.971007 4732 status_manager.go:851] "Failed to get status for pod" podUID="a7533049-a0d8-4488-bed6-2a9b28212061" pod="openshift-marketplace/community-operators-d7jxg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-d7jxg\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:10 crc kubenswrapper[4732]: I0131 09:05:10.971473 4732 status_manager.go:851] "Failed to get status for pod" podUID="a39f958d-6d9b-4a4a-9ec9-cfb1f96b6f45" pod="openshift-marketplace/redhat-operators-4pkzq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-4pkzq\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:10 crc kubenswrapper[4732]: I0131 09:05:10.971855 4732 status_manager.go:851] "Failed to get status for pod" podUID="e90ec082-a189-4726-8049-2151ddf77961" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:10 crc kubenswrapper[4732]: I0131 09:05:10.972127 4732 status_manager.go:851] "Failed to get status for pod" podUID="9039963e-96e4-4b4d-abdd-79f0429da944" pod="openshift-marketplace/certified-operators-2h57x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-2h57x\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:11 crc kubenswrapper[4732]: I0131 09:05:11.231810 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-d7jxg" Jan 31 09:05:11 crc kubenswrapper[4732]: I0131 09:05:11.233150 4732 status_manager.go:851] "Failed to get status for pod" podUID="17e07aee-c4b1-4011-8442-c6dcfc4f415c" pod="openshift-marketplace/redhat-marketplace-krjtb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-krjtb\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:11 crc kubenswrapper[4732]: I0131 09:05:11.233836 4732 status_manager.go:851] "Failed to get status for pod" podUID="a39f958d-6d9b-4a4a-9ec9-cfb1f96b6f45" pod="openshift-marketplace/redhat-operators-4pkzq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-4pkzq\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:11 crc kubenswrapper[4732]: I0131 09:05:11.234321 4732 status_manager.go:851] "Failed to get status for pod" podUID="a7533049-a0d8-4488-bed6-2a9b28212061" pod="openshift-marketplace/community-operators-d7jxg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-d7jxg\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:11 crc kubenswrapper[4732]: I0131 09:05:11.234545 4732 status_manager.go:851] "Failed to get status for pod" podUID="e90ec082-a189-4726-8049-2151ddf77961" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:11 crc kubenswrapper[4732]: I0131 09:05:11.234788 4732 status_manager.go:851] "Failed to get status for pod" podUID="9039963e-96e4-4b4d-abdd-79f0429da944" pod="openshift-marketplace/certified-operators-2h57x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-2h57x\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:11 crc kubenswrapper[4732]: I0131 09:05:11.241135 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4pkzq" Jan 31 09:05:11 crc kubenswrapper[4732]: I0131 09:05:11.241736 4732 status_manager.go:851] "Failed to get status for pod" podUID="17e07aee-c4b1-4011-8442-c6dcfc4f415c" pod="openshift-marketplace/redhat-marketplace-krjtb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-krjtb\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:11 crc kubenswrapper[4732]: I0131 09:05:11.242153 4732 status_manager.go:851] "Failed to get status for pod" podUID="a7533049-a0d8-4488-bed6-2a9b28212061" pod="openshift-marketplace/community-operators-d7jxg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-d7jxg\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:11 crc kubenswrapper[4732]: I0131 09:05:11.242710 4732 status_manager.go:851] "Failed to get status for pod" podUID="a39f958d-6d9b-4a4a-9ec9-cfb1f96b6f45" pod="openshift-marketplace/redhat-operators-4pkzq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-4pkzq\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:11 crc kubenswrapper[4732]: I0131 09:05:11.242936 4732 status_manager.go:851] "Failed to get status for pod" podUID="e90ec082-a189-4726-8049-2151ddf77961" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:11 crc kubenswrapper[4732]: I0131 09:05:11.243182 4732 status_manager.go:851] "Failed to get status for pod" podUID="9039963e-96e4-4b4d-abdd-79f0429da944" pod="openshift-marketplace/certified-operators-2h57x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-2h57x\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:12 crc kubenswrapper[4732]: I0131 09:05:12.549873 4732 status_manager.go:851] "Failed to get status for pod" podUID="17e07aee-c4b1-4011-8442-c6dcfc4f415c" pod="openshift-marketplace/redhat-marketplace-krjtb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-krjtb\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:12 crc kubenswrapper[4732]: I0131 09:05:12.550662 4732 status_manager.go:851] "Failed to get status for pod" podUID="a39f958d-6d9b-4a4a-9ec9-cfb1f96b6f45" pod="openshift-marketplace/redhat-operators-4pkzq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-4pkzq\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:12 crc kubenswrapper[4732]: I0131 
09:05:12.550859 4732 status_manager.go:851] "Failed to get status for pod" podUID="a7533049-a0d8-4488-bed6-2a9b28212061" pod="openshift-marketplace/community-operators-d7jxg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-d7jxg\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:12 crc kubenswrapper[4732]: I0131 09:05:12.551055 4732 status_manager.go:851] "Failed to get status for pod" podUID="e90ec082-a189-4726-8049-2151ddf77961" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:12 crc kubenswrapper[4732]: I0131 09:05:12.551200 4732 status_manager.go:851] "Failed to get status for pod" podUID="9039963e-96e4-4b4d-abdd-79f0429da944" pod="openshift-marketplace/certified-operators-2h57x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-2h57x\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:12 crc kubenswrapper[4732]: E0131 09:05:12.602523 4732 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.129.56.231:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-99dtb" volumeName="registry-storage"
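Every failed call in this stretch, from the status updates to the PVC fetch above, dies the same way: dial tcp 38.129.56.231:6443: connect: connection refused, where 38.129.56.231 is what api-int.crc.testing resolves to in these entries. The TCP handshake itself is rejected because nothing is listening on port 6443 while the apiserver restarts. A minimal Go sketch that reproduces the same dial, with an assumed 2-second timeout (illustrative only, not kubelet code):

    // dial_probe_sketch.go -- illustrative connectivity check.
    package main

    import (
        "fmt"
        "net"
        "time"
    )

    func main() {
        // A refused dial here matches the "connect: connection refused"
        // errors throughout this log; a successful dial means the
        // apiserver's listener is back.
        conn, err := net.DialTimeout("tcp", "api-int.crc.testing:6443", 2*time.Second)
        if err != nil {
            fmt.Println("apiserver not reachable:", err)
            return
        }
        conn.Close()
        fmt.Println("apiserver endpoint is accepting connections")
    }

Connection refused is a different signal from a timeout: it usually means the host is up but the port is closed, consistent with kube-apiserver being restarted in place rather than the node losing its network.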
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:05:14 crc kubenswrapper[4732]: I0131 09:05:14.542795 4732 status_manager.go:851] "Failed to get status for pod" podUID="17e07aee-c4b1-4011-8442-c6dcfc4f415c" pod="openshift-marketplace/redhat-marketplace-krjtb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-krjtb\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:14 crc kubenswrapper[4732]: I0131 09:05:14.543356 4732 status_manager.go:851] "Failed to get status for pod" podUID="a7533049-a0d8-4488-bed6-2a9b28212061" pod="openshift-marketplace/community-operators-d7jxg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-d7jxg\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:14 crc kubenswrapper[4732]: I0131 09:05:14.543600 4732 status_manager.go:851] "Failed to get status for pod" podUID="a39f958d-6d9b-4a4a-9ec9-cfb1f96b6f45" pod="openshift-marketplace/redhat-operators-4pkzq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-4pkzq\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:14 crc kubenswrapper[4732]: I0131 09:05:14.543940 4732 status_manager.go:851] "Failed to get status for pod" podUID="e90ec082-a189-4726-8049-2151ddf77961" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:14 crc kubenswrapper[4732]: I0131 09:05:14.544282 4732 status_manager.go:851] "Failed to get status for pod" podUID="9039963e-96e4-4b4d-abdd-79f0429da944" pod="openshift-marketplace/certified-operators-2h57x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-2h57x\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:14 crc kubenswrapper[4732]: I0131 09:05:14.554492 4732 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d1dc686f-fa2d-4cd4-95d4-874df2dc3a1c" Jan 31 09:05:14 crc kubenswrapper[4732]: I0131 09:05:14.554707 4732 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d1dc686f-fa2d-4cd4-95d4-874df2dc3a1c" Jan 31 09:05:14 crc kubenswrapper[4732]: E0131 09:05:14.555119 4732 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.231:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:05:14 crc kubenswrapper[4732]: I0131 09:05:14.555547 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:05:14 crc kubenswrapper[4732]: W0131 09:05:14.577835 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-5f0d076b93198311aad7a58d038ed6fc168d5667dad1b2ef08747de7218ac717 WatchSource:0}: Error finding container 5f0d076b93198311aad7a58d038ed6fc168d5667dad1b2ef08747de7218ac717: Status 404 returned error can't find the container with id 5f0d076b93198311aad7a58d038ed6fc168d5667dad1b2ef08747de7218ac717 Jan 31 09:05:15 crc kubenswrapper[4732]: I0131 09:05:15.218099 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"82cac6b55aafc33105786700ec55dd147b36929c869fcd28d33fcea44ba044c2"} Jan 31 09:05:15 crc kubenswrapper[4732]: I0131 09:05:15.218385 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"5f0d076b93198311aad7a58d038ed6fc168d5667dad1b2ef08747de7218ac717"} Jan 31 09:05:15 crc kubenswrapper[4732]: I0131 09:05:15.218655 4732 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d1dc686f-fa2d-4cd4-95d4-874df2dc3a1c" Jan 31 09:05:15 crc kubenswrapper[4732]: I0131 09:05:15.218696 4732 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d1dc686f-fa2d-4cd4-95d4-874df2dc3a1c" Jan 31 09:05:15 crc kubenswrapper[4732]: E0131 09:05:15.219101 4732 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.231:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:05:15 crc kubenswrapper[4732]: I0131 09:05:15.219485 4732 status_manager.go:851] "Failed to get status for pod" podUID="17e07aee-c4b1-4011-8442-c6dcfc4f415c" pod="openshift-marketplace/redhat-marketplace-krjtb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-krjtb\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:15 crc kubenswrapper[4732]: I0131 09:05:15.219893 4732 status_manager.go:851] "Failed to get status for pod" podUID="a39f958d-6d9b-4a4a-9ec9-cfb1f96b6f45" pod="openshift-marketplace/redhat-operators-4pkzq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-4pkzq\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:15 crc kubenswrapper[4732]: I0131 09:05:15.220133 4732 status_manager.go:851] "Failed to get status for pod" podUID="a7533049-a0d8-4488-bed6-2a9b28212061" pod="openshift-marketplace/community-operators-d7jxg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-d7jxg\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:15 crc kubenswrapper[4732]: I0131 09:05:15.220546 4732 status_manager.go:851] "Failed to get status for pod" podUID="e90ec082-a189-4726-8049-2151ddf77961" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.231:6443: connect: 
connection refused" Jan 31 09:05:15 crc kubenswrapper[4732]: I0131 09:05:15.220914 4732 status_manager.go:851] "Failed to get status for pod" podUID="9039963e-96e4-4b4d-abdd-79f0429da944" pod="openshift-marketplace/certified-operators-2h57x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-2h57x\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:15 crc kubenswrapper[4732]: E0131 09:05:15.306075 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:15Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:15Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:15Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T09:05:15Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:15 crc kubenswrapper[4732]: E0131 09:05:15.306396 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:15 crc kubenswrapper[4732]: E0131 09:05:15.306819 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:15 crc kubenswrapper[4732]: E0131 09:05:15.307153 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:15 crc kubenswrapper[4732]: E0131 09:05:15.307365 4732 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:15 crc kubenswrapper[4732]: E0131 09:05:15.307384 4732 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 31 09:05:17 crc kubenswrapper[4732]: I0131 09:05:17.229980 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Jan 31 09:05:17 crc kubenswrapper[4732]: I0131 09:05:17.230188 4732 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="bd0e75b86e540472dde92c0b2c29d3af05e052a4e6a01fd03cd16cce663bd5da" exitCode=1 Jan 
31 09:05:17 crc kubenswrapper[4732]: I0131 09:05:17.230253 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"bd0e75b86e540472dde92c0b2c29d3af05e052a4e6a01fd03cd16cce663bd5da"} Jan 31 09:05:17 crc kubenswrapper[4732]: I0131 09:05:17.230741 4732 scope.go:117] "RemoveContainer" containerID="bd0e75b86e540472dde92c0b2c29d3af05e052a4e6a01fd03cd16cce663bd5da" Jan 31 09:05:17 crc kubenswrapper[4732]: I0131 09:05:17.231106 4732 status_manager.go:851] "Failed to get status for pod" podUID="9039963e-96e4-4b4d-abdd-79f0429da944" pod="openshift-marketplace/certified-operators-2h57x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-2h57x\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:17 crc kubenswrapper[4732]: I0131 09:05:17.231458 4732 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:17 crc kubenswrapper[4732]: I0131 09:05:17.232142 4732 status_manager.go:851] "Failed to get status for pod" podUID="17e07aee-c4b1-4011-8442-c6dcfc4f415c" pod="openshift-marketplace/redhat-marketplace-krjtb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-krjtb\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:17 crc kubenswrapper[4732]: I0131 09:05:17.232277 4732 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="82cac6b55aafc33105786700ec55dd147b36929c869fcd28d33fcea44ba044c2" exitCode=0 Jan 31 09:05:17 crc kubenswrapper[4732]: I0131 09:05:17.232315 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"82cac6b55aafc33105786700ec55dd147b36929c869fcd28d33fcea44ba044c2"} Jan 31 09:05:17 crc kubenswrapper[4732]: I0131 09:05:17.232631 4732 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d1dc686f-fa2d-4cd4-95d4-874df2dc3a1c" Jan 31 09:05:17 crc kubenswrapper[4732]: I0131 09:05:17.232645 4732 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d1dc686f-fa2d-4cd4-95d4-874df2dc3a1c" Jan 31 09:05:17 crc kubenswrapper[4732]: I0131 09:05:17.232639 4732 status_manager.go:851] "Failed to get status for pod" podUID="a39f958d-6d9b-4a4a-9ec9-cfb1f96b6f45" pod="openshift-marketplace/redhat-operators-4pkzq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-4pkzq\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:17 crc kubenswrapper[4732]: E0131 09:05:17.233163 4732 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.231:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:05:17 crc kubenswrapper[4732]: I0131 09:05:17.233350 4732 status_manager.go:851] "Failed to get status 
for pod" podUID="a7533049-a0d8-4488-bed6-2a9b28212061" pod="openshift-marketplace/community-operators-d7jxg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-d7jxg\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:17 crc kubenswrapper[4732]: E0131 09:05:17.233376 4732 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.231:6443: connect: connection refused" interval="7s" Jan 31 09:05:17 crc kubenswrapper[4732]: I0131 09:05:17.233581 4732 status_manager.go:851] "Failed to get status for pod" podUID="e90ec082-a189-4726-8049-2151ddf77961" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:17 crc kubenswrapper[4732]: I0131 09:05:17.233976 4732 status_manager.go:851] "Failed to get status for pod" podUID="9039963e-96e4-4b4d-abdd-79f0429da944" pod="openshift-marketplace/certified-operators-2h57x" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-2h57x\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:17 crc kubenswrapper[4732]: I0131 09:05:17.234272 4732 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:17 crc kubenswrapper[4732]: I0131 09:05:17.234528 4732 status_manager.go:851] "Failed to get status for pod" podUID="17e07aee-c4b1-4011-8442-c6dcfc4f415c" pod="openshift-marketplace/redhat-marketplace-krjtb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-krjtb\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:17 crc kubenswrapper[4732]: I0131 09:05:17.234946 4732 status_manager.go:851] "Failed to get status for pod" podUID="a39f958d-6d9b-4a4a-9ec9-cfb1f96b6f45" pod="openshift-marketplace/redhat-operators-4pkzq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-4pkzq\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:17 crc kubenswrapper[4732]: I0131 09:05:17.235519 4732 status_manager.go:851] "Failed to get status for pod" podUID="a7533049-a0d8-4488-bed6-2a9b28212061" pod="openshift-marketplace/community-operators-d7jxg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-d7jxg\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:17 crc kubenswrapper[4732]: I0131 09:05:17.235792 4732 status_manager.go:851] "Failed to get status for pod" podUID="e90ec082-a189-4726-8049-2151ddf77961" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.231:6443: connect: connection refused" Jan 31 09:05:17 crc kubenswrapper[4732]: E0131 09:05:17.554515 4732 event.go:368] "Unable to write event (may retry after sleeping)" err="Post 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/events\": dial tcp 38.129.56.231:6443: connect: connection refused" event="&Event{ObjectMeta:{certified-operators-2h57x.188fc57928be0c23 openshift-marketplace 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-marketplace,Name:certified-operators-2h57x,UID:9039963e-96e4-4b4d-abdd-79f0429da944,APIVersion:v1,ResourceVersion:29558,FieldPath:spec.containers{registry-server},},Reason:Created,Message:Created container registry-server,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-31 09:05:02.745152547 +0000 UTC m=+241.051028751,LastTimestamp:2026-01-31 09:05:02.745152547 +0000 UTC m=+241.051028751,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 31 09:05:18 crc kubenswrapper[4732]: I0131 09:05:18.242389 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Jan 31 09:05:18 crc kubenswrapper[4732]: I0131 09:05:18.242754 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"7fdf2f67b19cba3eae0f65f0c9c411a80b2a5af4c8acb388e2f13c8592f6ffc0"} Jan 31 09:05:18 crc kubenswrapper[4732]: I0131 09:05:18.248045 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"bd21518371c5cfdf3ab91a35559abe0750cb66b8e231b7a6790545c4942f8f64"} Jan 31 09:05:18 crc kubenswrapper[4732]: I0131 09:05:18.248091 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"45b29ade00c36c5d91f59c065a1aedd808bbe04ca406dbafe4230c1ce34fe2e6"} Jan 31 09:05:18 crc kubenswrapper[4732]: I0131 09:05:18.248103 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"64cebec0077595628afd3c3e4f337629c538cd16f474cff3fb36cecda7b7bc19"} Jan 31 09:05:18 crc kubenswrapper[4732]: I0131 09:05:18.248113 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"7147c0044f73e00e244e10e3177d5c1c98793fab74c39c66a8efce3ace5241d0"} Jan 31 09:05:19 crc kubenswrapper[4732]: I0131 09:05:19.256551 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"3a045bd639964cacd4c1b25c1363df22d1f5be0da4646448a7cb3f70b8d25077"} Jan 31 09:05:19 crc kubenswrapper[4732]: I0131 09:05:19.256902 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:05:19 crc kubenswrapper[4732]: I0131 09:05:19.256808 4732 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d1dc686f-fa2d-4cd4-95d4-874df2dc3a1c" Jan 31 09:05:19 crc kubenswrapper[4732]: I0131 09:05:19.256926 4732 mirror_client.go:130] 
"Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d1dc686f-fa2d-4cd4-95d4-874df2dc3a1c" Jan 31 09:05:19 crc kubenswrapper[4732]: I0131 09:05:19.555815 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:05:19 crc kubenswrapper[4732]: I0131 09:05:19.555947 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:05:19 crc kubenswrapper[4732]: I0131 09:05:19.561940 4732 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Jan 31 09:05:19 crc kubenswrapper[4732]: [+]log ok Jan 31 09:05:19 crc kubenswrapper[4732]: [+]etcd ok Jan 31 09:05:19 crc kubenswrapper[4732]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Jan 31 09:05:19 crc kubenswrapper[4732]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Jan 31 09:05:19 crc kubenswrapper[4732]: [+]poststarthook/start-apiserver-admission-initializer ok Jan 31 09:05:19 crc kubenswrapper[4732]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Jan 31 09:05:19 crc kubenswrapper[4732]: [+]poststarthook/openshift.io-api-request-count-filter ok Jan 31 09:05:19 crc kubenswrapper[4732]: [+]poststarthook/openshift.io-startkubeinformers ok Jan 31 09:05:19 crc kubenswrapper[4732]: [+]poststarthook/generic-apiserver-start-informers ok Jan 31 09:05:19 crc kubenswrapper[4732]: [+]poststarthook/priority-and-fairness-config-consumer ok Jan 31 09:05:19 crc kubenswrapper[4732]: [+]poststarthook/priority-and-fairness-filter ok Jan 31 09:05:19 crc kubenswrapper[4732]: [+]poststarthook/storage-object-count-tracker-hook ok Jan 31 09:05:19 crc kubenswrapper[4732]: [+]poststarthook/start-apiextensions-informers ok Jan 31 09:05:19 crc kubenswrapper[4732]: [+]poststarthook/start-apiextensions-controllers ok Jan 31 09:05:19 crc kubenswrapper[4732]: [+]poststarthook/crd-informer-synced ok Jan 31 09:05:19 crc kubenswrapper[4732]: [+]poststarthook/start-system-namespaces-controller ok Jan 31 09:05:19 crc kubenswrapper[4732]: [+]poststarthook/start-cluster-authentication-info-controller ok Jan 31 09:05:19 crc kubenswrapper[4732]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Jan 31 09:05:19 crc kubenswrapper[4732]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Jan 31 09:05:19 crc kubenswrapper[4732]: [+]poststarthook/start-legacy-token-tracking-controller ok Jan 31 09:05:19 crc kubenswrapper[4732]: [+]poststarthook/start-service-ip-repair-controllers ok Jan 31 09:05:19 crc kubenswrapper[4732]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Jan 31 09:05:19 crc kubenswrapper[4732]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld Jan 31 09:05:19 crc kubenswrapper[4732]: [+]poststarthook/priority-and-fairness-config-producer ok Jan 31 09:05:19 crc kubenswrapper[4732]: [+]poststarthook/bootstrap-controller ok Jan 31 09:05:19 crc kubenswrapper[4732]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Jan 31 09:05:19 crc kubenswrapper[4732]: [+]poststarthook/start-kube-aggregator-informers ok Jan 31 09:05:19 crc kubenswrapper[4732]: [+]poststarthook/apiservice-status-local-available-controller ok Jan 31 09:05:19 crc kubenswrapper[4732]: [+]poststarthook/apiservice-status-remote-available-controller ok 
Jan 31 09:05:19 crc kubenswrapper[4732]: [+]poststarthook/apiservice-registration-controller ok Jan 31 09:05:19 crc kubenswrapper[4732]: [+]poststarthook/apiservice-wait-for-first-sync ok Jan 31 09:05:19 crc kubenswrapper[4732]: [+]poststarthook/apiservice-discovery-controller ok Jan 31 09:05:19 crc kubenswrapper[4732]: [+]poststarthook/kube-apiserver-autoregistration ok Jan 31 09:05:19 crc kubenswrapper[4732]: [+]autoregister-completion ok Jan 31 09:05:19 crc kubenswrapper[4732]: [+]poststarthook/apiservice-openapi-controller ok Jan 31 09:05:19 crc kubenswrapper[4732]: [+]poststarthook/apiservice-openapiv3-controller ok Jan 31 09:05:19 crc kubenswrapper[4732]: livez check failed Jan 31 09:05:19 crc kubenswrapper[4732]: I0131 09:05:19.562013 4732 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 31 09:05:20 crc kubenswrapper[4732]: I0131 09:05:20.137190 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-c8t6l" podUID="c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20" containerName="oauth-openshift" containerID="cri-o://9615c331134f8617d35092f6d2eb0dd4c5eead219bbc1b139774acc1bdb42b9b" gracePeriod=15 Jan 31 09:05:20 crc kubenswrapper[4732]: I0131 09:05:20.263743 4732 generic.go:334] "Generic (PLEG): container finished" podID="c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20" containerID="9615c331134f8617d35092f6d2eb0dd4c5eead219bbc1b139774acc1bdb42b9b" exitCode=0 Jan 31 09:05:20 crc kubenswrapper[4732]: I0131 09:05:20.263830 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-c8t6l" event={"ID":"c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20","Type":"ContainerDied","Data":"9615c331134f8617d35092f6d2eb0dd4c5eead219bbc1b139774acc1bdb42b9b"} Jan 31 09:05:20 crc kubenswrapper[4732]: I0131 09:05:20.492940 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-c8t6l" Jan 31 09:05:20 crc kubenswrapper[4732]: I0131 09:05:20.522270 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20-audit-policies\") pod \"c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20\" (UID: \"c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20\") " Jan 31 09:05:20 crc kubenswrapper[4732]: I0131 09:05:20.522333 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20-audit-dir\") pod \"c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20\" (UID: \"c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20\") " Jan 31 09:05:20 crc kubenswrapper[4732]: I0131 09:05:20.522392 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20-v4-0-config-user-template-login\") pod \"c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20\" (UID: \"c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20\") " Jan 31 09:05:20 crc kubenswrapper[4732]: I0131 09:05:20.522416 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20-v4-0-config-system-session\") pod \"c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20\" (UID: \"c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20\") " Jan 31 09:05:20 crc kubenswrapper[4732]: I0131 09:05:20.522446 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9lsmw\" (UniqueName: \"kubernetes.io/projected/c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20-kube-api-access-9lsmw\") pod \"c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20\" (UID: \"c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20\") " Jan 31 09:05:20 crc kubenswrapper[4732]: I0131 09:05:20.522460 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20-v4-0-config-user-idp-0-file-data\") pod \"c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20\" (UID: \"c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20\") " Jan 31 09:05:20 crc kubenswrapper[4732]: I0131 09:05:20.522478 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20-v4-0-config-system-ocp-branding-template\") pod \"c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20\" (UID: \"c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20\") " Jan 31 09:05:20 crc kubenswrapper[4732]: I0131 09:05:20.522515 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20-v4-0-config-system-router-certs\") pod \"c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20\" (UID: \"c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20\") " Jan 31 09:05:20 crc kubenswrapper[4732]: I0131 09:05:20.522538 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20-v4-0-config-system-serving-cert\") pod \"c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20\" (UID: \"c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20\") " Jan 31 09:05:20 crc kubenswrapper[4732]: I0131 09:05:20.522553 4732 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20-v4-0-config-system-service-ca\") pod \"c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20\" (UID: \"c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20\") " Jan 31 09:05:20 crc kubenswrapper[4732]: I0131 09:05:20.522574 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20-v4-0-config-system-cliconfig\") pod \"c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20\" (UID: \"c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20\") " Jan 31 09:05:20 crc kubenswrapper[4732]: I0131 09:05:20.522594 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20-v4-0-config-user-template-provider-selection\") pod \"c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20\" (UID: \"c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20\") " Jan 31 09:05:20 crc kubenswrapper[4732]: I0131 09:05:20.522623 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20-v4-0-config-user-template-error\") pod \"c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20\" (UID: \"c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20\") " Jan 31 09:05:20 crc kubenswrapper[4732]: I0131 09:05:20.522648 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20-v4-0-config-system-trusted-ca-bundle\") pod \"c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20\" (UID: \"c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20\") " Jan 31 09:05:20 crc kubenswrapper[4732]: I0131 09:05:20.523860 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20" (UID: "c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:05:20 crc kubenswrapper[4732]: I0131 09:05:20.523983 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20" (UID: "c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:05:20 crc kubenswrapper[4732]: I0131 09:05:20.524953 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20" (UID: "c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:05:20 crc kubenswrapper[4732]: I0131 09:05:20.526062 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20" (UID: "c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:05:20 crc kubenswrapper[4732]: I0131 09:05:20.526229 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20" (UID: "c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:05:20 crc kubenswrapper[4732]: I0131 09:05:20.529940 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20" (UID: "c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:05:20 crc kubenswrapper[4732]: I0131 09:05:20.530269 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20" (UID: "c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:05:20 crc kubenswrapper[4732]: I0131 09:05:20.530288 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20-kube-api-access-9lsmw" (OuterVolumeSpecName: "kube-api-access-9lsmw") pod "c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20" (UID: "c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20"). InnerVolumeSpecName "kube-api-access-9lsmw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:05:20 crc kubenswrapper[4732]: I0131 09:05:20.530446 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20" (UID: "c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:05:20 crc kubenswrapper[4732]: I0131 09:05:20.532740 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20" (UID: "c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:05:20 crc kubenswrapper[4732]: I0131 09:05:20.533272 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20" (UID: "c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:05:20 crc kubenswrapper[4732]: I0131 09:05:20.536887 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20" (UID: "c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:05:20 crc kubenswrapper[4732]: I0131 09:05:20.537222 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20" (UID: "c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:05:20 crc kubenswrapper[4732]: I0131 09:05:20.537620 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20" (UID: "c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:05:20 crc kubenswrapper[4732]: I0131 09:05:20.623937 4732 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:20 crc kubenswrapper[4732]: I0131 09:05:20.623994 4732 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:20 crc kubenswrapper[4732]: I0131 09:05:20.624011 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9lsmw\" (UniqueName: \"kubernetes.io/projected/c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20-kube-api-access-9lsmw\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:20 crc kubenswrapper[4732]: I0131 09:05:20.624022 4732 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:20 crc kubenswrapper[4732]: I0131 09:05:20.624037 4732 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:20 crc kubenswrapper[4732]: I0131 09:05:20.624049 4732 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:20 crc kubenswrapper[4732]: I0131 09:05:20.624084 4732 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:20 crc kubenswrapper[4732]: I0131 09:05:20.624097 4732 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:20 crc kubenswrapper[4732]: I0131 09:05:20.624108 4732 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:20 crc kubenswrapper[4732]: I0131 09:05:20.624120 4732 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:20 crc kubenswrapper[4732]: I0131 09:05:20.624131 4732 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:20 crc kubenswrapper[4732]: I0131 09:05:20.624161 4732 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:20 crc kubenswrapper[4732]: I0131 09:05:20.624173 4732 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:20 crc kubenswrapper[4732]: I0131 09:05:20.624184 4732 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20-audit-dir\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:21 crc kubenswrapper[4732]: I0131 09:05:21.273868 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-c8t6l" event={"ID":"c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20","Type":"ContainerDied","Data":"bf0aacb740607afdcd33e43432dcaec43c8aa3d7707aec7cab5cbf845309020a"} Jan 31 09:05:21 crc kubenswrapper[4732]: I0131 09:05:21.274288 4732 scope.go:117] "RemoveContainer" containerID="9615c331134f8617d35092f6d2eb0dd4c5eead219bbc1b139774acc1bdb42b9b" Jan 31 09:05:21 crc kubenswrapper[4732]: I0131 09:05:21.273995 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-c8t6l" Jan 31 09:05:22 crc kubenswrapper[4732]: I0131 09:05:22.151458 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 31 09:05:23 crc kubenswrapper[4732]: I0131 09:05:23.380403 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 31 09:05:23 crc kubenswrapper[4732]: I0131 09:05:23.386884 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 31 09:05:24 crc kubenswrapper[4732]: I0131 09:05:24.276068 4732 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:05:24 crc kubenswrapper[4732]: I0131 09:05:24.299810 4732 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d1dc686f-fa2d-4cd4-95d4-874df2dc3a1c" Jan 31 09:05:24 crc kubenswrapper[4732]: I0131 09:05:24.299848 4732 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d1dc686f-fa2d-4cd4-95d4-874df2dc3a1c" Jan 31 09:05:24 crc kubenswrapper[4732]: I0131 09:05:24.561324 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:05:24 crc kubenswrapper[4732]: I0131 09:05:24.563429 4732 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="7ba3697b-fd3c-4270-bb59-3408ba7ace54" Jan 31 09:05:25 crc kubenswrapper[4732]: I0131 09:05:25.305589 4732 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d1dc686f-fa2d-4cd4-95d4-874df2dc3a1c" Jan 31 09:05:25 crc kubenswrapper[4732]: I0131 09:05:25.305837 4732 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d1dc686f-fa2d-4cd4-95d4-874df2dc3a1c" Jan 31 09:05:25 crc kubenswrapper[4732]: I0131 09:05:25.310034 4732 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:05:26 crc kubenswrapper[4732]: I0131 09:05:26.312464 4732 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d1dc686f-fa2d-4cd4-95d4-874df2dc3a1c" Jan 31 09:05:26 crc kubenswrapper[4732]: I0131 09:05:26.312507 4732 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d1dc686f-fa2d-4cd4-95d4-874df2dc3a1c" Jan 31 09:05:32 crc kubenswrapper[4732]: I0131 09:05:32.155547 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 31 09:05:32 crc kubenswrapper[4732]: I0131 09:05:32.569122 4732 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="7ba3697b-fd3c-4270-bb59-3408ba7ace54" Jan 31 09:05:34 crc kubenswrapper[4732]: I0131 09:05:34.271072 4732 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Jan 31 09:05:34 crc kubenswrapper[4732]: I0131 09:05:34.364311 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Jan 31 09:05:34 crc kubenswrapper[4732]: I0131 09:05:34.540241 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Jan 31 09:05:34 crc kubenswrapper[4732]: I0131 09:05:34.947386 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 31 09:05:34 crc kubenswrapper[4732]: I0131 09:05:34.957774 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Jan 31 09:05:35 crc kubenswrapper[4732]: I0131 09:05:35.059528 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Jan 31 09:05:35 crc kubenswrapper[4732]: I0131 09:05:35.315272 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Jan 31 09:05:35 crc kubenswrapper[4732]: I0131 09:05:35.367175 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Jan 31 09:05:35 crc kubenswrapper[4732]: I0131 09:05:35.445688 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Jan 31 09:05:35 crc kubenswrapper[4732]: I0131 09:05:35.479059 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Jan 31 09:05:35 crc kubenswrapper[4732]: I0131 09:05:35.515374 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Jan 31 09:05:35 crc kubenswrapper[4732]: I0131 09:05:35.533291 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 31 09:05:35 crc kubenswrapper[4732]: I0131 09:05:35.702421 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Jan 31 09:05:35 crc kubenswrapper[4732]: I0131 09:05:35.766380 4732 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Jan 31 09:05:35 crc kubenswrapper[4732]: I0131 09:05:35.861265 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 31 09:05:36 crc kubenswrapper[4732]: I0131 09:05:36.017189 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Jan 31 09:05:36 crc kubenswrapper[4732]: I0131 09:05:36.089439 4732 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Jan 31 09:05:36 crc kubenswrapper[4732]: I0131 09:05:36.090448 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4pkzq" podStartSLOduration=34.496990382 podStartE2EDuration="37.090417845s" podCreationTimestamp="2026-01-31 09:04:59 +0000 UTC" firstStartedPulling="2026-01-31 09:05:01.070175455 +0000 UTC m=+239.376051659" lastFinishedPulling="2026-01-31 09:05:03.663602918 +0000 UTC m=+241.969479122" observedRunningTime="2026-01-31 09:05:24.082444832 +0000 UTC m=+262.388321036" watchObservedRunningTime="2026-01-31 09:05:36.090417845 +0000 UTC m=+274.396294099" Jan 31 09:05:36 crc kubenswrapper[4732]: I0131 09:05:36.097813 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2h57x" podStartSLOduration=35.616753428 podStartE2EDuration="38.097786709s" podCreationTimestamp="2026-01-31 09:04:58 +0000 UTC" firstStartedPulling="2026-01-31 09:05:00.05140363 +0000 UTC m=+238.357279844" lastFinishedPulling="2026-01-31 09:05:02.532436921 +0000 UTC m=+240.838313125" observedRunningTime="2026-01-31 09:05:24.125639965 +0000 UTC m=+262.431516169" watchObservedRunningTime="2026-01-31 09:05:36.097786709 +0000 UTC m=+274.403663013" Jan 31 09:05:36 crc kubenswrapper[4732]: I0131 09:05:36.098710 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-d7jxg" podStartSLOduration=33.274977029 podStartE2EDuration="36.098696449s" podCreationTimestamp="2026-01-31 09:05:00 +0000 UTC" firstStartedPulling="2026-01-31 09:05:02.089645295 +0000 UTC m=+240.395521499" lastFinishedPulling="2026-01-31 09:05:04.913364715 +0000 UTC m=+243.219240919" observedRunningTime="2026-01-31 09:05:24.095880977 +0000 UTC m=+262.401757181" watchObservedRunningTime="2026-01-31 09:05:36.098696449 +0000 UTC m=+274.404572703" Jan 31 09:05:36 crc kubenswrapper[4732]: I0131 09:05:36.100156 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-558db77b4-c8t6l"] Jan 31 09:05:36 crc kubenswrapper[4732]: I0131 09:05:36.100259 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 31 09:05:36 crc kubenswrapper[4732]: I0131 09:05:36.105631 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 09:05:36 crc kubenswrapper[4732]: I0131 09:05:36.121897 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=12.121882888 podStartE2EDuration="12.121882888s" podCreationTimestamp="2026-01-31 09:05:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-01-31 09:05:36.119174179 +0000 UTC m=+274.425050393" watchObservedRunningTime="2026-01-31 09:05:36.121882888 +0000 UTC m=+274.427759092" Jan 31 09:05:36 crc kubenswrapper[4732]: I0131 09:05:36.130235 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Jan 31 09:05:36 crc kubenswrapper[4732]: I0131 09:05:36.189733 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Jan 31 09:05:36 crc kubenswrapper[4732]: I0131 09:05:36.209310 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Jan 31 09:05:36 crc kubenswrapper[4732]: I0131 09:05:36.290107 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Jan 31 09:05:36 crc kubenswrapper[4732]: I0131 09:05:36.387123 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Jan 31 09:05:36 crc kubenswrapper[4732]: I0131 09:05:36.465339 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Jan 31 09:05:36 crc kubenswrapper[4732]: I0131 09:05:36.551306 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20" path="/var/lib/kubelet/pods/c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20/volumes" Jan 31 09:05:36 crc kubenswrapper[4732]: I0131 09:05:36.686406 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Jan 31 09:05:36 crc kubenswrapper[4732]: I0131 09:05:36.710286 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 31 09:05:36 crc kubenswrapper[4732]: I0131 09:05:36.798771 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Jan 31 09:05:36 crc kubenswrapper[4732]: I0131 09:05:36.966362 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Jan 31 09:05:37 crc kubenswrapper[4732]: I0131 09:05:37.235892 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Jan 31 09:05:37 crc kubenswrapper[4732]: I0131 09:05:37.266356 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Jan 31 09:05:37 crc kubenswrapper[4732]: I0131 09:05:37.295956 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Jan 31 09:05:37 crc kubenswrapper[4732]: I0131 09:05:37.328384 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Jan 31 09:05:37 crc kubenswrapper[4732]: I0131 09:05:37.360064 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 31 09:05:37 crc kubenswrapper[4732]: I0131 09:05:37.388867 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Jan 31 09:05:37 crc kubenswrapper[4732]: I0131 09:05:37.408720 4732 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Jan 31 09:05:37 crc kubenswrapper[4732]: I0131 09:05:37.646793 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Jan 31 09:05:37 crc kubenswrapper[4732]: I0131 09:05:37.766431 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Jan 31 09:05:37 crc kubenswrapper[4732]: I0131 09:05:37.823031 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Jan 31 09:05:38 crc kubenswrapper[4732]: I0131 09:05:38.068069 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Jan 31 09:05:38 crc kubenswrapper[4732]: I0131 09:05:38.081260 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Jan 31 09:05:38 crc kubenswrapper[4732]: I0131 09:05:38.212558 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 31 09:05:38 crc kubenswrapper[4732]: I0131 09:05:38.306656 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Jan 31 09:05:38 crc kubenswrapper[4732]: I0131 09:05:38.383578 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Jan 31 09:05:38 crc kubenswrapper[4732]: I0131 09:05:38.432129 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Jan 31 09:05:38 crc kubenswrapper[4732]: I0131 09:05:38.455992 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 31 09:05:38 crc kubenswrapper[4732]: I0131 09:05:38.466023 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Jan 31 09:05:38 crc kubenswrapper[4732]: I0131 09:05:38.587454 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Jan 31 09:05:38 crc kubenswrapper[4732]: I0131 09:05:38.767308 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Jan 31 09:05:38 crc kubenswrapper[4732]: I0131 09:05:38.792427 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Jan 31 09:05:38 crc kubenswrapper[4732]: I0131 09:05:38.878011 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Jan 31 09:05:38 crc kubenswrapper[4732]: I0131 09:05:38.914377 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Jan 31 09:05:38 crc kubenswrapper[4732]: I0131 09:05:38.957203 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Jan 31 09:05:39 crc kubenswrapper[4732]: I0131 09:05:39.001126 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Jan 31 09:05:39 crc kubenswrapper[4732]: I0131 
09:05:39.086176 4732 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Jan 31 09:05:39 crc kubenswrapper[4732]: I0131 09:05:39.097438 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 31 09:05:39 crc kubenswrapper[4732]: I0131 09:05:39.110642 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Jan 31 09:05:39 crc kubenswrapper[4732]: I0131 09:05:39.380087 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Jan 31 09:05:39 crc kubenswrapper[4732]: I0131 09:05:39.454273 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Jan 31 09:05:39 crc kubenswrapper[4732]: I0131 09:05:39.488112 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 31 09:05:39 crc kubenswrapper[4732]: I0131 09:05:39.570560 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Jan 31 09:05:39 crc kubenswrapper[4732]: I0131 09:05:39.686724 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 31 09:05:39 crc kubenswrapper[4732]: I0131 09:05:39.853196 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Jan 31 09:05:39 crc kubenswrapper[4732]: I0131 09:05:39.855416 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Jan 31 09:05:39 crc kubenswrapper[4732]: I0131 09:05:39.875350 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Jan 31 09:05:40 crc kubenswrapper[4732]: I0131 09:05:40.084790 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Jan 31 09:05:40 crc kubenswrapper[4732]: I0131 09:05:40.208176 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Jan 31 09:05:40 crc kubenswrapper[4732]: I0131 09:05:40.253217 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Jan 31 09:05:40 crc kubenswrapper[4732]: I0131 09:05:40.413622 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Jan 31 09:05:40 crc kubenswrapper[4732]: I0131 09:05:40.424078 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Jan 31 09:05:40 crc kubenswrapper[4732]: I0131 09:05:40.442109 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Jan 31 09:05:40 crc kubenswrapper[4732]: I0131 09:05:40.542621 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Jan 31 09:05:40 crc kubenswrapper[4732]: I0131 09:05:40.580608 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Jan 31 09:05:40 crc kubenswrapper[4732]: I0131 
09:05:40.668483 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 31 09:05:40 crc kubenswrapper[4732]: I0131 09:05:40.685235 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Jan 31 09:05:40 crc kubenswrapper[4732]: I0131 09:05:40.758347 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Jan 31 09:05:40 crc kubenswrapper[4732]: I0131 09:05:40.779455 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Jan 31 09:05:40 crc kubenswrapper[4732]: I0131 09:05:40.799415 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Jan 31 09:05:40 crc kubenswrapper[4732]: I0131 09:05:40.848042 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Jan 31 09:05:40 crc kubenswrapper[4732]: I0131 09:05:40.881451 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Jan 31 09:05:40 crc kubenswrapper[4732]: I0131 09:05:40.909482 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Jan 31 09:05:40 crc kubenswrapper[4732]: I0131 09:05:40.936333 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Jan 31 09:05:41 crc kubenswrapper[4732]: I0131 09:05:41.009329 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 31 09:05:41 crc kubenswrapper[4732]: I0131 09:05:41.016738 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Jan 31 09:05:41 crc kubenswrapper[4732]: I0131 09:05:41.037353 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Jan 31 09:05:41 crc kubenswrapper[4732]: I0131 09:05:41.103767 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Jan 31 09:05:41 crc kubenswrapper[4732]: I0131 09:05:41.178331 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Jan 31 09:05:41 crc kubenswrapper[4732]: I0131 09:05:41.212084 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Jan 31 09:05:41 crc kubenswrapper[4732]: I0131 09:05:41.220956 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Jan 31 09:05:41 crc kubenswrapper[4732]: I0131 09:05:41.344703 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Jan 31 09:05:41 crc kubenswrapper[4732]: I0131 09:05:41.380108 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Jan 31 09:05:41 crc kubenswrapper[4732]: I0131 09:05:41.381055 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Jan 31 09:05:41 crc kubenswrapper[4732]: I0131 09:05:41.387290 
4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Jan 31 09:05:41 crc kubenswrapper[4732]: I0131 09:05:41.492946 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Jan 31 09:05:41 crc kubenswrapper[4732]: I0131 09:05:41.594168 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Jan 31 09:05:41 crc kubenswrapper[4732]: I0131 09:05:41.626906 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Jan 31 09:05:41 crc kubenswrapper[4732]: I0131 09:05:41.631798 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Jan 31 09:05:41 crc kubenswrapper[4732]: I0131 09:05:41.713721 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Jan 31 09:05:41 crc kubenswrapper[4732]: I0131 09:05:41.725278 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Jan 31 09:05:41 crc kubenswrapper[4732]: I0131 09:05:41.849310 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Jan 31 09:05:41 crc kubenswrapper[4732]: I0131 09:05:41.932171 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Jan 31 09:05:41 crc kubenswrapper[4732]: I0131 09:05:41.996587 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Jan 31 09:05:42 crc kubenswrapper[4732]: I0131 09:05:42.030632 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Jan 31 09:05:42 crc kubenswrapper[4732]: I0131 09:05:42.063528 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Jan 31 09:05:42 crc kubenswrapper[4732]: I0131 09:05:42.068679 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Jan 31 09:05:42 crc kubenswrapper[4732]: I0131 09:05:42.131721 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Jan 31 09:05:42 crc kubenswrapper[4732]: I0131 09:05:42.145117 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Jan 31 09:05:42 crc kubenswrapper[4732]: I0131 09:05:42.185783 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Jan 31 09:05:42 crc kubenswrapper[4732]: I0131 09:05:42.272986 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Jan 31 09:05:42 crc kubenswrapper[4732]: I0131 09:05:42.310625 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Jan 31 09:05:42 crc kubenswrapper[4732]: I0131 09:05:42.326651 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Jan 31 09:05:42 crc kubenswrapper[4732]: I0131 
09:05:42.338262 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Jan 31 09:05:42 crc kubenswrapper[4732]: I0131 09:05:42.421954 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Jan 31 09:05:42 crc kubenswrapper[4732]: I0131 09:05:42.515861 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Jan 31 09:05:42 crc kubenswrapper[4732]: I0131 09:05:42.608529 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Jan 31 09:05:42 crc kubenswrapper[4732]: I0131 09:05:42.684784 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Jan 31 09:05:42 crc kubenswrapper[4732]: I0131 09:05:42.687375 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Jan 31 09:05:42 crc kubenswrapper[4732]: I0131 09:05:42.698613 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Jan 31 09:05:42 crc kubenswrapper[4732]: I0131 09:05:42.892313 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Jan 31 09:05:42 crc kubenswrapper[4732]: I0131 09:05:42.907100 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Jan 31 09:05:42 crc kubenswrapper[4732]: I0131 09:05:42.935784 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Jan 31 09:05:42 crc kubenswrapper[4732]: I0131 09:05:42.954253 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Jan 31 09:05:42 crc kubenswrapper[4732]: I0131 09:05:42.984882 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Jan 31 09:05:42 crc kubenswrapper[4732]: I0131 09:05:42.992965 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Jan 31 09:05:43 crc kubenswrapper[4732]: I0131 09:05:43.038871 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Jan 31 09:05:43 crc kubenswrapper[4732]: I0131 09:05:43.087571 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Jan 31 09:05:43 crc kubenswrapper[4732]: I0131 09:05:43.126075 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Jan 31 09:05:43 crc kubenswrapper[4732]: I0131 09:05:43.145978 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Jan 31 09:05:43 crc kubenswrapper[4732]: I0131 09:05:43.167823 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Jan 31 09:05:43 crc kubenswrapper[4732]: I0131 09:05:43.175320 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Jan 31 
09:05:43 crc kubenswrapper[4732]: I0131 09:05:43.267773 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Jan 31 09:05:43 crc kubenswrapper[4732]: I0131 09:05:43.352888 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Jan 31 09:05:43 crc kubenswrapper[4732]: I0131 09:05:43.362161 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Jan 31 09:05:43 crc kubenswrapper[4732]: I0131 09:05:43.416372 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Jan 31 09:05:43 crc kubenswrapper[4732]: I0131 09:05:43.460203 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Jan 31 09:05:43 crc kubenswrapper[4732]: I0131 09:05:43.583524 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 31 09:05:43 crc kubenswrapper[4732]: I0131 09:05:43.683442 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Jan 31 09:05:43 crc kubenswrapper[4732]: I0131 09:05:43.768811 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Jan 31 09:05:43 crc kubenswrapper[4732]: I0131 09:05:43.872039 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Jan 31 09:05:44 crc kubenswrapper[4732]: I0131 09:05:44.003682 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Jan 31 09:05:44 crc kubenswrapper[4732]: I0131 09:05:44.024967 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Jan 31 09:05:44 crc kubenswrapper[4732]: I0131 09:05:44.050878 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Jan 31 09:05:44 crc kubenswrapper[4732]: I0131 09:05:44.076160 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-68974c876c-56m92"] Jan 31 09:05:44 crc kubenswrapper[4732]: E0131 09:05:44.076353 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e90ec082-a189-4726-8049-2151ddf77961" containerName="installer" Jan 31 09:05:44 crc kubenswrapper[4732]: I0131 09:05:44.076363 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="e90ec082-a189-4726-8049-2151ddf77961" containerName="installer" Jan 31 09:05:44 crc kubenswrapper[4732]: E0131 09:05:44.076387 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20" containerName="oauth-openshift" Jan 31 09:05:44 crc kubenswrapper[4732]: I0131 09:05:44.076393 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20" containerName="oauth-openshift" Jan 31 09:05:44 crc kubenswrapper[4732]: I0131 09:05:44.076484 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="e90ec082-a189-4726-8049-2151ddf77961" containerName="installer" Jan 31 09:05:44 crc kubenswrapper[4732]: I0131 09:05:44.076494 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6c9fc07-9bb2-41d5-a08e-2ea5a441cd20" 
containerName="oauth-openshift" Jan 31 09:05:44 crc kubenswrapper[4732]: I0131 09:05:44.076863 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-68974c876c-56m92" Jan 31 09:05:44 crc kubenswrapper[4732]: I0131 09:05:44.078598 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Jan 31 09:05:44 crc kubenswrapper[4732]: I0131 09:05:44.079136 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Jan 31 09:05:44 crc kubenswrapper[4732]: I0131 09:05:44.079332 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Jan 31 09:05:44 crc kubenswrapper[4732]: I0131 09:05:44.081146 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Jan 31 09:05:44 crc kubenswrapper[4732]: I0131 09:05:44.083883 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Jan 31 09:05:44 crc kubenswrapper[4732]: I0131 09:05:44.084087 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Jan 31 09:05:44 crc kubenswrapper[4732]: I0131 09:05:44.084104 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Jan 31 09:05:44 crc kubenswrapper[4732]: I0131 09:05:44.085127 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Jan 31 09:05:44 crc kubenswrapper[4732]: I0131 09:05:44.085187 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Jan 31 09:05:44 crc kubenswrapper[4732]: I0131 09:05:44.085605 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Jan 31 09:05:44 crc kubenswrapper[4732]: I0131 09:05:44.085679 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Jan 31 09:05:44 crc kubenswrapper[4732]: I0131 09:05:44.085695 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Jan 31 09:05:44 crc kubenswrapper[4732]: I0131 09:05:44.086089 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Jan 31 09:05:44 crc kubenswrapper[4732]: I0131 09:05:44.093588 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Jan 31 09:05:44 crc kubenswrapper[4732]: I0131 09:05:44.106642 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Jan 31 09:05:44 crc kubenswrapper[4732]: I0131 09:05:44.109866 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Jan 31 09:05:44 crc kubenswrapper[4732]: I0131 09:05:44.113006 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/7f741d28-9c76-4a05-8771-f8f448ee9a2a-v4-0-config-user-template-error\") pod \"oauth-openshift-68974c876c-56m92\" (UID: \"7f741d28-9c76-4a05-8771-f8f448ee9a2a\") " pod="openshift-authentication/oauth-openshift-68974c876c-56m92" Jan 31 09:05:44 crc kubenswrapper[4732]: I0131 09:05:44.113187 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/7f741d28-9c76-4a05-8771-f8f448ee9a2a-v4-0-config-user-template-login\") pod \"oauth-openshift-68974c876c-56m92\" (UID: \"7f741d28-9c76-4a05-8771-f8f448ee9a2a\") " pod="openshift-authentication/oauth-openshift-68974c876c-56m92" Jan 31 09:05:44 crc kubenswrapper[4732]: I0131 09:05:44.113291 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7f741d28-9c76-4a05-8771-f8f448ee9a2a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-68974c876c-56m92\" (UID: \"7f741d28-9c76-4a05-8771-f8f448ee9a2a\") " pod="openshift-authentication/oauth-openshift-68974c876c-56m92" Jan 31 09:05:44 crc kubenswrapper[4732]: I0131 09:05:44.113404 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7f741d28-9c76-4a05-8771-f8f448ee9a2a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-68974c876c-56m92\" (UID: \"7f741d28-9c76-4a05-8771-f8f448ee9a2a\") " pod="openshift-authentication/oauth-openshift-68974c876c-56m92" Jan 31 09:05:44 crc kubenswrapper[4732]: I0131 09:05:44.113491 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkfzv\" (UniqueName: \"kubernetes.io/projected/7f741d28-9c76-4a05-8771-f8f448ee9a2a-kube-api-access-bkfzv\") pod \"oauth-openshift-68974c876c-56m92\" (UID: \"7f741d28-9c76-4a05-8771-f8f448ee9a2a\") " pod="openshift-authentication/oauth-openshift-68974c876c-56m92" Jan 31 09:05:44 crc kubenswrapper[4732]: I0131 09:05:44.113603 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7f741d28-9c76-4a05-8771-f8f448ee9a2a-audit-policies\") pod \"oauth-openshift-68974c876c-56m92\" (UID: \"7f741d28-9c76-4a05-8771-f8f448ee9a2a\") " pod="openshift-authentication/oauth-openshift-68974c876c-56m92" Jan 31 09:05:44 crc kubenswrapper[4732]: I0131 09:05:44.113748 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7f741d28-9c76-4a05-8771-f8f448ee9a2a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-68974c876c-56m92\" (UID: \"7f741d28-9c76-4a05-8771-f8f448ee9a2a\") " pod="openshift-authentication/oauth-openshift-68974c876c-56m92" Jan 31 09:05:44 crc kubenswrapper[4732]: I0131 09:05:44.113846 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7f741d28-9c76-4a05-8771-f8f448ee9a2a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-68974c876c-56m92\" (UID: \"7f741d28-9c76-4a05-8771-f8f448ee9a2a\") " pod="openshift-authentication/oauth-openshift-68974c876c-56m92" Jan 31 09:05:44 crc kubenswrapper[4732]: I0131 09:05:44.113974 
4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/7f741d28-9c76-4a05-8771-f8f448ee9a2a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-68974c876c-56m92\" (UID: \"7f741d28-9c76-4a05-8771-f8f448ee9a2a\") " pod="openshift-authentication/oauth-openshift-68974c876c-56m92" Jan 31 09:05:44 crc kubenswrapper[4732]: I0131 09:05:44.114076 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7f741d28-9c76-4a05-8771-f8f448ee9a2a-audit-dir\") pod \"oauth-openshift-68974c876c-56m92\" (UID: \"7f741d28-9c76-4a05-8771-f8f448ee9a2a\") " pod="openshift-authentication/oauth-openshift-68974c876c-56m92" Jan 31 09:05:44 crc kubenswrapper[4732]: I0131 09:05:44.114164 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7f741d28-9c76-4a05-8771-f8f448ee9a2a-v4-0-config-system-router-certs\") pod \"oauth-openshift-68974c876c-56m92\" (UID: \"7f741d28-9c76-4a05-8771-f8f448ee9a2a\") " pod="openshift-authentication/oauth-openshift-68974c876c-56m92" Jan 31 09:05:44 crc kubenswrapper[4732]: I0131 09:05:44.114254 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7f741d28-9c76-4a05-8771-f8f448ee9a2a-v4-0-config-system-session\") pod \"oauth-openshift-68974c876c-56m92\" (UID: \"7f741d28-9c76-4a05-8771-f8f448ee9a2a\") " pod="openshift-authentication/oauth-openshift-68974c876c-56m92" Jan 31 09:05:44 crc kubenswrapper[4732]: I0131 09:05:44.114355 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7f741d28-9c76-4a05-8771-f8f448ee9a2a-v4-0-config-system-service-ca\") pod \"oauth-openshift-68974c876c-56m92\" (UID: \"7f741d28-9c76-4a05-8771-f8f448ee9a2a\") " pod="openshift-authentication/oauth-openshift-68974c876c-56m92" Jan 31 09:05:44 crc kubenswrapper[4732]: I0131 09:05:44.114450 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7f741d28-9c76-4a05-8771-f8f448ee9a2a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-68974c876c-56m92\" (UID: \"7f741d28-9c76-4a05-8771-f8f448ee9a2a\") " pod="openshift-authentication/oauth-openshift-68974c876c-56m92" Jan 31 09:05:44 crc kubenswrapper[4732]: I0131 09:05:44.160538 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Jan 31 09:05:44 crc kubenswrapper[4732]: I0131 09:05:44.215556 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7f741d28-9c76-4a05-8771-f8f448ee9a2a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-68974c876c-56m92\" (UID: \"7f741d28-9c76-4a05-8771-f8f448ee9a2a\") " pod="openshift-authentication/oauth-openshift-68974c876c-56m92" Jan 31 09:05:44 crc kubenswrapper[4732]: I0131 09:05:44.216057 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/7f741d28-9c76-4a05-8771-f8f448ee9a2a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-68974c876c-56m92\" (UID: \"7f741d28-9c76-4a05-8771-f8f448ee9a2a\") " pod="openshift-authentication/oauth-openshift-68974c876c-56m92" Jan 31 09:05:44 crc kubenswrapper[4732]: I0131 09:05:44.216176 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkfzv\" (UniqueName: \"kubernetes.io/projected/7f741d28-9c76-4a05-8771-f8f448ee9a2a-kube-api-access-bkfzv\") pod \"oauth-openshift-68974c876c-56m92\" (UID: \"7f741d28-9c76-4a05-8771-f8f448ee9a2a\") " pod="openshift-authentication/oauth-openshift-68974c876c-56m92" Jan 31 09:05:44 crc kubenswrapper[4732]: I0131 09:05:44.216553 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7f741d28-9c76-4a05-8771-f8f448ee9a2a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-68974c876c-56m92\" (UID: \"7f741d28-9c76-4a05-8771-f8f448ee9a2a\") " pod="openshift-authentication/oauth-openshift-68974c876c-56m92" Jan 31 09:05:44 crc kubenswrapper[4732]: I0131 09:05:44.217338 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7f741d28-9c76-4a05-8771-f8f448ee9a2a-audit-policies\") pod \"oauth-openshift-68974c876c-56m92\" (UID: \"7f741d28-9c76-4a05-8771-f8f448ee9a2a\") " pod="openshift-authentication/oauth-openshift-68974c876c-56m92" Jan 31 09:05:44 crc kubenswrapper[4732]: I0131 09:05:44.217461 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7f741d28-9c76-4a05-8771-f8f448ee9a2a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-68974c876c-56m92\" (UID: \"7f741d28-9c76-4a05-8771-f8f448ee9a2a\") " pod="openshift-authentication/oauth-openshift-68974c876c-56m92" Jan 31 09:05:44 crc kubenswrapper[4732]: I0131 09:05:44.217583 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/7f741d28-9c76-4a05-8771-f8f448ee9a2a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-68974c876c-56m92\" (UID: \"7f741d28-9c76-4a05-8771-f8f448ee9a2a\") " pod="openshift-authentication/oauth-openshift-68974c876c-56m92" Jan 31 09:05:44 crc kubenswrapper[4732]: I0131 09:05:44.217722 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7f741d28-9c76-4a05-8771-f8f448ee9a2a-audit-dir\") pod \"oauth-openshift-68974c876c-56m92\" (UID: \"7f741d28-9c76-4a05-8771-f8f448ee9a2a\") " pod="openshift-authentication/oauth-openshift-68974c876c-56m92" Jan 31 09:05:44 crc kubenswrapper[4732]: I0131 09:05:44.217845 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7f741d28-9c76-4a05-8771-f8f448ee9a2a-v4-0-config-system-router-certs\") pod \"oauth-openshift-68974c876c-56m92\" (UID: \"7f741d28-9c76-4a05-8771-f8f448ee9a2a\") " pod="openshift-authentication/oauth-openshift-68974c876c-56m92" Jan 31 09:05:44 crc kubenswrapper[4732]: I0131 09:05:44.217964 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/7f741d28-9c76-4a05-8771-f8f448ee9a2a-v4-0-config-system-session\") pod \"oauth-openshift-68974c876c-56m92\" (UID: \"7f741d28-9c76-4a05-8771-f8f448ee9a2a\") " pod="openshift-authentication/oauth-openshift-68974c876c-56m92" Jan 31 09:05:44 crc kubenswrapper[4732]: I0131 09:05:44.218052 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7f741d28-9c76-4a05-8771-f8f448ee9a2a-v4-0-config-system-service-ca\") pod \"oauth-openshift-68974c876c-56m92\" (UID: \"7f741d28-9c76-4a05-8771-f8f448ee9a2a\") " pod="openshift-authentication/oauth-openshift-68974c876c-56m92" Jan 31 09:05:44 crc kubenswrapper[4732]: I0131 09:05:44.218162 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7f741d28-9c76-4a05-8771-f8f448ee9a2a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-68974c876c-56m92\" (UID: \"7f741d28-9c76-4a05-8771-f8f448ee9a2a\") " pod="openshift-authentication/oauth-openshift-68974c876c-56m92" Jan 31 09:05:44 crc kubenswrapper[4732]: I0131 09:05:44.217208 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7f741d28-9c76-4a05-8771-f8f448ee9a2a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-68974c876c-56m92\" (UID: \"7f741d28-9c76-4a05-8771-f8f448ee9a2a\") " pod="openshift-authentication/oauth-openshift-68974c876c-56m92" Jan 31 09:05:44 crc kubenswrapper[4732]: I0131 09:05:44.217983 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7f741d28-9c76-4a05-8771-f8f448ee9a2a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-68974c876c-56m92\" (UID: \"7f741d28-9c76-4a05-8771-f8f448ee9a2a\") " pod="openshift-authentication/oauth-openshift-68974c876c-56m92" Jan 31 09:05:44 crc kubenswrapper[4732]: I0131 09:05:44.217988 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7f741d28-9c76-4a05-8771-f8f448ee9a2a-audit-dir\") pod \"oauth-openshift-68974c876c-56m92\" (UID: \"7f741d28-9c76-4a05-8771-f8f448ee9a2a\") " pod="openshift-authentication/oauth-openshift-68974c876c-56m92" Jan 31 09:05:44 crc kubenswrapper[4732]: I0131 09:05:44.217905 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7f741d28-9c76-4a05-8771-f8f448ee9a2a-audit-policies\") pod \"oauth-openshift-68974c876c-56m92\" (UID: \"7f741d28-9c76-4a05-8771-f8f448ee9a2a\") " pod="openshift-authentication/oauth-openshift-68974c876c-56m92" Jan 31 09:05:44 crc kubenswrapper[4732]: I0131 09:05:44.218585 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7f741d28-9c76-4a05-8771-f8f448ee9a2a-v4-0-config-user-template-error\") pod \"oauth-openshift-68974c876c-56m92\" (UID: \"7f741d28-9c76-4a05-8771-f8f448ee9a2a\") " pod="openshift-authentication/oauth-openshift-68974c876c-56m92" Jan 31 09:05:44 crc kubenswrapper[4732]: I0131 09:05:44.218711 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/7f741d28-9c76-4a05-8771-f8f448ee9a2a-v4-0-config-user-template-login\") pod 
\"oauth-openshift-68974c876c-56m92\" (UID: \"7f741d28-9c76-4a05-8771-f8f448ee9a2a\") " pod="openshift-authentication/oauth-openshift-68974c876c-56m92" Jan 31 09:05:44 crc kubenswrapper[4732]: I0131 09:05:44.219064 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7f741d28-9c76-4a05-8771-f8f448ee9a2a-v4-0-config-system-service-ca\") pod \"oauth-openshift-68974c876c-56m92\" (UID: \"7f741d28-9c76-4a05-8771-f8f448ee9a2a\") " pod="openshift-authentication/oauth-openshift-68974c876c-56m92" Jan 31 09:05:44 crc kubenswrapper[4732]: I0131 09:05:44.221779 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7f741d28-9c76-4a05-8771-f8f448ee9a2a-v4-0-config-system-router-certs\") pod \"oauth-openshift-68974c876c-56m92\" (UID: \"7f741d28-9c76-4a05-8771-f8f448ee9a2a\") " pod="openshift-authentication/oauth-openshift-68974c876c-56m92" Jan 31 09:05:44 crc kubenswrapper[4732]: I0131 09:05:44.221782 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7f741d28-9c76-4a05-8771-f8f448ee9a2a-v4-0-config-system-session\") pod \"oauth-openshift-68974c876c-56m92\" (UID: \"7f741d28-9c76-4a05-8771-f8f448ee9a2a\") " pod="openshift-authentication/oauth-openshift-68974c876c-56m92" Jan 31 09:05:44 crc kubenswrapper[4732]: I0131 09:05:44.222856 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7f741d28-9c76-4a05-8771-f8f448ee9a2a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-68974c876c-56m92\" (UID: \"7f741d28-9c76-4a05-8771-f8f448ee9a2a\") " pod="openshift-authentication/oauth-openshift-68974c876c-56m92" Jan 31 09:05:44 crc kubenswrapper[4732]: I0131 09:05:44.223581 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7f741d28-9c76-4a05-8771-f8f448ee9a2a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-68974c876c-56m92\" (UID: \"7f741d28-9c76-4a05-8771-f8f448ee9a2a\") " pod="openshift-authentication/oauth-openshift-68974c876c-56m92" Jan 31 09:05:44 crc kubenswrapper[4732]: I0131 09:05:44.223843 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7f741d28-9c76-4a05-8771-f8f448ee9a2a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-68974c876c-56m92\" (UID: \"7f741d28-9c76-4a05-8771-f8f448ee9a2a\") " pod="openshift-authentication/oauth-openshift-68974c876c-56m92" Jan 31 09:05:44 crc kubenswrapper[4732]: I0131 09:05:44.224984 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7f741d28-9c76-4a05-8771-f8f448ee9a2a-v4-0-config-user-template-error\") pod \"oauth-openshift-68974c876c-56m92\" (UID: \"7f741d28-9c76-4a05-8771-f8f448ee9a2a\") " pod="openshift-authentication/oauth-openshift-68974c876c-56m92" Jan 31 09:05:44 crc kubenswrapper[4732]: I0131 09:05:44.225567 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/7f741d28-9c76-4a05-8771-f8f448ee9a2a-v4-0-config-user-template-login\") pod \"oauth-openshift-68974c876c-56m92\" 
(UID: \"7f741d28-9c76-4a05-8771-f8f448ee9a2a\") " pod="openshift-authentication/oauth-openshift-68974c876c-56m92" Jan 31 09:05:44 crc kubenswrapper[4732]: I0131 09:05:44.231885 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/7f741d28-9c76-4a05-8771-f8f448ee9a2a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-68974c876c-56m92\" (UID: \"7f741d28-9c76-4a05-8771-f8f448ee9a2a\") " pod="openshift-authentication/oauth-openshift-68974c876c-56m92" Jan 31 09:05:44 crc kubenswrapper[4732]: I0131 09:05:44.239838 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkfzv\" (UniqueName: \"kubernetes.io/projected/7f741d28-9c76-4a05-8771-f8f448ee9a2a-kube-api-access-bkfzv\") pod \"oauth-openshift-68974c876c-56m92\" (UID: \"7f741d28-9c76-4a05-8771-f8f448ee9a2a\") " pod="openshift-authentication/oauth-openshift-68974c876c-56m92" Jan 31 09:05:44 crc kubenswrapper[4732]: I0131 09:05:44.263938 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Jan 31 09:05:44 crc kubenswrapper[4732]: I0131 09:05:44.302479 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Jan 31 09:05:44 crc kubenswrapper[4732]: I0131 09:05:44.303234 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Jan 31 09:05:44 crc kubenswrapper[4732]: I0131 09:05:44.345933 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Jan 31 09:05:44 crc kubenswrapper[4732]: I0131 09:05:44.395023 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-68974c876c-56m92" Jan 31 09:05:44 crc kubenswrapper[4732]: I0131 09:05:44.438273 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Jan 31 09:05:44 crc kubenswrapper[4732]: I0131 09:05:44.466860 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Jan 31 09:05:44 crc kubenswrapper[4732]: I0131 09:05:44.650438 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Jan 31 09:05:44 crc kubenswrapper[4732]: I0131 09:05:44.710757 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Jan 31 09:05:44 crc kubenswrapper[4732]: I0131 09:05:44.772521 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Jan 31 09:05:44 crc kubenswrapper[4732]: I0131 09:05:44.827301 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Jan 31 09:05:44 crc kubenswrapper[4732]: I0131 09:05:44.965084 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Jan 31 09:05:45 crc kubenswrapper[4732]: I0131 09:05:45.038730 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Jan 31 09:05:45 crc kubenswrapper[4732]: I0131 09:05:45.165911 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Jan 31 09:05:45 crc kubenswrapper[4732]: I0131 09:05:45.185863 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Jan 31 09:05:45 crc kubenswrapper[4732]: I0131 09:05:45.340061 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Jan 31 09:05:45 crc kubenswrapper[4732]: I0131 09:05:45.357076 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Jan 31 09:05:45 crc kubenswrapper[4732]: I0131 09:05:45.474783 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Jan 31 09:05:45 crc kubenswrapper[4732]: I0131 09:05:45.508091 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Jan 31 09:05:45 crc kubenswrapper[4732]: I0131 09:05:45.570830 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Jan 31 09:05:45 crc kubenswrapper[4732]: I0131 09:05:45.633056 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Jan 31 09:05:45 crc kubenswrapper[4732]: I0131 09:05:45.666604 4732 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Jan 31 09:05:45 crc kubenswrapper[4732]: I0131 09:05:45.694144 4732 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Jan 31 09:05:45 crc kubenswrapper[4732]: I0131 09:05:45.702678 4732 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Jan 31 09:05:45 crc kubenswrapper[4732]: I0131 09:05:45.715135 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Jan 31 09:05:45 crc kubenswrapper[4732]: I0131 09:05:45.734910 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Jan 31 09:05:45 crc kubenswrapper[4732]: I0131 09:05:45.783057 4732 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Jan 31 09:05:45 crc kubenswrapper[4732]: I0131 09:05:45.786456 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Jan 31 09:05:45 crc kubenswrapper[4732]: I0131 09:05:45.787550 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 31 09:05:45 crc kubenswrapper[4732]: I0131 09:05:45.793855 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Jan 31 09:05:45 crc kubenswrapper[4732]: I0131 09:05:45.800458 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Jan 31 09:05:45 crc kubenswrapper[4732]: I0131 09:05:45.936358 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Jan 31 09:05:45 crc kubenswrapper[4732]: I0131 09:05:45.993997 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Jan 31 09:05:46 crc kubenswrapper[4732]: I0131 09:05:46.137115 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Jan 31 09:05:46 crc kubenswrapper[4732]: I0131 09:05:46.187068 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Jan 31 09:05:46 crc kubenswrapper[4732]: I0131 09:05:46.209429 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Jan 31 09:05:46 crc kubenswrapper[4732]: I0131 09:05:46.209688 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Jan 31 09:05:46 crc kubenswrapper[4732]: I0131 09:05:46.335926 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Jan 31 09:05:46 crc kubenswrapper[4732]: I0131 09:05:46.358517 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Jan 31 09:05:46 crc kubenswrapper[4732]: I0131 09:05:46.363737 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Jan 31 09:05:46 crc kubenswrapper[4732]: I0131 09:05:46.394065 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Jan 31 09:05:46 crc kubenswrapper[4732]: I0131 09:05:46.410526 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Jan 31 09:05:46 crc kubenswrapper[4732]: I0131 09:05:46.426769 4732 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"mcc-proxy-tls" Jan 31 09:05:46 crc kubenswrapper[4732]: I0131 09:05:46.618990 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Jan 31 09:05:46 crc kubenswrapper[4732]: I0131 09:05:46.661757 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Jan 31 09:05:46 crc kubenswrapper[4732]: I0131 09:05:46.687308 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Jan 31 09:05:46 crc kubenswrapper[4732]: I0131 09:05:46.737639 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Jan 31 09:05:46 crc kubenswrapper[4732]: I0131 09:05:46.759264 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Jan 31 09:05:46 crc kubenswrapper[4732]: I0131 09:05:46.765082 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Jan 31 09:05:46 crc kubenswrapper[4732]: I0131 09:05:46.778918 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Jan 31 09:05:46 crc kubenswrapper[4732]: I0131 09:05:46.784621 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 31 09:05:46 crc kubenswrapper[4732]: I0131 09:05:46.795438 4732 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 31 09:05:46 crc kubenswrapper[4732]: I0131 09:05:46.795687 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://de7e77b83a8e93a3b4873676b3b8bcdbab14d5437e495e34e69b839aa521fac3" gracePeriod=5 Jan 31 09:05:46 crc kubenswrapper[4732]: I0131 09:05:46.881239 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Jan 31 09:05:46 crc kubenswrapper[4732]: I0131 09:05:46.981423 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Jan 31 09:05:47 crc kubenswrapper[4732]: I0131 09:05:47.022522 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Jan 31 09:05:47 crc kubenswrapper[4732]: I0131 09:05:47.026569 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Jan 31 09:05:47 crc kubenswrapper[4732]: I0131 09:05:47.105231 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Jan 31 09:05:47 crc kubenswrapper[4732]: I0131 09:05:47.189911 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Jan 31 09:05:47 crc kubenswrapper[4732]: I0131 09:05:47.191550 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Jan 31 09:05:47 crc kubenswrapper[4732]: I0131 09:05:47.240084 4732 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Jan 31 09:05:47 crc kubenswrapper[4732]: I0131 09:05:47.275117 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Jan 31 09:05:47 crc kubenswrapper[4732]: I0131 09:05:47.278694 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 31 09:05:47 crc kubenswrapper[4732]: I0131 09:05:47.479853 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Jan 31 09:05:47 crc kubenswrapper[4732]: I0131 09:05:47.632137 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Jan 31 09:05:47 crc kubenswrapper[4732]: I0131 09:05:47.669378 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Jan 31 09:05:47 crc kubenswrapper[4732]: I0131 09:05:47.682840 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Jan 31 09:05:47 crc kubenswrapper[4732]: I0131 09:05:47.698007 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Jan 31 09:05:47 crc kubenswrapper[4732]: I0131 09:05:47.756941 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Jan 31 09:05:47 crc kubenswrapper[4732]: I0131 09:05:47.785065 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Jan 31 09:05:47 crc kubenswrapper[4732]: I0131 09:05:47.813651 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Jan 31 09:05:47 crc kubenswrapper[4732]: I0131 09:05:47.842170 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Jan 31 09:05:47 crc kubenswrapper[4732]: I0131 09:05:47.864863 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Jan 31 09:05:47 crc kubenswrapper[4732]: I0131 09:05:47.866608 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Jan 31 09:05:47 crc kubenswrapper[4732]: I0131 09:05:47.990436 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 31 09:05:48 crc kubenswrapper[4732]: I0131 09:05:48.100974 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Jan 31 09:05:48 crc kubenswrapper[4732]: I0131 09:05:48.197924 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Jan 31 09:05:48 crc kubenswrapper[4732]: I0131 09:05:48.333616 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Jan 31 09:05:48 crc kubenswrapper[4732]: I0131 09:05:48.388247 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Jan 31 09:05:48 crc kubenswrapper[4732]: I0131 09:05:48.446302 4732 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Jan 31 09:05:48 crc kubenswrapper[4732]: I0131 09:05:48.472818 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Jan 31 09:05:48 crc kubenswrapper[4732]: I0131 09:05:48.475381 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Jan 31 09:05:48 crc kubenswrapper[4732]: I0131 09:05:48.524735 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Jan 31 09:05:48 crc kubenswrapper[4732]: I0131 09:05:48.638842 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Jan 31 09:05:48 crc kubenswrapper[4732]: I0131 09:05:48.863145 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Jan 31 09:05:49 crc kubenswrapper[4732]: I0131 09:05:49.083735 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Jan 31 09:05:49 crc kubenswrapper[4732]: I0131 09:05:49.166767 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Jan 31 09:05:49 crc kubenswrapper[4732]: I0131 09:05:49.235291 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Jan 31 09:05:49 crc kubenswrapper[4732]: I0131 09:05:49.265404 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Jan 31 09:05:49 crc kubenswrapper[4732]: I0131 09:05:49.356300 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Jan 31 09:05:49 crc kubenswrapper[4732]: I0131 09:05:49.680050 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Jan 31 09:05:49 crc kubenswrapper[4732]: I0131 09:05:49.912679 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Jan 31 09:05:50 crc kubenswrapper[4732]: I0131 09:05:50.363551 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Jan 31 09:05:50 crc kubenswrapper[4732]: I0131 09:05:50.758313 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Jan 31 09:05:50 crc kubenswrapper[4732]: I0131 09:05:50.791898 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-68974c876c-56m92"] Jan 31 09:05:50 crc kubenswrapper[4732]: I0131 09:05:50.848236 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Jan 31 09:05:51 crc kubenswrapper[4732]: I0131 09:05:51.236131 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-68974c876c-56m92"] Jan 31 09:05:51 crc kubenswrapper[4732]: I0131 09:05:51.458014 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-68974c876c-56m92" 
event={"ID":"7f741d28-9c76-4a05-8771-f8f448ee9a2a","Type":"ContainerStarted","Data":"47b793847a3c4466d52fbc49cdd13780012ab2610d79b2abe326b626e2b5f9ff"} Jan 31 09:05:51 crc kubenswrapper[4732]: I0131 09:05:51.561528 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Jan 31 09:05:51 crc kubenswrapper[4732]: I0131 09:05:51.584808 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Jan 31 09:05:51 crc kubenswrapper[4732]: I0131 09:05:51.931545 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Jan 31 09:05:51 crc kubenswrapper[4732]: I0131 09:05:51.931645 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 31 09:05:52 crc kubenswrapper[4732]: I0131 09:05:52.015556 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 31 09:05:52 crc kubenswrapper[4732]: I0131 09:05:52.015689 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:05:52 crc kubenswrapper[4732]: I0131 09:05:52.015705 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 31 09:05:52 crc kubenswrapper[4732]: I0131 09:05:52.015729 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:05:52 crc kubenswrapper[4732]: I0131 09:05:52.015826 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 31 09:05:52 crc kubenswrapper[4732]: I0131 09:05:52.015888 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 31 09:05:52 crc kubenswrapper[4732]: I0131 09:05:52.015933 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 31 09:05:52 crc kubenswrapper[4732]: I0131 09:05:52.015952 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:05:52 crc kubenswrapper[4732]: I0131 09:05:52.016072 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:05:52 crc kubenswrapper[4732]: I0131 09:05:52.016180 4732 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:52 crc kubenswrapper[4732]: I0131 09:05:52.016192 4732 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:52 crc kubenswrapper[4732]: I0131 09:05:52.016202 4732 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:52 crc kubenswrapper[4732]: I0131 09:05:52.016213 4732 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:52 crc kubenswrapper[4732]: I0131 09:05:52.023099 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:05:52 crc kubenswrapper[4732]: I0131 09:05:52.117791 4732 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 31 09:05:52 crc kubenswrapper[4732]: I0131 09:05:52.285433 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Jan 31 09:05:52 crc kubenswrapper[4732]: I0131 09:05:52.466433 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Jan 31 09:05:52 crc kubenswrapper[4732]: I0131 09:05:52.466502 4732 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="de7e77b83a8e93a3b4873676b3b8bcdbab14d5437e495e34e69b839aa521fac3" exitCode=137 Jan 31 09:05:52 crc kubenswrapper[4732]: I0131 09:05:52.466583 4732 scope.go:117] "RemoveContainer" containerID="de7e77b83a8e93a3b4873676b3b8bcdbab14d5437e495e34e69b839aa521fac3" Jan 31 09:05:52 crc kubenswrapper[4732]: I0131 09:05:52.466624 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 31 09:05:52 crc kubenswrapper[4732]: I0131 09:05:52.468171 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-68974c876c-56m92" event={"ID":"7f741d28-9c76-4a05-8771-f8f448ee9a2a","Type":"ContainerStarted","Data":"124175dc8a5a5e72202c80a37623e943ecd794550cce58db3a9ae7215e5bba55"} Jan 31 09:05:52 crc kubenswrapper[4732]: I0131 09:05:52.468614 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-68974c876c-56m92" Jan 31 09:05:52 crc kubenswrapper[4732]: I0131 09:05:52.477601 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-68974c876c-56m92" Jan 31 09:05:52 crc kubenswrapper[4732]: I0131 09:05:52.485930 4732 scope.go:117] "RemoveContainer" containerID="de7e77b83a8e93a3b4873676b3b8bcdbab14d5437e495e34e69b839aa521fac3" Jan 31 09:05:52 crc kubenswrapper[4732]: E0131 09:05:52.486259 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de7e77b83a8e93a3b4873676b3b8bcdbab14d5437e495e34e69b839aa521fac3\": container with ID starting with de7e77b83a8e93a3b4873676b3b8bcdbab14d5437e495e34e69b839aa521fac3 not found: ID does not exist" containerID="de7e77b83a8e93a3b4873676b3b8bcdbab14d5437e495e34e69b839aa521fac3" Jan 31 09:05:52 crc kubenswrapper[4732]: I0131 09:05:52.486292 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de7e77b83a8e93a3b4873676b3b8bcdbab14d5437e495e34e69b839aa521fac3"} err="failed to get container status \"de7e77b83a8e93a3b4873676b3b8bcdbab14d5437e495e34e69b839aa521fac3\": rpc error: code = NotFound desc = could not find container \"de7e77b83a8e93a3b4873676b3b8bcdbab14d5437e495e34e69b839aa521fac3\": container with ID starting with de7e77b83a8e93a3b4873676b3b8bcdbab14d5437e495e34e69b839aa521fac3 not found: ID does not exist" Jan 31 09:05:52 crc kubenswrapper[4732]: I0131 09:05:52.504815 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-68974c876c-56m92" 
podStartSLOduration=57.504795343 podStartE2EDuration="57.504795343s" podCreationTimestamp="2026-01-31 09:04:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:05:52.50140876 +0000 UTC m=+290.807284964" watchObservedRunningTime="2026-01-31 09:05:52.504795343 +0000 UTC m=+290.810671547" Jan 31 09:05:52 crc kubenswrapper[4732]: I0131 09:05:52.554215 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Jan 31 09:06:02 crc kubenswrapper[4732]: I0131 09:06:02.271719 4732 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Jan 31 09:06:07 crc kubenswrapper[4732]: I0131 09:06:07.109728 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-tg4xc"] Jan 31 09:06:07 crc kubenswrapper[4732]: I0131 09:06:07.111206 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-tg4xc" podUID="219a04b6-e7bd-4138-bcc7-4f650537aa24" containerName="controller-manager" containerID="cri-o://45d9723ae1f9063bad99b0cd5b3c8cb9365e80583327ef29b13ddfa585f2a481" gracePeriod=30 Jan 31 09:06:07 crc kubenswrapper[4732]: I0131 09:06:07.211997 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-5s6dp"] Jan 31 09:06:07 crc kubenswrapper[4732]: I0131 09:06:07.212520 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5s6dp" podUID="541ea3c2-891c-4c3e-81fd-9d340112c62b" containerName="route-controller-manager" containerID="cri-o://e821c09a3ccb6a40c23e0f74bb9209a294da276e26d710a28decd86bc0d3c274" gracePeriod=30 Jan 31 09:06:07 crc kubenswrapper[4732]: I0131 09:06:07.465417 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-tg4xc" Jan 31 09:06:07 crc kubenswrapper[4732]: I0131 09:06:07.519956 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/219a04b6-e7bd-4138-bcc7-4f650537aa24-client-ca\") pod \"219a04b6-e7bd-4138-bcc7-4f650537aa24\" (UID: \"219a04b6-e7bd-4138-bcc7-4f650537aa24\") " Jan 31 09:06:07 crc kubenswrapper[4732]: I0131 09:06:07.520027 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/219a04b6-e7bd-4138-bcc7-4f650537aa24-config\") pod \"219a04b6-e7bd-4138-bcc7-4f650537aa24\" (UID: \"219a04b6-e7bd-4138-bcc7-4f650537aa24\") " Jan 31 09:06:07 crc kubenswrapper[4732]: I0131 09:06:07.520066 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bws5c\" (UniqueName: \"kubernetes.io/projected/219a04b6-e7bd-4138-bcc7-4f650537aa24-kube-api-access-bws5c\") pod \"219a04b6-e7bd-4138-bcc7-4f650537aa24\" (UID: \"219a04b6-e7bd-4138-bcc7-4f650537aa24\") " Jan 31 09:06:07 crc kubenswrapper[4732]: I0131 09:06:07.520100 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/219a04b6-e7bd-4138-bcc7-4f650537aa24-proxy-ca-bundles\") pod \"219a04b6-e7bd-4138-bcc7-4f650537aa24\" (UID: \"219a04b6-e7bd-4138-bcc7-4f650537aa24\") " Jan 31 09:06:07 crc kubenswrapper[4732]: I0131 09:06:07.520121 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/219a04b6-e7bd-4138-bcc7-4f650537aa24-serving-cert\") pod \"219a04b6-e7bd-4138-bcc7-4f650537aa24\" (UID: \"219a04b6-e7bd-4138-bcc7-4f650537aa24\") " Jan 31 09:06:07 crc kubenswrapper[4732]: I0131 09:06:07.521504 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/219a04b6-e7bd-4138-bcc7-4f650537aa24-client-ca" (OuterVolumeSpecName: "client-ca") pod "219a04b6-e7bd-4138-bcc7-4f650537aa24" (UID: "219a04b6-e7bd-4138-bcc7-4f650537aa24"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:06:07 crc kubenswrapper[4732]: I0131 09:06:07.521904 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/219a04b6-e7bd-4138-bcc7-4f650537aa24-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "219a04b6-e7bd-4138-bcc7-4f650537aa24" (UID: "219a04b6-e7bd-4138-bcc7-4f650537aa24"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:06:07 crc kubenswrapper[4732]: I0131 09:06:07.522082 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/219a04b6-e7bd-4138-bcc7-4f650537aa24-config" (OuterVolumeSpecName: "config") pod "219a04b6-e7bd-4138-bcc7-4f650537aa24" (UID: "219a04b6-e7bd-4138-bcc7-4f650537aa24"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:06:07 crc kubenswrapper[4732]: I0131 09:06:07.526296 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/219a04b6-e7bd-4138-bcc7-4f650537aa24-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "219a04b6-e7bd-4138-bcc7-4f650537aa24" (UID: "219a04b6-e7bd-4138-bcc7-4f650537aa24"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:06:07 crc kubenswrapper[4732]: I0131 09:06:07.526447 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/219a04b6-e7bd-4138-bcc7-4f650537aa24-kube-api-access-bws5c" (OuterVolumeSpecName: "kube-api-access-bws5c") pod "219a04b6-e7bd-4138-bcc7-4f650537aa24" (UID: "219a04b6-e7bd-4138-bcc7-4f650537aa24"). InnerVolumeSpecName "kube-api-access-bws5c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:06:07 crc kubenswrapper[4732]: I0131 09:06:07.540039 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5s6dp" Jan 31 09:06:07 crc kubenswrapper[4732]: I0131 09:06:07.553101 4732 generic.go:334] "Generic (PLEG): container finished" podID="541ea3c2-891c-4c3e-81fd-9d340112c62b" containerID="e821c09a3ccb6a40c23e0f74bb9209a294da276e26d710a28decd86bc0d3c274" exitCode=0 Jan 31 09:06:07 crc kubenswrapper[4732]: I0131 09:06:07.553201 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5s6dp" event={"ID":"541ea3c2-891c-4c3e-81fd-9d340112c62b","Type":"ContainerDied","Data":"e821c09a3ccb6a40c23e0f74bb9209a294da276e26d710a28decd86bc0d3c274"} Jan 31 09:06:07 crc kubenswrapper[4732]: I0131 09:06:07.553231 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5s6dp" event={"ID":"541ea3c2-891c-4c3e-81fd-9d340112c62b","Type":"ContainerDied","Data":"2ecd69d7eb9bd214f9aff82913ddef7756830d15321e64e3fd947685beb0e5f0"} Jan 31 09:06:07 crc kubenswrapper[4732]: I0131 09:06:07.553250 4732 scope.go:117] "RemoveContainer" containerID="e821c09a3ccb6a40c23e0f74bb9209a294da276e26d710a28decd86bc0d3c274" Jan 31 09:06:07 crc kubenswrapper[4732]: I0131 09:06:07.553339 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5s6dp" Jan 31 09:06:07 crc kubenswrapper[4732]: I0131 09:06:07.563053 4732 generic.go:334] "Generic (PLEG): container finished" podID="219a04b6-e7bd-4138-bcc7-4f650537aa24" containerID="45d9723ae1f9063bad99b0cd5b3c8cb9365e80583327ef29b13ddfa585f2a481" exitCode=0 Jan 31 09:06:07 crc kubenswrapper[4732]: I0131 09:06:07.563106 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-tg4xc" event={"ID":"219a04b6-e7bd-4138-bcc7-4f650537aa24","Type":"ContainerDied","Data":"45d9723ae1f9063bad99b0cd5b3c8cb9365e80583327ef29b13ddfa585f2a481"} Jan 31 09:06:07 crc kubenswrapper[4732]: I0131 09:06:07.563137 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-tg4xc" event={"ID":"219a04b6-e7bd-4138-bcc7-4f650537aa24","Type":"ContainerDied","Data":"371f900003f95ceb406307c3e3064fc3164d972cb97c8e25812cb0e57334cb37"} Jan 31 09:06:07 crc kubenswrapper[4732]: I0131 09:06:07.563194 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-tg4xc" Jan 31 09:06:07 crc kubenswrapper[4732]: I0131 09:06:07.583147 4732 scope.go:117] "RemoveContainer" containerID="e821c09a3ccb6a40c23e0f74bb9209a294da276e26d710a28decd86bc0d3c274" Jan 31 09:06:07 crc kubenswrapper[4732]: E0131 09:06:07.583974 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e821c09a3ccb6a40c23e0f74bb9209a294da276e26d710a28decd86bc0d3c274\": container with ID starting with e821c09a3ccb6a40c23e0f74bb9209a294da276e26d710a28decd86bc0d3c274 not found: ID does not exist" containerID="e821c09a3ccb6a40c23e0f74bb9209a294da276e26d710a28decd86bc0d3c274" Jan 31 09:06:07 crc kubenswrapper[4732]: I0131 09:06:07.584006 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e821c09a3ccb6a40c23e0f74bb9209a294da276e26d710a28decd86bc0d3c274"} err="failed to get container status \"e821c09a3ccb6a40c23e0f74bb9209a294da276e26d710a28decd86bc0d3c274\": rpc error: code = NotFound desc = could not find container \"e821c09a3ccb6a40c23e0f74bb9209a294da276e26d710a28decd86bc0d3c274\": container with ID starting with e821c09a3ccb6a40c23e0f74bb9209a294da276e26d710a28decd86bc0d3c274 not found: ID does not exist" Jan 31 09:06:07 crc kubenswrapper[4732]: I0131 09:06:07.584023 4732 scope.go:117] "RemoveContainer" containerID="45d9723ae1f9063bad99b0cd5b3c8cb9365e80583327ef29b13ddfa585f2a481" Jan 31 09:06:07 crc kubenswrapper[4732]: I0131 09:06:07.593428 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-tg4xc"] Jan 31 09:06:07 crc kubenswrapper[4732]: I0131 09:06:07.602632 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-tg4xc"] Jan 31 09:06:07 crc kubenswrapper[4732]: I0131 09:06:07.611420 4732 scope.go:117] "RemoveContainer" containerID="45d9723ae1f9063bad99b0cd5b3c8cb9365e80583327ef29b13ddfa585f2a481" Jan 31 09:06:07 crc kubenswrapper[4732]: E0131 09:06:07.611868 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45d9723ae1f9063bad99b0cd5b3c8cb9365e80583327ef29b13ddfa585f2a481\": container with ID starting with 45d9723ae1f9063bad99b0cd5b3c8cb9365e80583327ef29b13ddfa585f2a481 not found: ID does not exist" containerID="45d9723ae1f9063bad99b0cd5b3c8cb9365e80583327ef29b13ddfa585f2a481" Jan 31 09:06:07 crc kubenswrapper[4732]: I0131 09:06:07.611926 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45d9723ae1f9063bad99b0cd5b3c8cb9365e80583327ef29b13ddfa585f2a481"} err="failed to get container status \"45d9723ae1f9063bad99b0cd5b3c8cb9365e80583327ef29b13ddfa585f2a481\": rpc error: code = NotFound desc = could not find container \"45d9723ae1f9063bad99b0cd5b3c8cb9365e80583327ef29b13ddfa585f2a481\": container with ID starting with 45d9723ae1f9063bad99b0cd5b3c8cb9365e80583327ef29b13ddfa585f2a481 not found: ID does not exist" Jan 31 09:06:07 crc kubenswrapper[4732]: I0131 09:06:07.621344 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/541ea3c2-891c-4c3e-81fd-9d340112c62b-config\") pod \"541ea3c2-891c-4c3e-81fd-9d340112c62b\" (UID: \"541ea3c2-891c-4c3e-81fd-9d340112c62b\") " Jan 31 09:06:07 crc kubenswrapper[4732]: I0131 09:06:07.621400 4732 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/541ea3c2-891c-4c3e-81fd-9d340112c62b-serving-cert\") pod \"541ea3c2-891c-4c3e-81fd-9d340112c62b\" (UID: \"541ea3c2-891c-4c3e-81fd-9d340112c62b\") " Jan 31 09:06:07 crc kubenswrapper[4732]: I0131 09:06:07.621462 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhhgt\" (UniqueName: \"kubernetes.io/projected/541ea3c2-891c-4c3e-81fd-9d340112c62b-kube-api-access-jhhgt\") pod \"541ea3c2-891c-4c3e-81fd-9d340112c62b\" (UID: \"541ea3c2-891c-4c3e-81fd-9d340112c62b\") " Jan 31 09:06:07 crc kubenswrapper[4732]: I0131 09:06:07.621495 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/541ea3c2-891c-4c3e-81fd-9d340112c62b-client-ca\") pod \"541ea3c2-891c-4c3e-81fd-9d340112c62b\" (UID: \"541ea3c2-891c-4c3e-81fd-9d340112c62b\") " Jan 31 09:06:07 crc kubenswrapper[4732]: I0131 09:06:07.621812 4732 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/219a04b6-e7bd-4138-bcc7-4f650537aa24-client-ca\") on node \"crc\" DevicePath \"\"" Jan 31 09:06:07 crc kubenswrapper[4732]: I0131 09:06:07.621825 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/219a04b6-e7bd-4138-bcc7-4f650537aa24-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:06:07 crc kubenswrapper[4732]: I0131 09:06:07.621834 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bws5c\" (UniqueName: \"kubernetes.io/projected/219a04b6-e7bd-4138-bcc7-4f650537aa24-kube-api-access-bws5c\") on node \"crc\" DevicePath \"\"" Jan 31 09:06:07 crc kubenswrapper[4732]: I0131 09:06:07.621846 4732 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/219a04b6-e7bd-4138-bcc7-4f650537aa24-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 31 09:06:07 crc kubenswrapper[4732]: I0131 09:06:07.621855 4732 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/219a04b6-e7bd-4138-bcc7-4f650537aa24-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:06:07 crc kubenswrapper[4732]: I0131 09:06:07.622814 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/541ea3c2-891c-4c3e-81fd-9d340112c62b-client-ca" (OuterVolumeSpecName: "client-ca") pod "541ea3c2-891c-4c3e-81fd-9d340112c62b" (UID: "541ea3c2-891c-4c3e-81fd-9d340112c62b"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:06:07 crc kubenswrapper[4732]: I0131 09:06:07.624364 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/541ea3c2-891c-4c3e-81fd-9d340112c62b-config" (OuterVolumeSpecName: "config") pod "541ea3c2-891c-4c3e-81fd-9d340112c62b" (UID: "541ea3c2-891c-4c3e-81fd-9d340112c62b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:06:07 crc kubenswrapper[4732]: I0131 09:06:07.625435 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/541ea3c2-891c-4c3e-81fd-9d340112c62b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "541ea3c2-891c-4c3e-81fd-9d340112c62b" (UID: "541ea3c2-891c-4c3e-81fd-9d340112c62b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:06:07 crc kubenswrapper[4732]: I0131 09:06:07.627245 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/541ea3c2-891c-4c3e-81fd-9d340112c62b-kube-api-access-jhhgt" (OuterVolumeSpecName: "kube-api-access-jhhgt") pod "541ea3c2-891c-4c3e-81fd-9d340112c62b" (UID: "541ea3c2-891c-4c3e-81fd-9d340112c62b"). InnerVolumeSpecName "kube-api-access-jhhgt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:06:07 crc kubenswrapper[4732]: I0131 09:06:07.723341 4732 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/541ea3c2-891c-4c3e-81fd-9d340112c62b-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:06:07 crc kubenswrapper[4732]: I0131 09:06:07.723432 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhhgt\" (UniqueName: \"kubernetes.io/projected/541ea3c2-891c-4c3e-81fd-9d340112c62b-kube-api-access-jhhgt\") on node \"crc\" DevicePath \"\"" Jan 31 09:06:07 crc kubenswrapper[4732]: I0131 09:06:07.723460 4732 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/541ea3c2-891c-4c3e-81fd-9d340112c62b-client-ca\") on node \"crc\" DevicePath \"\"" Jan 31 09:06:07 crc kubenswrapper[4732]: I0131 09:06:07.723479 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/541ea3c2-891c-4c3e-81fd-9d340112c62b-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:06:07 crc kubenswrapper[4732]: I0131 09:06:07.900381 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-5s6dp"] Jan 31 09:06:07 crc kubenswrapper[4732]: I0131 09:06:07.907464 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-5s6dp"] Jan 31 09:06:08 crc kubenswrapper[4732]: I0131 09:06:08.551709 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="219a04b6-e7bd-4138-bcc7-4f650537aa24" path="/var/lib/kubelet/pods/219a04b6-e7bd-4138-bcc7-4f650537aa24/volumes" Jan 31 09:06:08 crc kubenswrapper[4732]: I0131 09:06:08.552415 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="541ea3c2-891c-4c3e-81fd-9d340112c62b" path="/var/lib/kubelet/pods/541ea3c2-891c-4c3e-81fd-9d340112c62b/volumes" Jan 31 09:06:08 crc kubenswrapper[4732]: I0131 09:06:08.635509 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76f567b4cc-wx87k"] Jan 31 09:06:08 crc kubenswrapper[4732]: E0131 09:06:08.635953 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 31 09:06:08 crc kubenswrapper[4732]: I0131 09:06:08.635981 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 31 09:06:08 crc kubenswrapper[4732]: E0131 09:06:08.635994 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="541ea3c2-891c-4c3e-81fd-9d340112c62b" containerName="route-controller-manager" Jan 31 09:06:08 crc kubenswrapper[4732]: I0131 09:06:08.636005 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="541ea3c2-891c-4c3e-81fd-9d340112c62b" containerName="route-controller-manager" Jan 31 09:06:08 crc kubenswrapper[4732]: E0131 09:06:08.636017 4732 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="219a04b6-e7bd-4138-bcc7-4f650537aa24" containerName="controller-manager" Jan 31 09:06:08 crc kubenswrapper[4732]: I0131 09:06:08.636172 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="219a04b6-e7bd-4138-bcc7-4f650537aa24" containerName="controller-manager" Jan 31 09:06:08 crc kubenswrapper[4732]: I0131 09:06:08.636305 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 31 09:06:08 crc kubenswrapper[4732]: I0131 09:06:08.636328 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="219a04b6-e7bd-4138-bcc7-4f650537aa24" containerName="controller-manager" Jan 31 09:06:08 crc kubenswrapper[4732]: I0131 09:06:08.636342 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="541ea3c2-891c-4c3e-81fd-9d340112c62b" containerName="route-controller-manager" Jan 31 09:06:08 crc kubenswrapper[4732]: I0131 09:06:08.636920 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-76f567b4cc-wx87k" Jan 31 09:06:08 crc kubenswrapper[4732]: I0131 09:06:08.640421 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 31 09:06:08 crc kubenswrapper[4732]: I0131 09:06:08.640881 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 31 09:06:08 crc kubenswrapper[4732]: I0131 09:06:08.641211 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 31 09:06:08 crc kubenswrapper[4732]: I0131 09:06:08.641627 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 31 09:06:08 crc kubenswrapper[4732]: I0131 09:06:08.641886 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 31 09:06:08 crc kubenswrapper[4732]: I0131 09:06:08.642918 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5b9c9894ff-7ssvr"] Jan 31 09:06:08 crc kubenswrapper[4732]: I0131 09:06:08.643316 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 31 09:06:08 crc kubenswrapper[4732]: I0131 09:06:08.643923 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5b9c9894ff-7ssvr" Jan 31 09:06:08 crc kubenswrapper[4732]: I0131 09:06:08.647060 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 31 09:06:08 crc kubenswrapper[4732]: I0131 09:06:08.647375 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 31 09:06:08 crc kubenswrapper[4732]: I0131 09:06:08.647826 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 31 09:06:08 crc kubenswrapper[4732]: I0131 09:06:08.648060 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 31 09:06:08 crc kubenswrapper[4732]: I0131 09:06:08.648142 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 31 09:06:08 crc kubenswrapper[4732]: I0131 09:06:08.648489 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 31 09:06:08 crc kubenswrapper[4732]: I0131 09:06:08.650600 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76f567b4cc-wx87k"] Jan 31 09:06:08 crc kubenswrapper[4732]: I0131 09:06:08.653133 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5b9c9894ff-7ssvr"] Jan 31 09:06:08 crc kubenswrapper[4732]: I0131 09:06:08.655909 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 31 09:06:08 crc kubenswrapper[4732]: I0131 09:06:08.754762 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59cbb5c4-c743-47e1-8dc3-e4be5ddd3594-config\") pod \"route-controller-manager-76f567b4cc-wx87k\" (UID: \"59cbb5c4-c743-47e1-8dc3-e4be5ddd3594\") " pod="openshift-route-controller-manager/route-controller-manager-76f567b4cc-wx87k" Jan 31 09:06:08 crc kubenswrapper[4732]: I0131 09:06:08.755120 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de12e404-51e0-4b46-939c-3e4d4f9fbe13-config\") pod \"controller-manager-5b9c9894ff-7ssvr\" (UID: \"de12e404-51e0-4b46-939c-3e4d4f9fbe13\") " pod="openshift-controller-manager/controller-manager-5b9c9894ff-7ssvr" Jan 31 09:06:08 crc kubenswrapper[4732]: I0131 09:06:08.755153 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/de12e404-51e0-4b46-939c-3e4d4f9fbe13-client-ca\") pod \"controller-manager-5b9c9894ff-7ssvr\" (UID: \"de12e404-51e0-4b46-939c-3e4d4f9fbe13\") " pod="openshift-controller-manager/controller-manager-5b9c9894ff-7ssvr" Jan 31 09:06:08 crc kubenswrapper[4732]: I0131 09:06:08.755210 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/59cbb5c4-c743-47e1-8dc3-e4be5ddd3594-client-ca\") pod \"route-controller-manager-76f567b4cc-wx87k\" (UID: \"59cbb5c4-c743-47e1-8dc3-e4be5ddd3594\") " pod="openshift-route-controller-manager/route-controller-manager-76f567b4cc-wx87k" Jan 31 09:06:08 crc 
kubenswrapper[4732]: I0131 09:06:08.755264 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/de12e404-51e0-4b46-939c-3e4d4f9fbe13-serving-cert\") pod \"controller-manager-5b9c9894ff-7ssvr\" (UID: \"de12e404-51e0-4b46-939c-3e4d4f9fbe13\") " pod="openshift-controller-manager/controller-manager-5b9c9894ff-7ssvr" Jan 31 09:06:08 crc kubenswrapper[4732]: I0131 09:06:08.755287 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxxvp\" (UniqueName: \"kubernetes.io/projected/de12e404-51e0-4b46-939c-3e4d4f9fbe13-kube-api-access-vxxvp\") pod \"controller-manager-5b9c9894ff-7ssvr\" (UID: \"de12e404-51e0-4b46-939c-3e4d4f9fbe13\") " pod="openshift-controller-manager/controller-manager-5b9c9894ff-7ssvr" Jan 31 09:06:08 crc kubenswrapper[4732]: I0131 09:06:08.755335 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/59cbb5c4-c743-47e1-8dc3-e4be5ddd3594-serving-cert\") pod \"route-controller-manager-76f567b4cc-wx87k\" (UID: \"59cbb5c4-c743-47e1-8dc3-e4be5ddd3594\") " pod="openshift-route-controller-manager/route-controller-manager-76f567b4cc-wx87k" Jan 31 09:06:08 crc kubenswrapper[4732]: I0131 09:06:08.755359 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnkkl\" (UniqueName: \"kubernetes.io/projected/59cbb5c4-c743-47e1-8dc3-e4be5ddd3594-kube-api-access-vnkkl\") pod \"route-controller-manager-76f567b4cc-wx87k\" (UID: \"59cbb5c4-c743-47e1-8dc3-e4be5ddd3594\") " pod="openshift-route-controller-manager/route-controller-manager-76f567b4cc-wx87k" Jan 31 09:06:08 crc kubenswrapper[4732]: I0131 09:06:08.755442 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/de12e404-51e0-4b46-939c-3e4d4f9fbe13-proxy-ca-bundles\") pod \"controller-manager-5b9c9894ff-7ssvr\" (UID: \"de12e404-51e0-4b46-939c-3e4d4f9fbe13\") " pod="openshift-controller-manager/controller-manager-5b9c9894ff-7ssvr" Jan 31 09:06:08 crc kubenswrapper[4732]: I0131 09:06:08.856375 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/de12e404-51e0-4b46-939c-3e4d4f9fbe13-proxy-ca-bundles\") pod \"controller-manager-5b9c9894ff-7ssvr\" (UID: \"de12e404-51e0-4b46-939c-3e4d4f9fbe13\") " pod="openshift-controller-manager/controller-manager-5b9c9894ff-7ssvr" Jan 31 09:06:08 crc kubenswrapper[4732]: I0131 09:06:08.856481 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59cbb5c4-c743-47e1-8dc3-e4be5ddd3594-config\") pod \"route-controller-manager-76f567b4cc-wx87k\" (UID: \"59cbb5c4-c743-47e1-8dc3-e4be5ddd3594\") " pod="openshift-route-controller-manager/route-controller-manager-76f567b4cc-wx87k" Jan 31 09:06:08 crc kubenswrapper[4732]: I0131 09:06:08.856513 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de12e404-51e0-4b46-939c-3e4d4f9fbe13-config\") pod \"controller-manager-5b9c9894ff-7ssvr\" (UID: \"de12e404-51e0-4b46-939c-3e4d4f9fbe13\") " pod="openshift-controller-manager/controller-manager-5b9c9894ff-7ssvr" Jan 31 09:06:08 crc kubenswrapper[4732]: I0131 
09:06:08.856541 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/de12e404-51e0-4b46-939c-3e4d4f9fbe13-client-ca\") pod \"controller-manager-5b9c9894ff-7ssvr\" (UID: \"de12e404-51e0-4b46-939c-3e4d4f9fbe13\") " pod="openshift-controller-manager/controller-manager-5b9c9894ff-7ssvr" Jan 31 09:06:08 crc kubenswrapper[4732]: I0131 09:06:08.856561 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/59cbb5c4-c743-47e1-8dc3-e4be5ddd3594-client-ca\") pod \"route-controller-manager-76f567b4cc-wx87k\" (UID: \"59cbb5c4-c743-47e1-8dc3-e4be5ddd3594\") " pod="openshift-route-controller-manager/route-controller-manager-76f567b4cc-wx87k" Jan 31 09:06:08 crc kubenswrapper[4732]: I0131 09:06:08.856587 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxxvp\" (UniqueName: \"kubernetes.io/projected/de12e404-51e0-4b46-939c-3e4d4f9fbe13-kube-api-access-vxxvp\") pod \"controller-manager-5b9c9894ff-7ssvr\" (UID: \"de12e404-51e0-4b46-939c-3e4d4f9fbe13\") " pod="openshift-controller-manager/controller-manager-5b9c9894ff-7ssvr" Jan 31 09:06:08 crc kubenswrapper[4732]: I0131 09:06:08.856609 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/de12e404-51e0-4b46-939c-3e4d4f9fbe13-serving-cert\") pod \"controller-manager-5b9c9894ff-7ssvr\" (UID: \"de12e404-51e0-4b46-939c-3e4d4f9fbe13\") " pod="openshift-controller-manager/controller-manager-5b9c9894ff-7ssvr" Jan 31 09:06:08 crc kubenswrapper[4732]: I0131 09:06:08.856640 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/59cbb5c4-c743-47e1-8dc3-e4be5ddd3594-serving-cert\") pod \"route-controller-manager-76f567b4cc-wx87k\" (UID: \"59cbb5c4-c743-47e1-8dc3-e4be5ddd3594\") " pod="openshift-route-controller-manager/route-controller-manager-76f567b4cc-wx87k" Jan 31 09:06:08 crc kubenswrapper[4732]: I0131 09:06:08.856706 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnkkl\" (UniqueName: \"kubernetes.io/projected/59cbb5c4-c743-47e1-8dc3-e4be5ddd3594-kube-api-access-vnkkl\") pod \"route-controller-manager-76f567b4cc-wx87k\" (UID: \"59cbb5c4-c743-47e1-8dc3-e4be5ddd3594\") " pod="openshift-route-controller-manager/route-controller-manager-76f567b4cc-wx87k" Jan 31 09:06:08 crc kubenswrapper[4732]: I0131 09:06:08.857486 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/de12e404-51e0-4b46-939c-3e4d4f9fbe13-client-ca\") pod \"controller-manager-5b9c9894ff-7ssvr\" (UID: \"de12e404-51e0-4b46-939c-3e4d4f9fbe13\") " pod="openshift-controller-manager/controller-manager-5b9c9894ff-7ssvr" Jan 31 09:06:08 crc kubenswrapper[4732]: I0131 09:06:08.858082 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/59cbb5c4-c743-47e1-8dc3-e4be5ddd3594-client-ca\") pod \"route-controller-manager-76f567b4cc-wx87k\" (UID: \"59cbb5c4-c743-47e1-8dc3-e4be5ddd3594\") " pod="openshift-route-controller-manager/route-controller-manager-76f567b4cc-wx87k" Jan 31 09:06:08 crc kubenswrapper[4732]: I0131 09:06:08.858195 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/de12e404-51e0-4b46-939c-3e4d4f9fbe13-config\") pod \"controller-manager-5b9c9894ff-7ssvr\" (UID: \"de12e404-51e0-4b46-939c-3e4d4f9fbe13\") " pod="openshift-controller-manager/controller-manager-5b9c9894ff-7ssvr" Jan 31 09:06:08 crc kubenswrapper[4732]: I0131 09:06:08.858585 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/de12e404-51e0-4b46-939c-3e4d4f9fbe13-proxy-ca-bundles\") pod \"controller-manager-5b9c9894ff-7ssvr\" (UID: \"de12e404-51e0-4b46-939c-3e4d4f9fbe13\") " pod="openshift-controller-manager/controller-manager-5b9c9894ff-7ssvr" Jan 31 09:06:08 crc kubenswrapper[4732]: I0131 09:06:08.858838 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59cbb5c4-c743-47e1-8dc3-e4be5ddd3594-config\") pod \"route-controller-manager-76f567b4cc-wx87k\" (UID: \"59cbb5c4-c743-47e1-8dc3-e4be5ddd3594\") " pod="openshift-route-controller-manager/route-controller-manager-76f567b4cc-wx87k" Jan 31 09:06:08 crc kubenswrapper[4732]: I0131 09:06:08.862679 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/de12e404-51e0-4b46-939c-3e4d4f9fbe13-serving-cert\") pod \"controller-manager-5b9c9894ff-7ssvr\" (UID: \"de12e404-51e0-4b46-939c-3e4d4f9fbe13\") " pod="openshift-controller-manager/controller-manager-5b9c9894ff-7ssvr" Jan 31 09:06:08 crc kubenswrapper[4732]: I0131 09:06:08.862752 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/59cbb5c4-c743-47e1-8dc3-e4be5ddd3594-serving-cert\") pod \"route-controller-manager-76f567b4cc-wx87k\" (UID: \"59cbb5c4-c743-47e1-8dc3-e4be5ddd3594\") " pod="openshift-route-controller-manager/route-controller-manager-76f567b4cc-wx87k" Jan 31 09:06:08 crc kubenswrapper[4732]: I0131 09:06:08.872653 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxxvp\" (UniqueName: \"kubernetes.io/projected/de12e404-51e0-4b46-939c-3e4d4f9fbe13-kube-api-access-vxxvp\") pod \"controller-manager-5b9c9894ff-7ssvr\" (UID: \"de12e404-51e0-4b46-939c-3e4d4f9fbe13\") " pod="openshift-controller-manager/controller-manager-5b9c9894ff-7ssvr" Jan 31 09:06:08 crc kubenswrapper[4732]: I0131 09:06:08.872872 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnkkl\" (UniqueName: \"kubernetes.io/projected/59cbb5c4-c743-47e1-8dc3-e4be5ddd3594-kube-api-access-vnkkl\") pod \"route-controller-manager-76f567b4cc-wx87k\" (UID: \"59cbb5c4-c743-47e1-8dc3-e4be5ddd3594\") " pod="openshift-route-controller-manager/route-controller-manager-76f567b4cc-wx87k" Jan 31 09:06:08 crc kubenswrapper[4732]: I0131 09:06:08.968808 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-76f567b4cc-wx87k" Jan 31 09:06:08 crc kubenswrapper[4732]: I0131 09:06:08.977836 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5b9c9894ff-7ssvr" Jan 31 09:06:09 crc kubenswrapper[4732]: I0131 09:06:09.220151 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5b9c9894ff-7ssvr"] Jan 31 09:06:09 crc kubenswrapper[4732]: I0131 09:06:09.267481 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76f567b4cc-wx87k"] Jan 31 09:06:09 crc kubenswrapper[4732]: W0131 09:06:09.273159 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59cbb5c4_c743_47e1_8dc3_e4be5ddd3594.slice/crio-66a60772a896bec1c7562b8662934cf79b67703e7fa04d7571b157999e2e5ce1 WatchSource:0}: Error finding container 66a60772a896bec1c7562b8662934cf79b67703e7fa04d7571b157999e2e5ce1: Status 404 returned error can't find the container with id 66a60772a896bec1c7562b8662934cf79b67703e7fa04d7571b157999e2e5ce1 Jan 31 09:06:09 crc kubenswrapper[4732]: I0131 09:06:09.580620 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-76f567b4cc-wx87k" event={"ID":"59cbb5c4-c743-47e1-8dc3-e4be5ddd3594","Type":"ContainerStarted","Data":"a79fc2d373557ec62d345e067f38d8a9c1ffa9f0c16b680a9e29b25ab06e7552"} Jan 31 09:06:09 crc kubenswrapper[4732]: I0131 09:06:09.581904 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-76f567b4cc-wx87k" event={"ID":"59cbb5c4-c743-47e1-8dc3-e4be5ddd3594","Type":"ContainerStarted","Data":"66a60772a896bec1c7562b8662934cf79b67703e7fa04d7571b157999e2e5ce1"} Jan 31 09:06:09 crc kubenswrapper[4732]: I0131 09:06:09.581990 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-76f567b4cc-wx87k" Jan 31 09:06:09 crc kubenswrapper[4732]: I0131 09:06:09.582898 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b9c9894ff-7ssvr" event={"ID":"de12e404-51e0-4b46-939c-3e4d4f9fbe13","Type":"ContainerStarted","Data":"37d826fc8b67119e9529dfec023e0e9d5aa68c4387172bcc00d095d536059056"} Jan 31 09:06:09 crc kubenswrapper[4732]: I0131 09:06:09.582997 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b9c9894ff-7ssvr" event={"ID":"de12e404-51e0-4b46-939c-3e4d4f9fbe13","Type":"ContainerStarted","Data":"26c9134abbeaeaee484b07f52184f5d710a6aba6bd8ddf53fc947ed96eeb5d71"} Jan 31 09:06:09 crc kubenswrapper[4732]: I0131 09:06:09.583843 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5b9c9894ff-7ssvr" Jan 31 09:06:09 crc kubenswrapper[4732]: I0131 09:06:09.597150 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5b9c9894ff-7ssvr" Jan 31 09:06:09 crc kubenswrapper[4732]: I0131 09:06:09.615463 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-76f567b4cc-wx87k" podStartSLOduration=2.615443365 podStartE2EDuration="2.615443365s" podCreationTimestamp="2026-01-31 09:06:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:06:09.613827401 +0000 UTC 
m=+307.919703615" watchObservedRunningTime="2026-01-31 09:06:09.615443365 +0000 UTC m=+307.921319569" Jan 31 09:06:09 crc kubenswrapper[4732]: I0131 09:06:09.632699 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5b9c9894ff-7ssvr" podStartSLOduration=2.632683686 podStartE2EDuration="2.632683686s" podCreationTimestamp="2026-01-31 09:06:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:06:09.63067442 +0000 UTC m=+307.936550624" watchObservedRunningTime="2026-01-31 09:06:09.632683686 +0000 UTC m=+307.938559890" Jan 31 09:06:09 crc kubenswrapper[4732]: I0131 09:06:09.947972 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-76f567b4cc-wx87k" Jan 31 09:06:11 crc kubenswrapper[4732]: I0131 09:06:11.584970 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5b9c9894ff-7ssvr"] Jan 31 09:06:11 crc kubenswrapper[4732]: I0131 09:06:11.595039 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76f567b4cc-wx87k"] Jan 31 09:06:12 crc kubenswrapper[4732]: I0131 09:06:12.611456 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-76f567b4cc-wx87k" podUID="59cbb5c4-c743-47e1-8dc3-e4be5ddd3594" containerName="route-controller-manager" containerID="cri-o://a79fc2d373557ec62d345e067f38d8a9c1ffa9f0c16b680a9e29b25ab06e7552" gracePeriod=30 Jan 31 09:06:12 crc kubenswrapper[4732]: I0131 09:06:12.612200 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5b9c9894ff-7ssvr" podUID="de12e404-51e0-4b46-939c-3e4d4f9fbe13" containerName="controller-manager" containerID="cri-o://37d826fc8b67119e9529dfec023e0e9d5aa68c4387172bcc00d095d536059056" gracePeriod=30 Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.029960 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-76f567b4cc-wx87k" Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.126047 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/59cbb5c4-c743-47e1-8dc3-e4be5ddd3594-client-ca\") pod \"59cbb5c4-c743-47e1-8dc3-e4be5ddd3594\" (UID: \"59cbb5c4-c743-47e1-8dc3-e4be5ddd3594\") " Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.126187 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vnkkl\" (UniqueName: \"kubernetes.io/projected/59cbb5c4-c743-47e1-8dc3-e4be5ddd3594-kube-api-access-vnkkl\") pod \"59cbb5c4-c743-47e1-8dc3-e4be5ddd3594\" (UID: \"59cbb5c4-c743-47e1-8dc3-e4be5ddd3594\") " Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.126218 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59cbb5c4-c743-47e1-8dc3-e4be5ddd3594-config\") pod \"59cbb5c4-c743-47e1-8dc3-e4be5ddd3594\" (UID: \"59cbb5c4-c743-47e1-8dc3-e4be5ddd3594\") " Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.126241 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/59cbb5c4-c743-47e1-8dc3-e4be5ddd3594-serving-cert\") pod \"59cbb5c4-c743-47e1-8dc3-e4be5ddd3594\" (UID: \"59cbb5c4-c743-47e1-8dc3-e4be5ddd3594\") " Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.127604 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59cbb5c4-c743-47e1-8dc3-e4be5ddd3594-client-ca" (OuterVolumeSpecName: "client-ca") pod "59cbb5c4-c743-47e1-8dc3-e4be5ddd3594" (UID: "59cbb5c4-c743-47e1-8dc3-e4be5ddd3594"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.127824 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59cbb5c4-c743-47e1-8dc3-e4be5ddd3594-config" (OuterVolumeSpecName: "config") pod "59cbb5c4-c743-47e1-8dc3-e4be5ddd3594" (UID: "59cbb5c4-c743-47e1-8dc3-e4be5ddd3594"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.133704 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59cbb5c4-c743-47e1-8dc3-e4be5ddd3594-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "59cbb5c4-c743-47e1-8dc3-e4be5ddd3594" (UID: "59cbb5c4-c743-47e1-8dc3-e4be5ddd3594"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.135862 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59cbb5c4-c743-47e1-8dc3-e4be5ddd3594-kube-api-access-vnkkl" (OuterVolumeSpecName: "kube-api-access-vnkkl") pod "59cbb5c4-c743-47e1-8dc3-e4be5ddd3594" (UID: "59cbb5c4-c743-47e1-8dc3-e4be5ddd3594"). InnerVolumeSpecName "kube-api-access-vnkkl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.164387 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5b9c9894ff-7ssvr" Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.227461 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/de12e404-51e0-4b46-939c-3e4d4f9fbe13-serving-cert\") pod \"de12e404-51e0-4b46-939c-3e4d4f9fbe13\" (UID: \"de12e404-51e0-4b46-939c-3e4d4f9fbe13\") " Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.227508 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vxxvp\" (UniqueName: \"kubernetes.io/projected/de12e404-51e0-4b46-939c-3e4d4f9fbe13-kube-api-access-vxxvp\") pod \"de12e404-51e0-4b46-939c-3e4d4f9fbe13\" (UID: \"de12e404-51e0-4b46-939c-3e4d4f9fbe13\") " Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.227534 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/de12e404-51e0-4b46-939c-3e4d4f9fbe13-client-ca\") pod \"de12e404-51e0-4b46-939c-3e4d4f9fbe13\" (UID: \"de12e404-51e0-4b46-939c-3e4d4f9fbe13\") " Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.227573 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de12e404-51e0-4b46-939c-3e4d4f9fbe13-config\") pod \"de12e404-51e0-4b46-939c-3e4d4f9fbe13\" (UID: \"de12e404-51e0-4b46-939c-3e4d4f9fbe13\") " Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.227756 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/de12e404-51e0-4b46-939c-3e4d4f9fbe13-proxy-ca-bundles\") pod \"de12e404-51e0-4b46-939c-3e4d4f9fbe13\" (UID: \"de12e404-51e0-4b46-939c-3e4d4f9fbe13\") " Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.228837 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vnkkl\" (UniqueName: \"kubernetes.io/projected/59cbb5c4-c743-47e1-8dc3-e4be5ddd3594-kube-api-access-vnkkl\") on node \"crc\" DevicePath \"\"" Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.229080 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59cbb5c4-c743-47e1-8dc3-e4be5ddd3594-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.229090 4732 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/59cbb5c4-c743-47e1-8dc3-e4be5ddd3594-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.229102 4732 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/59cbb5c4-c743-47e1-8dc3-e4be5ddd3594-client-ca\") on node \"crc\" DevicePath \"\"" Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.228272 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de12e404-51e0-4b46-939c-3e4d4f9fbe13-client-ca" (OuterVolumeSpecName: "client-ca") pod "de12e404-51e0-4b46-939c-3e4d4f9fbe13" (UID: "de12e404-51e0-4b46-939c-3e4d4f9fbe13"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.228299 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de12e404-51e0-4b46-939c-3e4d4f9fbe13-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "de12e404-51e0-4b46-939c-3e4d4f9fbe13" (UID: "de12e404-51e0-4b46-939c-3e4d4f9fbe13"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.228564 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de12e404-51e0-4b46-939c-3e4d4f9fbe13-config" (OuterVolumeSpecName: "config") pod "de12e404-51e0-4b46-939c-3e4d4f9fbe13" (UID: "de12e404-51e0-4b46-939c-3e4d4f9fbe13"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.230713 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de12e404-51e0-4b46-939c-3e4d4f9fbe13-kube-api-access-vxxvp" (OuterVolumeSpecName: "kube-api-access-vxxvp") pod "de12e404-51e0-4b46-939c-3e4d4f9fbe13" (UID: "de12e404-51e0-4b46-939c-3e4d4f9fbe13"). InnerVolumeSpecName "kube-api-access-vxxvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.230931 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de12e404-51e0-4b46-939c-3e4d4f9fbe13-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "de12e404-51e0-4b46-939c-3e4d4f9fbe13" (UID: "de12e404-51e0-4b46-939c-3e4d4f9fbe13"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.330648 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de12e404-51e0-4b46-939c-3e4d4f9fbe13-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.330703 4732 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/de12e404-51e0-4b46-939c-3e4d4f9fbe13-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.330716 4732 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/de12e404-51e0-4b46-939c-3e4d4f9fbe13-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.330727 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vxxvp\" (UniqueName: \"kubernetes.io/projected/de12e404-51e0-4b46-939c-3e4d4f9fbe13-kube-api-access-vxxvp\") on node \"crc\" DevicePath \"\"" Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.330743 4732 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/de12e404-51e0-4b46-939c-3e4d4f9fbe13-client-ca\") on node \"crc\" DevicePath \"\"" Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.621637 4732 generic.go:334] "Generic (PLEG): container finished" podID="59cbb5c4-c743-47e1-8dc3-e4be5ddd3594" containerID="a79fc2d373557ec62d345e067f38d8a9c1ffa9f0c16b680a9e29b25ab06e7552" exitCode=0 Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.621684 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-76f567b4cc-wx87k" Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.621706 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-76f567b4cc-wx87k" event={"ID":"59cbb5c4-c743-47e1-8dc3-e4be5ddd3594","Type":"ContainerDied","Data":"a79fc2d373557ec62d345e067f38d8a9c1ffa9f0c16b680a9e29b25ab06e7552"} Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.621956 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-76f567b4cc-wx87k" event={"ID":"59cbb5c4-c743-47e1-8dc3-e4be5ddd3594","Type":"ContainerDied","Data":"66a60772a896bec1c7562b8662934cf79b67703e7fa04d7571b157999e2e5ce1"} Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.622022 4732 scope.go:117] "RemoveContainer" containerID="a79fc2d373557ec62d345e067f38d8a9c1ffa9f0c16b680a9e29b25ab06e7552" Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.623957 4732 generic.go:334] "Generic (PLEG): container finished" podID="de12e404-51e0-4b46-939c-3e4d4f9fbe13" containerID="37d826fc8b67119e9529dfec023e0e9d5aa68c4387172bcc00d095d536059056" exitCode=0 Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.624003 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b9c9894ff-7ssvr" event={"ID":"de12e404-51e0-4b46-939c-3e4d4f9fbe13","Type":"ContainerDied","Data":"37d826fc8b67119e9529dfec023e0e9d5aa68c4387172bcc00d095d536059056"} Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.624039 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b9c9894ff-7ssvr" event={"ID":"de12e404-51e0-4b46-939c-3e4d4f9fbe13","Type":"ContainerDied","Data":"26c9134abbeaeaee484b07f52184f5d710a6aba6bd8ddf53fc947ed96eeb5d71"} Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.624887 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5b9c9894ff-7ssvr" Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.648517 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6685f4fd5b-k4npd"] Jan 31 09:06:13 crc kubenswrapper[4732]: E0131 09:06:13.648950 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de12e404-51e0-4b46-939c-3e4d4f9fbe13" containerName="controller-manager" Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.648973 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="de12e404-51e0-4b46-939c-3e4d4f9fbe13" containerName="controller-manager" Jan 31 09:06:13 crc kubenswrapper[4732]: E0131 09:06:13.648997 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59cbb5c4-c743-47e1-8dc3-e4be5ddd3594" containerName="route-controller-manager" Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.649007 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="59cbb5c4-c743-47e1-8dc3-e4be5ddd3594" containerName="route-controller-manager" Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.649483 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="de12e404-51e0-4b46-939c-3e4d4f9fbe13" containerName="controller-manager" Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.649509 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="59cbb5c4-c743-47e1-8dc3-e4be5ddd3594" containerName="route-controller-manager" Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.649980 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6685f4fd5b-k4npd" Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.654633 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.654726 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.654914 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.654633 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.656812 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.657859 4732 scope.go:117] "RemoveContainer" containerID="a79fc2d373557ec62d345e067f38d8a9c1ffa9f0c16b680a9e29b25ab06e7552" Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.659339 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.660438 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7568f5d7c4-7jqxn"] Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.666540 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7568f5d7c4-7jqxn" Jan 31 09:06:13 crc kubenswrapper[4732]: E0131 09:06:13.667594 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a79fc2d373557ec62d345e067f38d8a9c1ffa9f0c16b680a9e29b25ab06e7552\": container with ID starting with a79fc2d373557ec62d345e067f38d8a9c1ffa9f0c16b680a9e29b25ab06e7552 not found: ID does not exist" containerID="a79fc2d373557ec62d345e067f38d8a9c1ffa9f0c16b680a9e29b25ab06e7552" Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.667634 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a79fc2d373557ec62d345e067f38d8a9c1ffa9f0c16b680a9e29b25ab06e7552"} err="failed to get container status \"a79fc2d373557ec62d345e067f38d8a9c1ffa9f0c16b680a9e29b25ab06e7552\": rpc error: code = NotFound desc = could not find container \"a79fc2d373557ec62d345e067f38d8a9c1ffa9f0c16b680a9e29b25ab06e7552\": container with ID starting with a79fc2d373557ec62d345e067f38d8a9c1ffa9f0c16b680a9e29b25ab06e7552 not found: ID does not exist" Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.667702 4732 scope.go:117] "RemoveContainer" containerID="37d826fc8b67119e9529dfec023e0e9d5aa68c4387172bcc00d095d536059056" Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.670759 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.670977 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.673015 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.673352 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.684576 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.701529 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.701822 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.703638 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6685f4fd5b-k4npd"] Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.705539 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7568f5d7c4-7jqxn"] Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.718323 4732 scope.go:117] "RemoveContainer" containerID="37d826fc8b67119e9529dfec023e0e9d5aa68c4387172bcc00d095d536059056" Jan 31 09:06:13 crc kubenswrapper[4732]: E0131 09:06:13.718925 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37d826fc8b67119e9529dfec023e0e9d5aa68c4387172bcc00d095d536059056\": container with ID starting with 
37d826fc8b67119e9529dfec023e0e9d5aa68c4387172bcc00d095d536059056 not found: ID does not exist" containerID="37d826fc8b67119e9529dfec023e0e9d5aa68c4387172bcc00d095d536059056" Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.719080 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37d826fc8b67119e9529dfec023e0e9d5aa68c4387172bcc00d095d536059056"} err="failed to get container status \"37d826fc8b67119e9529dfec023e0e9d5aa68c4387172bcc00d095d536059056\": rpc error: code = NotFound desc = could not find container \"37d826fc8b67119e9529dfec023e0e9d5aa68c4387172bcc00d095d536059056\": container with ID starting with 37d826fc8b67119e9529dfec023e0e9d5aa68c4387172bcc00d095d536059056 not found: ID does not exist" Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.734991 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76f567b4cc-wx87k"] Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.736798 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/87d41f5c-8722-4115-b4b2-06493d6f18e2-client-ca\") pod \"route-controller-manager-6685f4fd5b-k4npd\" (UID: \"87d41f5c-8722-4115-b4b2-06493d6f18e2\") " pod="openshift-route-controller-manager/route-controller-manager-6685f4fd5b-k4npd" Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.736860 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/07f12b30-c71f-4cf5-88b2-06c78ce8243a-client-ca\") pod \"controller-manager-7568f5d7c4-7jqxn\" (UID: \"07f12b30-c71f-4cf5-88b2-06c78ce8243a\") " pod="openshift-controller-manager/controller-manager-7568f5d7c4-7jqxn" Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.736881 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/87d41f5c-8722-4115-b4b2-06493d6f18e2-serving-cert\") pod \"route-controller-manager-6685f4fd5b-k4npd\" (UID: \"87d41f5c-8722-4115-b4b2-06493d6f18e2\") " pod="openshift-route-controller-manager/route-controller-manager-6685f4fd5b-k4npd" Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.736905 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87d41f5c-8722-4115-b4b2-06493d6f18e2-config\") pod \"route-controller-manager-6685f4fd5b-k4npd\" (UID: \"87d41f5c-8722-4115-b4b2-06493d6f18e2\") " pod="openshift-route-controller-manager/route-controller-manager-6685f4fd5b-k4npd" Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.736921 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wz7s9\" (UniqueName: \"kubernetes.io/projected/07f12b30-c71f-4cf5-88b2-06c78ce8243a-kube-api-access-wz7s9\") pod \"controller-manager-7568f5d7c4-7jqxn\" (UID: \"07f12b30-c71f-4cf5-88b2-06c78ce8243a\") " pod="openshift-controller-manager/controller-manager-7568f5d7c4-7jqxn" Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.736946 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/07f12b30-c71f-4cf5-88b2-06c78ce8243a-proxy-ca-bundles\") pod \"controller-manager-7568f5d7c4-7jqxn\" (UID: 
\"07f12b30-c71f-4cf5-88b2-06c78ce8243a\") " pod="openshift-controller-manager/controller-manager-7568f5d7c4-7jqxn" Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.736968 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbm5f\" (UniqueName: \"kubernetes.io/projected/87d41f5c-8722-4115-b4b2-06493d6f18e2-kube-api-access-hbm5f\") pod \"route-controller-manager-6685f4fd5b-k4npd\" (UID: \"87d41f5c-8722-4115-b4b2-06493d6f18e2\") " pod="openshift-route-controller-manager/route-controller-manager-6685f4fd5b-k4npd" Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.736985 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07f12b30-c71f-4cf5-88b2-06c78ce8243a-config\") pod \"controller-manager-7568f5d7c4-7jqxn\" (UID: \"07f12b30-c71f-4cf5-88b2-06c78ce8243a\") " pod="openshift-controller-manager/controller-manager-7568f5d7c4-7jqxn" Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.737006 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07f12b30-c71f-4cf5-88b2-06c78ce8243a-serving-cert\") pod \"controller-manager-7568f5d7c4-7jqxn\" (UID: \"07f12b30-c71f-4cf5-88b2-06c78ce8243a\") " pod="openshift-controller-manager/controller-manager-7568f5d7c4-7jqxn" Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.739422 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76f567b4cc-wx87k"] Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.749448 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5b9c9894ff-7ssvr"] Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.753222 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5b9c9894ff-7ssvr"] Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.837650 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/07f12b30-c71f-4cf5-88b2-06c78ce8243a-client-ca\") pod \"controller-manager-7568f5d7c4-7jqxn\" (UID: \"07f12b30-c71f-4cf5-88b2-06c78ce8243a\") " pod="openshift-controller-manager/controller-manager-7568f5d7c4-7jqxn" Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.837705 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/87d41f5c-8722-4115-b4b2-06493d6f18e2-serving-cert\") pod \"route-controller-manager-6685f4fd5b-k4npd\" (UID: \"87d41f5c-8722-4115-b4b2-06493d6f18e2\") " pod="openshift-route-controller-manager/route-controller-manager-6685f4fd5b-k4npd" Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.837738 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wz7s9\" (UniqueName: \"kubernetes.io/projected/07f12b30-c71f-4cf5-88b2-06c78ce8243a-kube-api-access-wz7s9\") pod \"controller-manager-7568f5d7c4-7jqxn\" (UID: \"07f12b30-c71f-4cf5-88b2-06c78ce8243a\") " pod="openshift-controller-manager/controller-manager-7568f5d7c4-7jqxn" Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.837756 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87d41f5c-8722-4115-b4b2-06493d6f18e2-config\") pod 
\"route-controller-manager-6685f4fd5b-k4npd\" (UID: \"87d41f5c-8722-4115-b4b2-06493d6f18e2\") " pod="openshift-route-controller-manager/route-controller-manager-6685f4fd5b-k4npd" Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.837779 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/07f12b30-c71f-4cf5-88b2-06c78ce8243a-proxy-ca-bundles\") pod \"controller-manager-7568f5d7c4-7jqxn\" (UID: \"07f12b30-c71f-4cf5-88b2-06c78ce8243a\") " pod="openshift-controller-manager/controller-manager-7568f5d7c4-7jqxn" Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.837798 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbm5f\" (UniqueName: \"kubernetes.io/projected/87d41f5c-8722-4115-b4b2-06493d6f18e2-kube-api-access-hbm5f\") pod \"route-controller-manager-6685f4fd5b-k4npd\" (UID: \"87d41f5c-8722-4115-b4b2-06493d6f18e2\") " pod="openshift-route-controller-manager/route-controller-manager-6685f4fd5b-k4npd" Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.837817 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07f12b30-c71f-4cf5-88b2-06c78ce8243a-config\") pod \"controller-manager-7568f5d7c4-7jqxn\" (UID: \"07f12b30-c71f-4cf5-88b2-06c78ce8243a\") " pod="openshift-controller-manager/controller-manager-7568f5d7c4-7jqxn" Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.837835 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07f12b30-c71f-4cf5-88b2-06c78ce8243a-serving-cert\") pod \"controller-manager-7568f5d7c4-7jqxn\" (UID: \"07f12b30-c71f-4cf5-88b2-06c78ce8243a\") " pod="openshift-controller-manager/controller-manager-7568f5d7c4-7jqxn" Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.837866 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/87d41f5c-8722-4115-b4b2-06493d6f18e2-client-ca\") pod \"route-controller-manager-6685f4fd5b-k4npd\" (UID: \"87d41f5c-8722-4115-b4b2-06493d6f18e2\") " pod="openshift-route-controller-manager/route-controller-manager-6685f4fd5b-k4npd" Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.839161 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/87d41f5c-8722-4115-b4b2-06493d6f18e2-client-ca\") pod \"route-controller-manager-6685f4fd5b-k4npd\" (UID: \"87d41f5c-8722-4115-b4b2-06493d6f18e2\") " pod="openshift-route-controller-manager/route-controller-manager-6685f4fd5b-k4npd" Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.839865 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87d41f5c-8722-4115-b4b2-06493d6f18e2-config\") pod \"route-controller-manager-6685f4fd5b-k4npd\" (UID: \"87d41f5c-8722-4115-b4b2-06493d6f18e2\") " pod="openshift-route-controller-manager/route-controller-manager-6685f4fd5b-k4npd" Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.839870 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/07f12b30-c71f-4cf5-88b2-06c78ce8243a-proxy-ca-bundles\") pod \"controller-manager-7568f5d7c4-7jqxn\" (UID: \"07f12b30-c71f-4cf5-88b2-06c78ce8243a\") " 
pod="openshift-controller-manager/controller-manager-7568f5d7c4-7jqxn" Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.840018 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07f12b30-c71f-4cf5-88b2-06c78ce8243a-config\") pod \"controller-manager-7568f5d7c4-7jqxn\" (UID: \"07f12b30-c71f-4cf5-88b2-06c78ce8243a\") " pod="openshift-controller-manager/controller-manager-7568f5d7c4-7jqxn" Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.840329 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/07f12b30-c71f-4cf5-88b2-06c78ce8243a-client-ca\") pod \"controller-manager-7568f5d7c4-7jqxn\" (UID: \"07f12b30-c71f-4cf5-88b2-06c78ce8243a\") " pod="openshift-controller-manager/controller-manager-7568f5d7c4-7jqxn" Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.844880 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07f12b30-c71f-4cf5-88b2-06c78ce8243a-serving-cert\") pod \"controller-manager-7568f5d7c4-7jqxn\" (UID: \"07f12b30-c71f-4cf5-88b2-06c78ce8243a\") " pod="openshift-controller-manager/controller-manager-7568f5d7c4-7jqxn" Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.847583 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/87d41f5c-8722-4115-b4b2-06493d6f18e2-serving-cert\") pod \"route-controller-manager-6685f4fd5b-k4npd\" (UID: \"87d41f5c-8722-4115-b4b2-06493d6f18e2\") " pod="openshift-route-controller-manager/route-controller-manager-6685f4fd5b-k4npd" Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.865265 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wz7s9\" (UniqueName: \"kubernetes.io/projected/07f12b30-c71f-4cf5-88b2-06c78ce8243a-kube-api-access-wz7s9\") pod \"controller-manager-7568f5d7c4-7jqxn\" (UID: \"07f12b30-c71f-4cf5-88b2-06c78ce8243a\") " pod="openshift-controller-manager/controller-manager-7568f5d7c4-7jqxn" Jan 31 09:06:13 crc kubenswrapper[4732]: I0131 09:06:13.865753 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbm5f\" (UniqueName: \"kubernetes.io/projected/87d41f5c-8722-4115-b4b2-06493d6f18e2-kube-api-access-hbm5f\") pod \"route-controller-manager-6685f4fd5b-k4npd\" (UID: \"87d41f5c-8722-4115-b4b2-06493d6f18e2\") " pod="openshift-route-controller-manager/route-controller-manager-6685f4fd5b-k4npd" Jan 31 09:06:14 crc kubenswrapper[4732]: I0131 09:06:14.008761 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6685f4fd5b-k4npd" Jan 31 09:06:14 crc kubenswrapper[4732]: I0131 09:06:14.030536 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7568f5d7c4-7jqxn" Jan 31 09:06:14 crc kubenswrapper[4732]: I0131 09:06:14.247454 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6685f4fd5b-k4npd"] Jan 31 09:06:14 crc kubenswrapper[4732]: I0131 09:06:14.347731 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7568f5d7c4-7jqxn"] Jan 31 09:06:14 crc kubenswrapper[4732]: I0131 09:06:14.548840 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59cbb5c4-c743-47e1-8dc3-e4be5ddd3594" path="/var/lib/kubelet/pods/59cbb5c4-c743-47e1-8dc3-e4be5ddd3594/volumes" Jan 31 09:06:14 crc kubenswrapper[4732]: I0131 09:06:14.550010 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de12e404-51e0-4b46-939c-3e4d4f9fbe13" path="/var/lib/kubelet/pods/de12e404-51e0-4b46-939c-3e4d4f9fbe13/volumes" Jan 31 09:06:14 crc kubenswrapper[4732]: I0131 09:06:14.631193 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6685f4fd5b-k4npd" event={"ID":"87d41f5c-8722-4115-b4b2-06493d6f18e2","Type":"ContainerStarted","Data":"20b470d8cff837870ae5ee797e7d03af879664119847524316640e3d39a1bbb8"} Jan 31 09:06:14 crc kubenswrapper[4732]: I0131 09:06:14.631248 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6685f4fd5b-k4npd" event={"ID":"87d41f5c-8722-4115-b4b2-06493d6f18e2","Type":"ContainerStarted","Data":"827277a6a635c6307caf7bd68b15efb68e2fe5da6e8b51a1e5a77e41d83b1851"} Jan 31 09:06:14 crc kubenswrapper[4732]: I0131 09:06:14.631429 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6685f4fd5b-k4npd" Jan 31 09:06:14 crc kubenswrapper[4732]: I0131 09:06:14.643702 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7568f5d7c4-7jqxn" event={"ID":"07f12b30-c71f-4cf5-88b2-06c78ce8243a","Type":"ContainerStarted","Data":"46b96dd899c9517fcc86481d9a15637e4e6eede11894fb27964aa47c819db70c"} Jan 31 09:06:14 crc kubenswrapper[4732]: I0131 09:06:14.643894 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7568f5d7c4-7jqxn" event={"ID":"07f12b30-c71f-4cf5-88b2-06c78ce8243a","Type":"ContainerStarted","Data":"89c7a5bad7e33eac41492a051d188754a5f8894d0eb4a7eb8aeaa6a70e8cf9f2"} Jan 31 09:06:14 crc kubenswrapper[4732]: I0131 09:06:14.644837 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7568f5d7c4-7jqxn" Jan 31 09:06:14 crc kubenswrapper[4732]: I0131 09:06:14.649881 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7568f5d7c4-7jqxn" Jan 31 09:06:14 crc kubenswrapper[4732]: I0131 09:06:14.656094 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6685f4fd5b-k4npd" podStartSLOduration=2.656072475 podStartE2EDuration="2.656072475s" podCreationTimestamp="2026-01-31 09:06:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:06:14.65108565 +0000 UTC m=+312.956961854" 
watchObservedRunningTime="2026-01-31 09:06:14.656072475 +0000 UTC m=+312.961948679" Jan 31 09:06:14 crc kubenswrapper[4732]: I0131 09:06:14.667496 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7568f5d7c4-7jqxn" podStartSLOduration=2.667480383 podStartE2EDuration="2.667480383s" podCreationTimestamp="2026-01-31 09:06:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:06:14.664975951 +0000 UTC m=+312.970852155" watchObservedRunningTime="2026-01-31 09:06:14.667480383 +0000 UTC m=+312.973356587" Jan 31 09:06:14 crc kubenswrapper[4732]: I0131 09:06:14.862552 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6685f4fd5b-k4npd" Jan 31 09:06:21 crc kubenswrapper[4732]: I0131 09:06:21.140982 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-tr5nx"] Jan 31 09:06:21 crc kubenswrapper[4732]: I0131 09:06:21.143197 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-tr5nx" Jan 31 09:06:21 crc kubenswrapper[4732]: I0131 09:06:21.155696 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-tr5nx"] Jan 31 09:06:21 crc kubenswrapper[4732]: I0131 09:06:21.245104 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/819ce8a1-8a1b-4e4d-af2a-dbf4f2b30e0b-registry-certificates\") pod \"image-registry-66df7c8f76-tr5nx\" (UID: \"819ce8a1-8a1b-4e4d-af2a-dbf4f2b30e0b\") " pod="openshift-image-registry/image-registry-66df7c8f76-tr5nx" Jan 31 09:06:21 crc kubenswrapper[4732]: I0131 09:06:21.245175 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnjrv\" (UniqueName: \"kubernetes.io/projected/819ce8a1-8a1b-4e4d-af2a-dbf4f2b30e0b-kube-api-access-xnjrv\") pod \"image-registry-66df7c8f76-tr5nx\" (UID: \"819ce8a1-8a1b-4e4d-af2a-dbf4f2b30e0b\") " pod="openshift-image-registry/image-registry-66df7c8f76-tr5nx" Jan 31 09:06:21 crc kubenswrapper[4732]: I0131 09:06:21.245227 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/819ce8a1-8a1b-4e4d-af2a-dbf4f2b30e0b-installation-pull-secrets\") pod \"image-registry-66df7c8f76-tr5nx\" (UID: \"819ce8a1-8a1b-4e4d-af2a-dbf4f2b30e0b\") " pod="openshift-image-registry/image-registry-66df7c8f76-tr5nx" Jan 31 09:06:21 crc kubenswrapper[4732]: I0131 09:06:21.245261 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-tr5nx\" (UID: \"819ce8a1-8a1b-4e4d-af2a-dbf4f2b30e0b\") " pod="openshift-image-registry/image-registry-66df7c8f76-tr5nx" Jan 31 09:06:21 crc kubenswrapper[4732]: I0131 09:06:21.245291 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/819ce8a1-8a1b-4e4d-af2a-dbf4f2b30e0b-registry-tls\") pod 
\"image-registry-66df7c8f76-tr5nx\" (UID: \"819ce8a1-8a1b-4e4d-af2a-dbf4f2b30e0b\") " pod="openshift-image-registry/image-registry-66df7c8f76-tr5nx" Jan 31 09:06:21 crc kubenswrapper[4732]: I0131 09:06:21.245746 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/819ce8a1-8a1b-4e4d-af2a-dbf4f2b30e0b-bound-sa-token\") pod \"image-registry-66df7c8f76-tr5nx\" (UID: \"819ce8a1-8a1b-4e4d-af2a-dbf4f2b30e0b\") " pod="openshift-image-registry/image-registry-66df7c8f76-tr5nx" Jan 31 09:06:21 crc kubenswrapper[4732]: I0131 09:06:21.245848 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/819ce8a1-8a1b-4e4d-af2a-dbf4f2b30e0b-ca-trust-extracted\") pod \"image-registry-66df7c8f76-tr5nx\" (UID: \"819ce8a1-8a1b-4e4d-af2a-dbf4f2b30e0b\") " pod="openshift-image-registry/image-registry-66df7c8f76-tr5nx" Jan 31 09:06:21 crc kubenswrapper[4732]: I0131 09:06:21.245915 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/819ce8a1-8a1b-4e4d-af2a-dbf4f2b30e0b-trusted-ca\") pod \"image-registry-66df7c8f76-tr5nx\" (UID: \"819ce8a1-8a1b-4e4d-af2a-dbf4f2b30e0b\") " pod="openshift-image-registry/image-registry-66df7c8f76-tr5nx" Jan 31 09:06:21 crc kubenswrapper[4732]: I0131 09:06:21.269159 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-tr5nx\" (UID: \"819ce8a1-8a1b-4e4d-af2a-dbf4f2b30e0b\") " pod="openshift-image-registry/image-registry-66df7c8f76-tr5nx" Jan 31 09:06:21 crc kubenswrapper[4732]: I0131 09:06:21.347225 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/819ce8a1-8a1b-4e4d-af2a-dbf4f2b30e0b-bound-sa-token\") pod \"image-registry-66df7c8f76-tr5nx\" (UID: \"819ce8a1-8a1b-4e4d-af2a-dbf4f2b30e0b\") " pod="openshift-image-registry/image-registry-66df7c8f76-tr5nx" Jan 31 09:06:21 crc kubenswrapper[4732]: I0131 09:06:21.347293 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/819ce8a1-8a1b-4e4d-af2a-dbf4f2b30e0b-ca-trust-extracted\") pod \"image-registry-66df7c8f76-tr5nx\" (UID: \"819ce8a1-8a1b-4e4d-af2a-dbf4f2b30e0b\") " pod="openshift-image-registry/image-registry-66df7c8f76-tr5nx" Jan 31 09:06:21 crc kubenswrapper[4732]: I0131 09:06:21.347328 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/819ce8a1-8a1b-4e4d-af2a-dbf4f2b30e0b-trusted-ca\") pod \"image-registry-66df7c8f76-tr5nx\" (UID: \"819ce8a1-8a1b-4e4d-af2a-dbf4f2b30e0b\") " pod="openshift-image-registry/image-registry-66df7c8f76-tr5nx" Jan 31 09:06:21 crc kubenswrapper[4732]: I0131 09:06:21.347352 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/819ce8a1-8a1b-4e4d-af2a-dbf4f2b30e0b-registry-certificates\") pod \"image-registry-66df7c8f76-tr5nx\" (UID: \"819ce8a1-8a1b-4e4d-af2a-dbf4f2b30e0b\") " pod="openshift-image-registry/image-registry-66df7c8f76-tr5nx" Jan 31 09:06:21 crc 
kubenswrapper[4732]: I0131 09:06:21.347388 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xnjrv\" (UniqueName: \"kubernetes.io/projected/819ce8a1-8a1b-4e4d-af2a-dbf4f2b30e0b-kube-api-access-xnjrv\") pod \"image-registry-66df7c8f76-tr5nx\" (UID: \"819ce8a1-8a1b-4e4d-af2a-dbf4f2b30e0b\") " pod="openshift-image-registry/image-registry-66df7c8f76-tr5nx" Jan 31 09:06:21 crc kubenswrapper[4732]: I0131 09:06:21.347421 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/819ce8a1-8a1b-4e4d-af2a-dbf4f2b30e0b-installation-pull-secrets\") pod \"image-registry-66df7c8f76-tr5nx\" (UID: \"819ce8a1-8a1b-4e4d-af2a-dbf4f2b30e0b\") " pod="openshift-image-registry/image-registry-66df7c8f76-tr5nx" Jan 31 09:06:21 crc kubenswrapper[4732]: I0131 09:06:21.347454 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/819ce8a1-8a1b-4e4d-af2a-dbf4f2b30e0b-registry-tls\") pod \"image-registry-66df7c8f76-tr5nx\" (UID: \"819ce8a1-8a1b-4e4d-af2a-dbf4f2b30e0b\") " pod="openshift-image-registry/image-registry-66df7c8f76-tr5nx" Jan 31 09:06:21 crc kubenswrapper[4732]: I0131 09:06:21.348206 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/819ce8a1-8a1b-4e4d-af2a-dbf4f2b30e0b-ca-trust-extracted\") pod \"image-registry-66df7c8f76-tr5nx\" (UID: \"819ce8a1-8a1b-4e4d-af2a-dbf4f2b30e0b\") " pod="openshift-image-registry/image-registry-66df7c8f76-tr5nx" Jan 31 09:06:21 crc kubenswrapper[4732]: I0131 09:06:21.348951 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/819ce8a1-8a1b-4e4d-af2a-dbf4f2b30e0b-registry-certificates\") pod \"image-registry-66df7c8f76-tr5nx\" (UID: \"819ce8a1-8a1b-4e4d-af2a-dbf4f2b30e0b\") " pod="openshift-image-registry/image-registry-66df7c8f76-tr5nx" Jan 31 09:06:21 crc kubenswrapper[4732]: I0131 09:06:21.349142 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/819ce8a1-8a1b-4e4d-af2a-dbf4f2b30e0b-trusted-ca\") pod \"image-registry-66df7c8f76-tr5nx\" (UID: \"819ce8a1-8a1b-4e4d-af2a-dbf4f2b30e0b\") " pod="openshift-image-registry/image-registry-66df7c8f76-tr5nx" Jan 31 09:06:21 crc kubenswrapper[4732]: I0131 09:06:21.354334 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/819ce8a1-8a1b-4e4d-af2a-dbf4f2b30e0b-installation-pull-secrets\") pod \"image-registry-66df7c8f76-tr5nx\" (UID: \"819ce8a1-8a1b-4e4d-af2a-dbf4f2b30e0b\") " pod="openshift-image-registry/image-registry-66df7c8f76-tr5nx" Jan 31 09:06:21 crc kubenswrapper[4732]: I0131 09:06:21.357202 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/819ce8a1-8a1b-4e4d-af2a-dbf4f2b30e0b-registry-tls\") pod \"image-registry-66df7c8f76-tr5nx\" (UID: \"819ce8a1-8a1b-4e4d-af2a-dbf4f2b30e0b\") " pod="openshift-image-registry/image-registry-66df7c8f76-tr5nx" Jan 31 09:06:21 crc kubenswrapper[4732]: I0131 09:06:21.365224 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnjrv\" (UniqueName: \"kubernetes.io/projected/819ce8a1-8a1b-4e4d-af2a-dbf4f2b30e0b-kube-api-access-xnjrv\") pod 
\"image-registry-66df7c8f76-tr5nx\" (UID: \"819ce8a1-8a1b-4e4d-af2a-dbf4f2b30e0b\") " pod="openshift-image-registry/image-registry-66df7c8f76-tr5nx" Jan 31 09:06:21 crc kubenswrapper[4732]: I0131 09:06:21.366738 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/819ce8a1-8a1b-4e4d-af2a-dbf4f2b30e0b-bound-sa-token\") pod \"image-registry-66df7c8f76-tr5nx\" (UID: \"819ce8a1-8a1b-4e4d-af2a-dbf4f2b30e0b\") " pod="openshift-image-registry/image-registry-66df7c8f76-tr5nx" Jan 31 09:06:21 crc kubenswrapper[4732]: I0131 09:06:21.499091 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-tr5nx" Jan 31 09:06:21 crc kubenswrapper[4732]: I0131 09:06:21.912103 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-tr5nx"] Jan 31 09:06:22 crc kubenswrapper[4732]: I0131 09:06:22.688193 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-tr5nx" event={"ID":"819ce8a1-8a1b-4e4d-af2a-dbf4f2b30e0b","Type":"ContainerStarted","Data":"e173d17a098c3393888c2a9997cdb0ed89bf1e06903224299483c05122bf1637"} Jan 31 09:06:22 crc kubenswrapper[4732]: I0131 09:06:22.688580 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-tr5nx" event={"ID":"819ce8a1-8a1b-4e4d-af2a-dbf4f2b30e0b","Type":"ContainerStarted","Data":"94ba3fc88647739e6f295d286d60a1d1ef59ffd73b743387bfa28e77b020de4b"} Jan 31 09:06:22 crc kubenswrapper[4732]: I0131 09:06:22.688635 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-tr5nx" Jan 31 09:06:22 crc kubenswrapper[4732]: I0131 09:06:22.711590 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-tr5nx" podStartSLOduration=1.711561454 podStartE2EDuration="1.711561454s" podCreationTimestamp="2026-01-31 09:06:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:06:22.711274294 +0000 UTC m=+321.017150508" watchObservedRunningTime="2026-01-31 09:06:22.711561454 +0000 UTC m=+321.017437658" Jan 31 09:06:27 crc kubenswrapper[4732]: I0131 09:06:27.090648 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7568f5d7c4-7jqxn"] Jan 31 09:06:27 crc kubenswrapper[4732]: I0131 09:06:27.091183 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7568f5d7c4-7jqxn" podUID="07f12b30-c71f-4cf5-88b2-06c78ce8243a" containerName="controller-manager" containerID="cri-o://46b96dd899c9517fcc86481d9a15637e4e6eede11894fb27964aa47c819db70c" gracePeriod=30 Jan 31 09:06:27 crc kubenswrapper[4732]: I0131 09:06:27.594725 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7568f5d7c4-7jqxn" Jan 31 09:06:27 crc kubenswrapper[4732]: I0131 09:06:27.631016 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/07f12b30-c71f-4cf5-88b2-06c78ce8243a-client-ca\") pod \"07f12b30-c71f-4cf5-88b2-06c78ce8243a\" (UID: \"07f12b30-c71f-4cf5-88b2-06c78ce8243a\") " Jan 31 09:06:27 crc kubenswrapper[4732]: I0131 09:06:27.631058 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07f12b30-c71f-4cf5-88b2-06c78ce8243a-config\") pod \"07f12b30-c71f-4cf5-88b2-06c78ce8243a\" (UID: \"07f12b30-c71f-4cf5-88b2-06c78ce8243a\") " Jan 31 09:06:27 crc kubenswrapper[4732]: I0131 09:06:27.631186 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/07f12b30-c71f-4cf5-88b2-06c78ce8243a-proxy-ca-bundles\") pod \"07f12b30-c71f-4cf5-88b2-06c78ce8243a\" (UID: \"07f12b30-c71f-4cf5-88b2-06c78ce8243a\") " Jan 31 09:06:27 crc kubenswrapper[4732]: I0131 09:06:27.631222 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wz7s9\" (UniqueName: \"kubernetes.io/projected/07f12b30-c71f-4cf5-88b2-06c78ce8243a-kube-api-access-wz7s9\") pod \"07f12b30-c71f-4cf5-88b2-06c78ce8243a\" (UID: \"07f12b30-c71f-4cf5-88b2-06c78ce8243a\") " Jan 31 09:06:27 crc kubenswrapper[4732]: I0131 09:06:27.631250 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07f12b30-c71f-4cf5-88b2-06c78ce8243a-serving-cert\") pod \"07f12b30-c71f-4cf5-88b2-06c78ce8243a\" (UID: \"07f12b30-c71f-4cf5-88b2-06c78ce8243a\") " Jan 31 09:06:27 crc kubenswrapper[4732]: I0131 09:06:27.633194 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07f12b30-c71f-4cf5-88b2-06c78ce8243a-config" (OuterVolumeSpecName: "config") pod "07f12b30-c71f-4cf5-88b2-06c78ce8243a" (UID: "07f12b30-c71f-4cf5-88b2-06c78ce8243a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:06:27 crc kubenswrapper[4732]: I0131 09:06:27.633574 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07f12b30-c71f-4cf5-88b2-06c78ce8243a-client-ca" (OuterVolumeSpecName: "client-ca") pod "07f12b30-c71f-4cf5-88b2-06c78ce8243a" (UID: "07f12b30-c71f-4cf5-88b2-06c78ce8243a"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:06:27 crc kubenswrapper[4732]: I0131 09:06:27.634182 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07f12b30-c71f-4cf5-88b2-06c78ce8243a-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "07f12b30-c71f-4cf5-88b2-06c78ce8243a" (UID: "07f12b30-c71f-4cf5-88b2-06c78ce8243a"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:06:27 crc kubenswrapper[4732]: I0131 09:06:27.638348 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07f12b30-c71f-4cf5-88b2-06c78ce8243a-kube-api-access-wz7s9" (OuterVolumeSpecName: "kube-api-access-wz7s9") pod "07f12b30-c71f-4cf5-88b2-06c78ce8243a" (UID: "07f12b30-c71f-4cf5-88b2-06c78ce8243a"). InnerVolumeSpecName "kube-api-access-wz7s9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:06:27 crc kubenswrapper[4732]: I0131 09:06:27.640160 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07f12b30-c71f-4cf5-88b2-06c78ce8243a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "07f12b30-c71f-4cf5-88b2-06c78ce8243a" (UID: "07f12b30-c71f-4cf5-88b2-06c78ce8243a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:06:27 crc kubenswrapper[4732]: I0131 09:06:27.725371 4732 generic.go:334] "Generic (PLEG): container finished" podID="07f12b30-c71f-4cf5-88b2-06c78ce8243a" containerID="46b96dd899c9517fcc86481d9a15637e4e6eede11894fb27964aa47c819db70c" exitCode=0 Jan 31 09:06:27 crc kubenswrapper[4732]: I0131 09:06:27.725419 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7568f5d7c4-7jqxn" event={"ID":"07f12b30-c71f-4cf5-88b2-06c78ce8243a","Type":"ContainerDied","Data":"46b96dd899c9517fcc86481d9a15637e4e6eede11894fb27964aa47c819db70c"} Jan 31 09:06:27 crc kubenswrapper[4732]: I0131 09:06:27.725458 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7568f5d7c4-7jqxn" event={"ID":"07f12b30-c71f-4cf5-88b2-06c78ce8243a","Type":"ContainerDied","Data":"89c7a5bad7e33eac41492a051d188754a5f8894d0eb4a7eb8aeaa6a70e8cf9f2"} Jan 31 09:06:27 crc kubenswrapper[4732]: I0131 09:06:27.725477 4732 scope.go:117] "RemoveContainer" containerID="46b96dd899c9517fcc86481d9a15637e4e6eede11894fb27964aa47c819db70c" Jan 31 09:06:27 crc kubenswrapper[4732]: I0131 09:06:27.725485 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7568f5d7c4-7jqxn" Jan 31 09:06:27 crc kubenswrapper[4732]: I0131 09:06:27.734889 4732 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/07f12b30-c71f-4cf5-88b2-06c78ce8243a-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 31 09:06:27 crc kubenswrapper[4732]: I0131 09:06:27.735082 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wz7s9\" (UniqueName: \"kubernetes.io/projected/07f12b30-c71f-4cf5-88b2-06c78ce8243a-kube-api-access-wz7s9\") on node \"crc\" DevicePath \"\"" Jan 31 09:06:27 crc kubenswrapper[4732]: I0131 09:06:27.735142 4732 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07f12b30-c71f-4cf5-88b2-06c78ce8243a-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:06:27 crc kubenswrapper[4732]: I0131 09:06:27.735191 4732 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/07f12b30-c71f-4cf5-88b2-06c78ce8243a-client-ca\") on node \"crc\" DevicePath \"\"" Jan 31 09:06:27 crc kubenswrapper[4732]: I0131 09:06:27.735241 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07f12b30-c71f-4cf5-88b2-06c78ce8243a-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:06:27 crc kubenswrapper[4732]: I0131 09:06:27.742530 4732 scope.go:117] "RemoveContainer" containerID="46b96dd899c9517fcc86481d9a15637e4e6eede11894fb27964aa47c819db70c" Jan 31 09:06:27 crc kubenswrapper[4732]: E0131 09:06:27.742917 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"46b96dd899c9517fcc86481d9a15637e4e6eede11894fb27964aa47c819db70c\": container with ID starting with 46b96dd899c9517fcc86481d9a15637e4e6eede11894fb27964aa47c819db70c not found: ID does not exist" containerID="46b96dd899c9517fcc86481d9a15637e4e6eede11894fb27964aa47c819db70c" Jan 31 09:06:27 crc kubenswrapper[4732]: I0131 09:06:27.742956 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46b96dd899c9517fcc86481d9a15637e4e6eede11894fb27964aa47c819db70c"} err="failed to get container status \"46b96dd899c9517fcc86481d9a15637e4e6eede11894fb27964aa47c819db70c\": rpc error: code = NotFound desc = could not find container \"46b96dd899c9517fcc86481d9a15637e4e6eede11894fb27964aa47c819db70c\": container with ID starting with 46b96dd899c9517fcc86481d9a15637e4e6eede11894fb27964aa47c819db70c not found: ID does not exist" Jan 31 09:06:27 crc kubenswrapper[4732]: I0131 09:06:27.785563 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7568f5d7c4-7jqxn"] Jan 31 09:06:27 crc kubenswrapper[4732]: I0131 09:06:27.793001 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7568f5d7c4-7jqxn"] Jan 31 09:06:28 crc kubenswrapper[4732]: I0131 09:06:28.551179 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07f12b30-c71f-4cf5-88b2-06c78ce8243a" path="/var/lib/kubelet/pods/07f12b30-c71f-4cf5-88b2-06c78ce8243a/volumes" Jan 31 09:06:28 crc kubenswrapper[4732]: I0131 09:06:28.649414 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-77cc4f456b-9m7qc"] Jan 31 09:06:28 crc kubenswrapper[4732]: E0131 09:06:28.649687 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07f12b30-c71f-4cf5-88b2-06c78ce8243a" containerName="controller-manager" Jan 31 09:06:28 crc kubenswrapper[4732]: I0131 09:06:28.649716 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="07f12b30-c71f-4cf5-88b2-06c78ce8243a" containerName="controller-manager" Jan 31 09:06:28 crc kubenswrapper[4732]: I0131 09:06:28.649834 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="07f12b30-c71f-4cf5-88b2-06c78ce8243a" containerName="controller-manager" Jan 31 09:06:28 crc kubenswrapper[4732]: I0131 09:06:28.650252 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-77cc4f456b-9m7qc" Jan 31 09:06:28 crc kubenswrapper[4732]: I0131 09:06:28.657523 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 31 09:06:28 crc kubenswrapper[4732]: I0131 09:06:28.657708 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 31 09:06:28 crc kubenswrapper[4732]: I0131 09:06:28.657807 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 31 09:06:28 crc kubenswrapper[4732]: I0131 09:06:28.658056 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 31 09:06:28 crc kubenswrapper[4732]: I0131 09:06:28.658282 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 31 09:06:28 crc kubenswrapper[4732]: I0131 09:06:28.658542 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 31 09:06:28 crc kubenswrapper[4732]: I0131 09:06:28.665472 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 31 09:06:28 crc kubenswrapper[4732]: I0131 09:06:28.665922 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-77cc4f456b-9m7qc"] Jan 31 09:06:28 crc kubenswrapper[4732]: I0131 09:06:28.748584 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5be9f2ca-96a2-4e31-8ee7-6848e08c1833-serving-cert\") pod \"controller-manager-77cc4f456b-9m7qc\" (UID: \"5be9f2ca-96a2-4e31-8ee7-6848e08c1833\") " pod="openshift-controller-manager/controller-manager-77cc4f456b-9m7qc" Jan 31 09:06:28 crc kubenswrapper[4732]: I0131 09:06:28.748638 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5be9f2ca-96a2-4e31-8ee7-6848e08c1833-client-ca\") pod \"controller-manager-77cc4f456b-9m7qc\" (UID: \"5be9f2ca-96a2-4e31-8ee7-6848e08c1833\") " pod="openshift-controller-manager/controller-manager-77cc4f456b-9m7qc" Jan 31 09:06:28 crc kubenswrapper[4732]: I0131 09:06:28.748704 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5be9f2ca-96a2-4e31-8ee7-6848e08c1833-proxy-ca-bundles\") pod \"controller-manager-77cc4f456b-9m7qc\" (UID: \"5be9f2ca-96a2-4e31-8ee7-6848e08c1833\") " pod="openshift-controller-manager/controller-manager-77cc4f456b-9m7qc" Jan 31 09:06:28 crc kubenswrapper[4732]: I0131 09:06:28.748752 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwxs6\" (UniqueName: \"kubernetes.io/projected/5be9f2ca-96a2-4e31-8ee7-6848e08c1833-kube-api-access-qwxs6\") pod \"controller-manager-77cc4f456b-9m7qc\" (UID: \"5be9f2ca-96a2-4e31-8ee7-6848e08c1833\") " pod="openshift-controller-manager/controller-manager-77cc4f456b-9m7qc" Jan 31 09:06:28 crc kubenswrapper[4732]: I0131 09:06:28.748860 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/5be9f2ca-96a2-4e31-8ee7-6848e08c1833-config\") pod \"controller-manager-77cc4f456b-9m7qc\" (UID: \"5be9f2ca-96a2-4e31-8ee7-6848e08c1833\") " pod="openshift-controller-manager/controller-manager-77cc4f456b-9m7qc" Jan 31 09:06:28 crc kubenswrapper[4732]: I0131 09:06:28.850935 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5be9f2ca-96a2-4e31-8ee7-6848e08c1833-config\") pod \"controller-manager-77cc4f456b-9m7qc\" (UID: \"5be9f2ca-96a2-4e31-8ee7-6848e08c1833\") " pod="openshift-controller-manager/controller-manager-77cc4f456b-9m7qc" Jan 31 09:06:28 crc kubenswrapper[4732]: I0131 09:06:28.851015 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5be9f2ca-96a2-4e31-8ee7-6848e08c1833-serving-cert\") pod \"controller-manager-77cc4f456b-9m7qc\" (UID: \"5be9f2ca-96a2-4e31-8ee7-6848e08c1833\") " pod="openshift-controller-manager/controller-manager-77cc4f456b-9m7qc" Jan 31 09:06:28 crc kubenswrapper[4732]: I0131 09:06:28.851083 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5be9f2ca-96a2-4e31-8ee7-6848e08c1833-client-ca\") pod \"controller-manager-77cc4f456b-9m7qc\" (UID: \"5be9f2ca-96a2-4e31-8ee7-6848e08c1833\") " pod="openshift-controller-manager/controller-manager-77cc4f456b-9m7qc" Jan 31 09:06:28 crc kubenswrapper[4732]: I0131 09:06:28.851139 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5be9f2ca-96a2-4e31-8ee7-6848e08c1833-proxy-ca-bundles\") pod \"controller-manager-77cc4f456b-9m7qc\" (UID: \"5be9f2ca-96a2-4e31-8ee7-6848e08c1833\") " pod="openshift-controller-manager/controller-manager-77cc4f456b-9m7qc" Jan 31 09:06:28 crc kubenswrapper[4732]: I0131 09:06:28.851173 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwxs6\" (UniqueName: \"kubernetes.io/projected/5be9f2ca-96a2-4e31-8ee7-6848e08c1833-kube-api-access-qwxs6\") pod \"controller-manager-77cc4f456b-9m7qc\" (UID: \"5be9f2ca-96a2-4e31-8ee7-6848e08c1833\") " pod="openshift-controller-manager/controller-manager-77cc4f456b-9m7qc" Jan 31 09:06:28 crc kubenswrapper[4732]: I0131 09:06:28.852432 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5be9f2ca-96a2-4e31-8ee7-6848e08c1833-proxy-ca-bundles\") pod \"controller-manager-77cc4f456b-9m7qc\" (UID: \"5be9f2ca-96a2-4e31-8ee7-6848e08c1833\") " pod="openshift-controller-manager/controller-manager-77cc4f456b-9m7qc" Jan 31 09:06:28 crc kubenswrapper[4732]: I0131 09:06:28.852478 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5be9f2ca-96a2-4e31-8ee7-6848e08c1833-client-ca\") pod \"controller-manager-77cc4f456b-9m7qc\" (UID: \"5be9f2ca-96a2-4e31-8ee7-6848e08c1833\") " pod="openshift-controller-manager/controller-manager-77cc4f456b-9m7qc" Jan 31 09:06:28 crc kubenswrapper[4732]: I0131 09:06:28.853247 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5be9f2ca-96a2-4e31-8ee7-6848e08c1833-config\") pod \"controller-manager-77cc4f456b-9m7qc\" (UID: \"5be9f2ca-96a2-4e31-8ee7-6848e08c1833\") " pod="openshift-controller-manager/controller-manager-77cc4f456b-9m7qc" 
Jan 31 09:06:28 crc kubenswrapper[4732]: I0131 09:06:28.855925 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5be9f2ca-96a2-4e31-8ee7-6848e08c1833-serving-cert\") pod \"controller-manager-77cc4f456b-9m7qc\" (UID: \"5be9f2ca-96a2-4e31-8ee7-6848e08c1833\") " pod="openshift-controller-manager/controller-manager-77cc4f456b-9m7qc" Jan 31 09:06:28 crc kubenswrapper[4732]: I0131 09:06:28.874457 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwxs6\" (UniqueName: \"kubernetes.io/projected/5be9f2ca-96a2-4e31-8ee7-6848e08c1833-kube-api-access-qwxs6\") pod \"controller-manager-77cc4f456b-9m7qc\" (UID: \"5be9f2ca-96a2-4e31-8ee7-6848e08c1833\") " pod="openshift-controller-manager/controller-manager-77cc4f456b-9m7qc" Jan 31 09:06:28 crc kubenswrapper[4732]: I0131 09:06:28.972325 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-77cc4f456b-9m7qc" Jan 31 09:06:29 crc kubenswrapper[4732]: I0131 09:06:29.387318 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-77cc4f456b-9m7qc"] Jan 31 09:06:29 crc kubenswrapper[4732]: W0131 09:06:29.395519 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5be9f2ca_96a2_4e31_8ee7_6848e08c1833.slice/crio-13021a9007deab986c417d6f9474668c43e92c709e87b1c73c8b29ff4042a64d WatchSource:0}: Error finding container 13021a9007deab986c417d6f9474668c43e92c709e87b1c73c8b29ff4042a64d: Status 404 returned error can't find the container with id 13021a9007deab986c417d6f9474668c43e92c709e87b1c73c8b29ff4042a64d Jan 31 09:06:29 crc kubenswrapper[4732]: I0131 09:06:29.742887 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-77cc4f456b-9m7qc" event={"ID":"5be9f2ca-96a2-4e31-8ee7-6848e08c1833","Type":"ContainerStarted","Data":"d01d1e298dbf72d8fa542814429bb8e1c8f2cafa02648a97d342a9131ae4ceb2"} Jan 31 09:06:29 crc kubenswrapper[4732]: I0131 09:06:29.744152 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-77cc4f456b-9m7qc" Jan 31 09:06:29 crc kubenswrapper[4732]: I0131 09:06:29.744215 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-77cc4f456b-9m7qc" event={"ID":"5be9f2ca-96a2-4e31-8ee7-6848e08c1833","Type":"ContainerStarted","Data":"13021a9007deab986c417d6f9474668c43e92c709e87b1c73c8b29ff4042a64d"} Jan 31 09:06:29 crc kubenswrapper[4732]: I0131 09:06:29.752637 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-77cc4f456b-9m7qc" Jan 31 09:06:29 crc kubenswrapper[4732]: I0131 09:06:29.763440 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-77cc4f456b-9m7qc" podStartSLOduration=2.763419639 podStartE2EDuration="2.763419639s" podCreationTimestamp="2026-01-31 09:06:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:06:29.758561237 +0000 UTC m=+328.064437461" watchObservedRunningTime="2026-01-31 09:06:29.763419639 +0000 UTC m=+328.069295843" Jan 31 09:06:41 crc kubenswrapper[4732]: I0131 09:06:41.507824 4732 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-tr5nx" Jan 31 09:06:41 crc kubenswrapper[4732]: I0131 09:06:41.564979 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-99dtb"] Jan 31 09:06:47 crc kubenswrapper[4732]: I0131 09:06:47.133016 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6685f4fd5b-k4npd"] Jan 31 09:06:47 crc kubenswrapper[4732]: I0131 09:06:47.133822 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6685f4fd5b-k4npd" podUID="87d41f5c-8722-4115-b4b2-06493d6f18e2" containerName="route-controller-manager" containerID="cri-o://20b470d8cff837870ae5ee797e7d03af879664119847524316640e3d39a1bbb8" gracePeriod=30 Jan 31 09:06:47 crc kubenswrapper[4732]: I0131 09:06:47.770001 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6685f4fd5b-k4npd" Jan 31 09:06:47 crc kubenswrapper[4732]: I0131 09:06:47.817789 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/87d41f5c-8722-4115-b4b2-06493d6f18e2-serving-cert\") pod \"87d41f5c-8722-4115-b4b2-06493d6f18e2\" (UID: \"87d41f5c-8722-4115-b4b2-06493d6f18e2\") " Jan 31 09:06:47 crc kubenswrapper[4732]: I0131 09:06:47.817849 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/87d41f5c-8722-4115-b4b2-06493d6f18e2-client-ca\") pod \"87d41f5c-8722-4115-b4b2-06493d6f18e2\" (UID: \"87d41f5c-8722-4115-b4b2-06493d6f18e2\") " Jan 31 09:06:47 crc kubenswrapper[4732]: I0131 09:06:47.817892 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hbm5f\" (UniqueName: \"kubernetes.io/projected/87d41f5c-8722-4115-b4b2-06493d6f18e2-kube-api-access-hbm5f\") pod \"87d41f5c-8722-4115-b4b2-06493d6f18e2\" (UID: \"87d41f5c-8722-4115-b4b2-06493d6f18e2\") " Jan 31 09:06:47 crc kubenswrapper[4732]: I0131 09:06:47.817930 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87d41f5c-8722-4115-b4b2-06493d6f18e2-config\") pod \"87d41f5c-8722-4115-b4b2-06493d6f18e2\" (UID: \"87d41f5c-8722-4115-b4b2-06493d6f18e2\") " Jan 31 09:06:47 crc kubenswrapper[4732]: I0131 09:06:47.818699 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87d41f5c-8722-4115-b4b2-06493d6f18e2-client-ca" (OuterVolumeSpecName: "client-ca") pod "87d41f5c-8722-4115-b4b2-06493d6f18e2" (UID: "87d41f5c-8722-4115-b4b2-06493d6f18e2"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:06:47 crc kubenswrapper[4732]: I0131 09:06:47.818790 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87d41f5c-8722-4115-b4b2-06493d6f18e2-config" (OuterVolumeSpecName: "config") pod "87d41f5c-8722-4115-b4b2-06493d6f18e2" (UID: "87d41f5c-8722-4115-b4b2-06493d6f18e2"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:06:47 crc kubenswrapper[4732]: I0131 09:06:47.827577 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87d41f5c-8722-4115-b4b2-06493d6f18e2-kube-api-access-hbm5f" (OuterVolumeSpecName: "kube-api-access-hbm5f") pod "87d41f5c-8722-4115-b4b2-06493d6f18e2" (UID: "87d41f5c-8722-4115-b4b2-06493d6f18e2"). InnerVolumeSpecName "kube-api-access-hbm5f". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:06:47 crc kubenswrapper[4732]: I0131 09:06:47.829751 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87d41f5c-8722-4115-b4b2-06493d6f18e2-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "87d41f5c-8722-4115-b4b2-06493d6f18e2" (UID: "87d41f5c-8722-4115-b4b2-06493d6f18e2"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:06:47 crc kubenswrapper[4732]: I0131 09:06:47.843481 4732 generic.go:334] "Generic (PLEG): container finished" podID="87d41f5c-8722-4115-b4b2-06493d6f18e2" containerID="20b470d8cff837870ae5ee797e7d03af879664119847524316640e3d39a1bbb8" exitCode=0 Jan 31 09:06:47 crc kubenswrapper[4732]: I0131 09:06:47.843555 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6685f4fd5b-k4npd" Jan 31 09:06:47 crc kubenswrapper[4732]: I0131 09:06:47.843586 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6685f4fd5b-k4npd" event={"ID":"87d41f5c-8722-4115-b4b2-06493d6f18e2","Type":"ContainerDied","Data":"20b470d8cff837870ae5ee797e7d03af879664119847524316640e3d39a1bbb8"} Jan 31 09:06:47 crc kubenswrapper[4732]: I0131 09:06:47.843636 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6685f4fd5b-k4npd" event={"ID":"87d41f5c-8722-4115-b4b2-06493d6f18e2","Type":"ContainerDied","Data":"827277a6a635c6307caf7bd68b15efb68e2fe5da6e8b51a1e5a77e41d83b1851"} Jan 31 09:06:47 crc kubenswrapper[4732]: I0131 09:06:47.843712 4732 scope.go:117] "RemoveContainer" containerID="20b470d8cff837870ae5ee797e7d03af879664119847524316640e3d39a1bbb8" Jan 31 09:06:47 crc kubenswrapper[4732]: I0131 09:06:47.867135 4732 scope.go:117] "RemoveContainer" containerID="20b470d8cff837870ae5ee797e7d03af879664119847524316640e3d39a1bbb8" Jan 31 09:06:47 crc kubenswrapper[4732]: E0131 09:06:47.867560 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20b470d8cff837870ae5ee797e7d03af879664119847524316640e3d39a1bbb8\": container with ID starting with 20b470d8cff837870ae5ee797e7d03af879664119847524316640e3d39a1bbb8 not found: ID does not exist" containerID="20b470d8cff837870ae5ee797e7d03af879664119847524316640e3d39a1bbb8" Jan 31 09:06:47 crc kubenswrapper[4732]: I0131 09:06:47.867621 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20b470d8cff837870ae5ee797e7d03af879664119847524316640e3d39a1bbb8"} err="failed to get container status \"20b470d8cff837870ae5ee797e7d03af879664119847524316640e3d39a1bbb8\": rpc error: code = NotFound desc = could not find container \"20b470d8cff837870ae5ee797e7d03af879664119847524316640e3d39a1bbb8\": container with ID starting with 20b470d8cff837870ae5ee797e7d03af879664119847524316640e3d39a1bbb8 not found: ID does not exist" Jan 
31 09:06:47 crc kubenswrapper[4732]: I0131 09:06:47.881788 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6685f4fd5b-k4npd"] Jan 31 09:06:47 crc kubenswrapper[4732]: I0131 09:06:47.883973 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6685f4fd5b-k4npd"] Jan 31 09:06:47 crc kubenswrapper[4732]: I0131 09:06:47.919611 4732 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/87d41f5c-8722-4115-b4b2-06493d6f18e2-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:06:47 crc kubenswrapper[4732]: I0131 09:06:47.919654 4732 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/87d41f5c-8722-4115-b4b2-06493d6f18e2-client-ca\") on node \"crc\" DevicePath \"\"" Jan 31 09:06:47 crc kubenswrapper[4732]: I0131 09:06:47.919714 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hbm5f\" (UniqueName: \"kubernetes.io/projected/87d41f5c-8722-4115-b4b2-06493d6f18e2-kube-api-access-hbm5f\") on node \"crc\" DevicePath \"\"" Jan 31 09:06:47 crc kubenswrapper[4732]: I0131 09:06:47.919729 4732 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87d41f5c-8722-4115-b4b2-06493d6f18e2-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:06:48 crc kubenswrapper[4732]: I0131 09:06:48.556222 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87d41f5c-8722-4115-b4b2-06493d6f18e2" path="/var/lib/kubelet/pods/87d41f5c-8722-4115-b4b2-06493d6f18e2/volumes" Jan 31 09:06:48 crc kubenswrapper[4732]: I0131 09:06:48.669002 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-d99dc646-4q8hf"] Jan 31 09:06:48 crc kubenswrapper[4732]: E0131 09:06:48.669345 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87d41f5c-8722-4115-b4b2-06493d6f18e2" containerName="route-controller-manager" Jan 31 09:06:48 crc kubenswrapper[4732]: I0131 09:06:48.669376 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="87d41f5c-8722-4115-b4b2-06493d6f18e2" containerName="route-controller-manager" Jan 31 09:06:48 crc kubenswrapper[4732]: I0131 09:06:48.669577 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="87d41f5c-8722-4115-b4b2-06493d6f18e2" containerName="route-controller-manager" Jan 31 09:06:48 crc kubenswrapper[4732]: I0131 09:06:48.670222 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-d99dc646-4q8hf" Jan 31 09:06:48 crc kubenswrapper[4732]: I0131 09:06:48.681320 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 31 09:06:48 crc kubenswrapper[4732]: I0131 09:06:48.681428 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 31 09:06:48 crc kubenswrapper[4732]: I0131 09:06:48.681944 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 31 09:06:48 crc kubenswrapper[4732]: I0131 09:06:48.682498 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 31 09:06:48 crc kubenswrapper[4732]: I0131 09:06:48.691246 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 31 09:06:48 crc kubenswrapper[4732]: I0131 09:06:48.699734 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 31 09:06:48 crc kubenswrapper[4732]: I0131 09:06:48.703131 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-d99dc646-4q8hf"] Jan 31 09:06:48 crc kubenswrapper[4732]: I0131 09:06:48.730091 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a3219d76-a8c9-4166-b326-cf1cd4a31074-client-ca\") pod \"route-controller-manager-d99dc646-4q8hf\" (UID: \"a3219d76-a8c9-4166-b326-cf1cd4a31074\") " pod="openshift-route-controller-manager/route-controller-manager-d99dc646-4q8hf" Jan 31 09:06:48 crc kubenswrapper[4732]: I0131 09:06:48.730183 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ms8d\" (UniqueName: \"kubernetes.io/projected/a3219d76-a8c9-4166-b326-cf1cd4a31074-kube-api-access-4ms8d\") pod \"route-controller-manager-d99dc646-4q8hf\" (UID: \"a3219d76-a8c9-4166-b326-cf1cd4a31074\") " pod="openshift-route-controller-manager/route-controller-manager-d99dc646-4q8hf" Jan 31 09:06:48 crc kubenswrapper[4732]: I0131 09:06:48.730251 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a3219d76-a8c9-4166-b326-cf1cd4a31074-serving-cert\") pod \"route-controller-manager-d99dc646-4q8hf\" (UID: \"a3219d76-a8c9-4166-b326-cf1cd4a31074\") " pod="openshift-route-controller-manager/route-controller-manager-d99dc646-4q8hf" Jan 31 09:06:48 crc kubenswrapper[4732]: I0131 09:06:48.730333 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3219d76-a8c9-4166-b326-cf1cd4a31074-config\") pod \"route-controller-manager-d99dc646-4q8hf\" (UID: \"a3219d76-a8c9-4166-b326-cf1cd4a31074\") " pod="openshift-route-controller-manager/route-controller-manager-d99dc646-4q8hf" Jan 31 09:06:48 crc kubenswrapper[4732]: I0131 09:06:48.831275 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a3219d76-a8c9-4166-b326-cf1cd4a31074-client-ca\") pod \"route-controller-manager-d99dc646-4q8hf\" (UID: 
\"a3219d76-a8c9-4166-b326-cf1cd4a31074\") " pod="openshift-route-controller-manager/route-controller-manager-d99dc646-4q8hf" Jan 31 09:06:48 crc kubenswrapper[4732]: I0131 09:06:48.831328 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ms8d\" (UniqueName: \"kubernetes.io/projected/a3219d76-a8c9-4166-b326-cf1cd4a31074-kube-api-access-4ms8d\") pod \"route-controller-manager-d99dc646-4q8hf\" (UID: \"a3219d76-a8c9-4166-b326-cf1cd4a31074\") " pod="openshift-route-controller-manager/route-controller-manager-d99dc646-4q8hf" Jan 31 09:06:48 crc kubenswrapper[4732]: I0131 09:06:48.831373 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a3219d76-a8c9-4166-b326-cf1cd4a31074-serving-cert\") pod \"route-controller-manager-d99dc646-4q8hf\" (UID: \"a3219d76-a8c9-4166-b326-cf1cd4a31074\") " pod="openshift-route-controller-manager/route-controller-manager-d99dc646-4q8hf" Jan 31 09:06:48 crc kubenswrapper[4732]: I0131 09:06:48.831422 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3219d76-a8c9-4166-b326-cf1cd4a31074-config\") pod \"route-controller-manager-d99dc646-4q8hf\" (UID: \"a3219d76-a8c9-4166-b326-cf1cd4a31074\") " pod="openshift-route-controller-manager/route-controller-manager-d99dc646-4q8hf" Jan 31 09:06:48 crc kubenswrapper[4732]: I0131 09:06:48.832587 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a3219d76-a8c9-4166-b326-cf1cd4a31074-client-ca\") pod \"route-controller-manager-d99dc646-4q8hf\" (UID: \"a3219d76-a8c9-4166-b326-cf1cd4a31074\") " pod="openshift-route-controller-manager/route-controller-manager-d99dc646-4q8hf" Jan 31 09:06:48 crc kubenswrapper[4732]: I0131 09:06:48.832747 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3219d76-a8c9-4166-b326-cf1cd4a31074-config\") pod \"route-controller-manager-d99dc646-4q8hf\" (UID: \"a3219d76-a8c9-4166-b326-cf1cd4a31074\") " pod="openshift-route-controller-manager/route-controller-manager-d99dc646-4q8hf" Jan 31 09:06:48 crc kubenswrapper[4732]: I0131 09:06:48.838438 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a3219d76-a8c9-4166-b326-cf1cd4a31074-serving-cert\") pod \"route-controller-manager-d99dc646-4q8hf\" (UID: \"a3219d76-a8c9-4166-b326-cf1cd4a31074\") " pod="openshift-route-controller-manager/route-controller-manager-d99dc646-4q8hf" Jan 31 09:06:48 crc kubenswrapper[4732]: I0131 09:06:48.851856 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ms8d\" (UniqueName: \"kubernetes.io/projected/a3219d76-a8c9-4166-b326-cf1cd4a31074-kube-api-access-4ms8d\") pod \"route-controller-manager-d99dc646-4q8hf\" (UID: \"a3219d76-a8c9-4166-b326-cf1cd4a31074\") " pod="openshift-route-controller-manager/route-controller-manager-d99dc646-4q8hf" Jan 31 09:06:48 crc kubenswrapper[4732]: I0131 09:06:48.989553 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-d99dc646-4q8hf" Jan 31 09:06:49 crc kubenswrapper[4732]: I0131 09:06:49.403801 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-d99dc646-4q8hf"] Jan 31 09:06:49 crc kubenswrapper[4732]: W0131 09:06:49.413829 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3219d76_a8c9_4166_b326_cf1cd4a31074.slice/crio-bfc06e59c07ca0dcc545078158eb5f8758f312a45261a6fde0df305d70df7eb5 WatchSource:0}: Error finding container bfc06e59c07ca0dcc545078158eb5f8758f312a45261a6fde0df305d70df7eb5: Status 404 returned error can't find the container with id bfc06e59c07ca0dcc545078158eb5f8758f312a45261a6fde0df305d70df7eb5 Jan 31 09:06:49 crc kubenswrapper[4732]: I0131 09:06:49.857028 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-d99dc646-4q8hf" event={"ID":"a3219d76-a8c9-4166-b326-cf1cd4a31074","Type":"ContainerStarted","Data":"4a0298efb06452f5c63ab2b3c2798a83000c125a58f1161f38fb855be1f716c9"} Jan 31 09:06:49 crc kubenswrapper[4732]: I0131 09:06:49.857072 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-d99dc646-4q8hf" event={"ID":"a3219d76-a8c9-4166-b326-cf1cd4a31074","Type":"ContainerStarted","Data":"bfc06e59c07ca0dcc545078158eb5f8758f312a45261a6fde0df305d70df7eb5"} Jan 31 09:06:49 crc kubenswrapper[4732]: I0131 09:06:49.857325 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-d99dc646-4q8hf" Jan 31 09:06:49 crc kubenswrapper[4732]: I0131 09:06:49.872138 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-d99dc646-4q8hf" podStartSLOduration=2.8721209180000002 podStartE2EDuration="2.872120918s" podCreationTimestamp="2026-01-31 09:06:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:06:49.871455708 +0000 UTC m=+348.177331922" watchObservedRunningTime="2026-01-31 09:06:49.872120918 +0000 UTC m=+348.177997122" Jan 31 09:06:50 crc kubenswrapper[4732]: I0131 09:06:50.023390 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-d99dc646-4q8hf" Jan 31 09:07:06 crc kubenswrapper[4732]: I0131 09:07:06.611173 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-99dtb" podUID="4ac602fa-14af-4ae0-a538-d73e938db036" containerName="registry" containerID="cri-o://76229ced9d7ea551eb476a7db3e9648ede271e5ea762d59f5c012cdd16284033" gracePeriod=30 Jan 31 09:07:06 crc kubenswrapper[4732]: I0131 09:07:06.957276 4732 generic.go:334] "Generic (PLEG): container finished" podID="4ac602fa-14af-4ae0-a538-d73e938db036" containerID="76229ced9d7ea551eb476a7db3e9648ede271e5ea762d59f5c012cdd16284033" exitCode=0 Jan 31 09:07:06 crc kubenswrapper[4732]: I0131 09:07:06.957476 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-99dtb" event={"ID":"4ac602fa-14af-4ae0-a538-d73e938db036","Type":"ContainerDied","Data":"76229ced9d7ea551eb476a7db3e9648ede271e5ea762d59f5c012cdd16284033"} Jan 
31 09:07:07 crc kubenswrapper[4732]: I0131 09:07:07.026473 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-99dtb" Jan 31 09:07:07 crc kubenswrapper[4732]: I0131 09:07:07.172569 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4ac602fa-14af-4ae0-a538-d73e938db036-bound-sa-token\") pod \"4ac602fa-14af-4ae0-a538-d73e938db036\" (UID: \"4ac602fa-14af-4ae0-a538-d73e938db036\") " Jan 31 09:07:07 crc kubenswrapper[4732]: I0131 09:07:07.172723 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4ac602fa-14af-4ae0-a538-d73e938db036-installation-pull-secrets\") pod \"4ac602fa-14af-4ae0-a538-d73e938db036\" (UID: \"4ac602fa-14af-4ae0-a538-d73e938db036\") " Jan 31 09:07:07 crc kubenswrapper[4732]: I0131 09:07:07.172828 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bb2p9\" (UniqueName: \"kubernetes.io/projected/4ac602fa-14af-4ae0-a538-d73e938db036-kube-api-access-bb2p9\") pod \"4ac602fa-14af-4ae0-a538-d73e938db036\" (UID: \"4ac602fa-14af-4ae0-a538-d73e938db036\") " Jan 31 09:07:07 crc kubenswrapper[4732]: I0131 09:07:07.172880 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4ac602fa-14af-4ae0-a538-d73e938db036-ca-trust-extracted\") pod \"4ac602fa-14af-4ae0-a538-d73e938db036\" (UID: \"4ac602fa-14af-4ae0-a538-d73e938db036\") " Jan 31 09:07:07 crc kubenswrapper[4732]: I0131 09:07:07.173036 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"4ac602fa-14af-4ae0-a538-d73e938db036\" (UID: \"4ac602fa-14af-4ae0-a538-d73e938db036\") " Jan 31 09:07:07 crc kubenswrapper[4732]: I0131 09:07:07.173070 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4ac602fa-14af-4ae0-a538-d73e938db036-trusted-ca\") pod \"4ac602fa-14af-4ae0-a538-d73e938db036\" (UID: \"4ac602fa-14af-4ae0-a538-d73e938db036\") " Jan 31 09:07:07 crc kubenswrapper[4732]: I0131 09:07:07.173096 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4ac602fa-14af-4ae0-a538-d73e938db036-registry-tls\") pod \"4ac602fa-14af-4ae0-a538-d73e938db036\" (UID: \"4ac602fa-14af-4ae0-a538-d73e938db036\") " Jan 31 09:07:07 crc kubenswrapper[4732]: I0131 09:07:07.173140 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4ac602fa-14af-4ae0-a538-d73e938db036-registry-certificates\") pod \"4ac602fa-14af-4ae0-a538-d73e938db036\" (UID: \"4ac602fa-14af-4ae0-a538-d73e938db036\") " Jan 31 09:07:07 crc kubenswrapper[4732]: I0131 09:07:07.174048 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ac602fa-14af-4ae0-a538-d73e938db036-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "4ac602fa-14af-4ae0-a538-d73e938db036" (UID: "4ac602fa-14af-4ae0-a538-d73e938db036"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:07:07 crc kubenswrapper[4732]: I0131 09:07:07.174450 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ac602fa-14af-4ae0-a538-d73e938db036-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "4ac602fa-14af-4ae0-a538-d73e938db036" (UID: "4ac602fa-14af-4ae0-a538-d73e938db036"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:07:07 crc kubenswrapper[4732]: I0131 09:07:07.178024 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ac602fa-14af-4ae0-a538-d73e938db036-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "4ac602fa-14af-4ae0-a538-d73e938db036" (UID: "4ac602fa-14af-4ae0-a538-d73e938db036"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:07:07 crc kubenswrapper[4732]: I0131 09:07:07.178409 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ac602fa-14af-4ae0-a538-d73e938db036-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "4ac602fa-14af-4ae0-a538-d73e938db036" (UID: "4ac602fa-14af-4ae0-a538-d73e938db036"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:07:07 crc kubenswrapper[4732]: I0131 09:07:07.178886 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ac602fa-14af-4ae0-a538-d73e938db036-kube-api-access-bb2p9" (OuterVolumeSpecName: "kube-api-access-bb2p9") pod "4ac602fa-14af-4ae0-a538-d73e938db036" (UID: "4ac602fa-14af-4ae0-a538-d73e938db036"). InnerVolumeSpecName "kube-api-access-bb2p9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:07:07 crc kubenswrapper[4732]: I0131 09:07:07.179181 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ac602fa-14af-4ae0-a538-d73e938db036-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "4ac602fa-14af-4ae0-a538-d73e938db036" (UID: "4ac602fa-14af-4ae0-a538-d73e938db036"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:07:07 crc kubenswrapper[4732]: I0131 09:07:07.195549 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "4ac602fa-14af-4ae0-a538-d73e938db036" (UID: "4ac602fa-14af-4ae0-a538-d73e938db036"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 31 09:07:07 crc kubenswrapper[4732]: I0131 09:07:07.200311 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ac602fa-14af-4ae0-a538-d73e938db036-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "4ac602fa-14af-4ae0-a538-d73e938db036" (UID: "4ac602fa-14af-4ae0-a538-d73e938db036"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:07:07 crc kubenswrapper[4732]: I0131 09:07:07.274776 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bb2p9\" (UniqueName: \"kubernetes.io/projected/4ac602fa-14af-4ae0-a538-d73e938db036-kube-api-access-bb2p9\") on node \"crc\" DevicePath \"\"" Jan 31 09:07:07 crc kubenswrapper[4732]: I0131 09:07:07.274814 4732 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4ac602fa-14af-4ae0-a538-d73e938db036-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Jan 31 09:07:07 crc kubenswrapper[4732]: I0131 09:07:07.274823 4732 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4ac602fa-14af-4ae0-a538-d73e938db036-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 31 09:07:07 crc kubenswrapper[4732]: I0131 09:07:07.274831 4732 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4ac602fa-14af-4ae0-a538-d73e938db036-registry-tls\") on node \"crc\" DevicePath \"\"" Jan 31 09:07:07 crc kubenswrapper[4732]: I0131 09:07:07.274839 4732 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4ac602fa-14af-4ae0-a538-d73e938db036-registry-certificates\") on node \"crc\" DevicePath \"\"" Jan 31 09:07:07 crc kubenswrapper[4732]: I0131 09:07:07.274847 4732 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4ac602fa-14af-4ae0-a538-d73e938db036-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 31 09:07:07 crc kubenswrapper[4732]: I0131 09:07:07.274854 4732 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4ac602fa-14af-4ae0-a538-d73e938db036-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Jan 31 09:07:07 crc kubenswrapper[4732]: I0131 09:07:07.965476 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-99dtb" Jan 31 09:07:07 crc kubenswrapper[4732]: I0131 09:07:07.965461 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-99dtb" event={"ID":"4ac602fa-14af-4ae0-a538-d73e938db036","Type":"ContainerDied","Data":"039e3167eddd030475a90d04176c40d3799eaa481a42473d70722bc67a78215e"} Jan 31 09:07:07 crc kubenswrapper[4732]: I0131 09:07:07.966033 4732 scope.go:117] "RemoveContainer" containerID="76229ced9d7ea551eb476a7db3e9648ede271e5ea762d59f5c012cdd16284033" Jan 31 09:07:08 crc kubenswrapper[4732]: I0131 09:07:08.001227 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-99dtb"] Jan 31 09:07:08 crc kubenswrapper[4732]: I0131 09:07:08.005479 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-99dtb"] Jan 31 09:07:08 crc kubenswrapper[4732]: I0131 09:07:08.550058 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ac602fa-14af-4ae0-a538-d73e938db036" path="/var/lib/kubelet/pods/4ac602fa-14af-4ae0-a538-d73e938db036/volumes" Jan 31 09:07:17 crc kubenswrapper[4732]: I0131 09:07:17.498132 4732 patch_prober.go:28] interesting pod/machine-config-daemon-jnbt8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 09:07:17 crc kubenswrapper[4732]: I0131 09:07:17.498534 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" podUID="7d790207-d357-4b47-87bf-5b505e061820" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 09:07:47 crc kubenswrapper[4732]: I0131 09:07:47.498204 4732 patch_prober.go:28] interesting pod/machine-config-daemon-jnbt8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 09:07:47 crc kubenswrapper[4732]: I0131 09:07:47.498651 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" podUID="7d790207-d357-4b47-87bf-5b505e061820" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 09:08:17 crc kubenswrapper[4732]: I0131 09:08:17.497948 4732 patch_prober.go:28] interesting pod/machine-config-daemon-jnbt8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 09:08:17 crc kubenswrapper[4732]: I0131 09:08:17.498518 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" podUID="7d790207-d357-4b47-87bf-5b505e061820" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 09:08:17 crc kubenswrapper[4732]: I0131 09:08:17.498579 4732 kubelet.go:2542] "SyncLoop 
(probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" Jan 31 09:08:17 crc kubenswrapper[4732]: I0131 09:08:17.499164 4732 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1a1af67f6e9c90030eed50fdab77c62259e76a7813864bb504390768e9501756"} pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 31 09:08:17 crc kubenswrapper[4732]: I0131 09:08:17.499223 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" podUID="7d790207-d357-4b47-87bf-5b505e061820" containerName="machine-config-daemon" containerID="cri-o://1a1af67f6e9c90030eed50fdab77c62259e76a7813864bb504390768e9501756" gracePeriod=600 Jan 31 09:08:18 crc kubenswrapper[4732]: I0131 09:08:18.402637 4732 generic.go:334] "Generic (PLEG): container finished" podID="7d790207-d357-4b47-87bf-5b505e061820" containerID="1a1af67f6e9c90030eed50fdab77c62259e76a7813864bb504390768e9501756" exitCode=0 Jan 31 09:08:18 crc kubenswrapper[4732]: I0131 09:08:18.402720 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" event={"ID":"7d790207-d357-4b47-87bf-5b505e061820","Type":"ContainerDied","Data":"1a1af67f6e9c90030eed50fdab77c62259e76a7813864bb504390768e9501756"} Jan 31 09:08:18 crc kubenswrapper[4732]: I0131 09:08:18.403003 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" event={"ID":"7d790207-d357-4b47-87bf-5b505e061820","Type":"ContainerStarted","Data":"942a11834ff55816d19ec94b72706370701e25dcee37029bb97b73b2e3078f9b"} Jan 31 09:08:18 crc kubenswrapper[4732]: I0131 09:08:18.403034 4732 scope.go:117] "RemoveContainer" containerID="ad1f906b34fddd34efd9d099479cd1fe7404da4ab11f3f571f9e3120167505a5" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.300425 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-8mtkt"] Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.301293 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" podUID="82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8" containerName="ovn-controller" containerID="cri-o://792e702a9fa66ea68366e8f88d976f7e280223815c4b70258566c32faaed57a8" gracePeriod=30 Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.301657 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" podUID="82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8" containerName="sbdb" containerID="cri-o://4cd519944f0ae0ee9920a6a54de3dd40c73a65d0c907f625c57d7648a6d0a715" gracePeriod=30 Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.301731 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" podUID="82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8" containerName="nbdb" containerID="cri-o://6b26417bd7817c984a59723e2231e3a43c554022aef67eacf300adee0166c3db" gracePeriod=30 Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.301767 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" podUID="82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8" containerName="northd" 
containerID="cri-o://c79f404e5bc3d6dc83681d68605b1f26565813ccec34c6d36ee9f21686ca4f20" gracePeriod=30 Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.301796 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" podUID="82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://8fde05e46badc11b2174b88e2590402436bb669bcbd017ceb4d11d25066b421b" gracePeriod=30 Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.301822 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" podUID="82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8" containerName="kube-rbac-proxy-node" containerID="cri-o://8521ee1e84f540faa7b093ecd0bb7761ae38487a6c7abf52028fe820ea9a4c1a" gracePeriod=30 Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.301849 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" podUID="82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8" containerName="ovn-acl-logging" containerID="cri-o://075418297d972e23003b3ce2016f7709d138316c9d08837dc484dadf4d7f4a81" gracePeriod=30 Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.342319 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" podUID="82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8" containerName="ovnkube-controller" containerID="cri-o://b950d29df580418b6e7082c9a672624c14f714944ccbf4cd1d3824810d885ac4" gracePeriod=30 Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.626850 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8mtkt_82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8/ovnkube-controller/3.log" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.629105 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8mtkt_82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8/ovn-acl-logging/0.log" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.629617 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8mtkt_82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8/ovn-controller/0.log" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.630114 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.681501 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-bqvt7"] Jan 31 09:10:07 crc kubenswrapper[4732]: E0131 09:10:07.681715 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8" containerName="kube-rbac-proxy-node" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.681733 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8" containerName="kube-rbac-proxy-node" Jan 31 09:10:07 crc kubenswrapper[4732]: E0131 09:10:07.681748 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ac602fa-14af-4ae0-a538-d73e938db036" containerName="registry" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.681754 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ac602fa-14af-4ae0-a538-d73e938db036" containerName="registry" Jan 31 09:10:07 crc kubenswrapper[4732]: E0131 09:10:07.681764 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8" containerName="nbdb" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.681770 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8" containerName="nbdb" Jan 31 09:10:07 crc kubenswrapper[4732]: E0131 09:10:07.681780 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8" containerName="ovnkube-controller" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.681787 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8" containerName="ovnkube-controller" Jan 31 09:10:07 crc kubenswrapper[4732]: E0131 09:10:07.681794 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8" containerName="ovnkube-controller" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.681800 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8" containerName="ovnkube-controller" Jan 31 09:10:07 crc kubenswrapper[4732]: E0131 09:10:07.681807 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8" containerName="ovn-controller" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.681813 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8" containerName="ovn-controller" Jan 31 09:10:07 crc kubenswrapper[4732]: E0131 09:10:07.681820 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8" containerName="ovnkube-controller" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.681826 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8" containerName="ovnkube-controller" Jan 31 09:10:07 crc kubenswrapper[4732]: E0131 09:10:07.681832 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8" containerName="ovn-acl-logging" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.681838 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8" containerName="ovn-acl-logging" Jan 31 09:10:07 crc kubenswrapper[4732]: E0131 09:10:07.681845 4732 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8" containerName="kubecfg-setup" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.681851 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8" containerName="kubecfg-setup" Jan 31 09:10:07 crc kubenswrapper[4732]: E0131 09:10:07.681859 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8" containerName="ovnkube-controller" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.681865 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8" containerName="ovnkube-controller" Jan 31 09:10:07 crc kubenswrapper[4732]: E0131 09:10:07.681871 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8" containerName="sbdb" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.681877 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8" containerName="sbdb" Jan 31 09:10:07 crc kubenswrapper[4732]: E0131 09:10:07.681883 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8" containerName="northd" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.681889 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8" containerName="northd" Jan 31 09:10:07 crc kubenswrapper[4732]: E0131 09:10:07.681898 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8" containerName="kube-rbac-proxy-ovn-metrics" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.681904 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8" containerName="kube-rbac-proxy-ovn-metrics" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.682000 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8" containerName="sbdb" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.682012 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8" containerName="nbdb" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.682020 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8" containerName="ovn-controller" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.682029 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8" containerName="ovnkube-controller" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.682037 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8" containerName="ovn-acl-logging" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.682046 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8" containerName="ovnkube-controller" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.682053 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8" containerName="ovnkube-controller" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.682061 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8" containerName="ovnkube-controller" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.682072 4732 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="4ac602fa-14af-4ae0-a538-d73e938db036" containerName="registry" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.682080 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8" containerName="kube-rbac-proxy-ovn-metrics" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.682089 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8" containerName="kube-rbac-proxy-node" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.682099 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8" containerName="northd" Jan 31 09:10:07 crc kubenswrapper[4732]: E0131 09:10:07.682197 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8" containerName="ovnkube-controller" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.682207 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8" containerName="ovnkube-controller" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.682290 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8" containerName="ovnkube-controller" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.683781 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bqvt7" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.770013 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-host-cni-netd\") pod \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\" (UID: \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\") " Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.770107 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-host-slash\") pod \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\" (UID: \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\") " Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.770159 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8" (UID: "82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.770207 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-ovnkube-config\") pod \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\" (UID: \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\") " Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.770229 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-host-slash" (OuterVolumeSpecName: "host-slash") pod "82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8" (UID: "82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.770258 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-host-kubelet\") pod \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\" (UID: \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\") " Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.770321 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-run-systemd\") pod \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\" (UID: \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\") " Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.770373 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8" (UID: "82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.770382 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\" (UID: \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\") " Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.770446 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-env-overrides\") pod \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\" (UID: \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\") " Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.770487 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-run-openvswitch\") pod \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\" (UID: \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\") " Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.770547 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jktvz\" (UniqueName: \"kubernetes.io/projected/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-kube-api-access-jktvz\") pod \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\" (UID: \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\") " Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.770598 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-var-lib-openvswitch\") pod \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\" (UID: \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\") " Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.770537 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8" (UID: "82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.770588 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8" (UID: "82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.770638 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-host-run-netns\") pod \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\" (UID: \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\") " Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.770732 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8" (UID: "82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.770762 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-ovnkube-script-lib\") pod \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\" (UID: \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\") " Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.770789 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-host-run-ovn-kubernetes\") pod \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\" (UID: \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\") " Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.770806 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-node-log\") pod \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\" (UID: \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\") " Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.770807 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8" (UID: "82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.770870 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-host-cni-bin\") pod \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\" (UID: \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\") " Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.770867 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8" (UID: "82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8"). 
InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.770893 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-ovn-node-metrics-cert\") pod \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\" (UID: \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\") " Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.770911 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-systemd-units\") pod \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\" (UID: \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\") " Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.770919 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-node-log" (OuterVolumeSpecName: "node-log") pod "82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8" (UID: "82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.770927 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-log-socket\") pod \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\" (UID: \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\") " Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.770950 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-log-socket" (OuterVolumeSpecName: "log-socket") pod "82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8" (UID: "82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.770976 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8" (UID: "82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.770994 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-run-ovn\") pod \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\" (UID: \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\") " Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.771015 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8" (UID: "82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.771014 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8" (UID: "82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.771040 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-etc-openvswitch\") pod \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\" (UID: \"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8\") " Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.771045 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8" (UID: "82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.771082 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8" (UID: "82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.771144 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8" (UID: "82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.771203 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8" (UID: "82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.771240 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5-host-run-netns\") pod \"ovnkube-node-bqvt7\" (UID: \"9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqvt7" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.771277 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5-systemd-units\") pod \"ovnkube-node-bqvt7\" (UID: \"9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqvt7" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.771302 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5-host-slash\") pod \"ovnkube-node-bqvt7\" (UID: \"9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqvt7" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.771322 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5-ovnkube-config\") pod \"ovnkube-node-bqvt7\" (UID: \"9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqvt7" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.771391 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrxqq\" (UniqueName: \"kubernetes.io/projected/9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5-kube-api-access-mrxqq\") pod \"ovnkube-node-bqvt7\" (UID: \"9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqvt7" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.771421 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5-host-kubelet\") pod \"ovnkube-node-bqvt7\" (UID: \"9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqvt7" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.771466 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5-run-openvswitch\") pod \"ovnkube-node-bqvt7\" (UID: \"9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqvt7" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.771487 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5-ovn-node-metrics-cert\") pod \"ovnkube-node-bqvt7\" (UID: \"9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqvt7" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.771653 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5-host-cni-netd\") pod \"ovnkube-node-bqvt7\" (UID: \"9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqvt7" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.772082 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5-ovnkube-script-lib\") pod \"ovnkube-node-bqvt7\" (UID: \"9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqvt7" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.772202 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5-etc-openvswitch\") pod \"ovnkube-node-bqvt7\" (UID: \"9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqvt7" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.772307 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5-log-socket\") pod \"ovnkube-node-bqvt7\" (UID: \"9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqvt7" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.772360 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bqvt7\" (UID: \"9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqvt7" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.772410 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5-env-overrides\") pod \"ovnkube-node-bqvt7\" (UID: \"9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqvt7" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.772469 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5-host-run-ovn-kubernetes\") pod \"ovnkube-node-bqvt7\" (UID: \"9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqvt7" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.772499 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5-run-ovn\") pod \"ovnkube-node-bqvt7\" (UID: \"9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqvt7" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.772557 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5-var-lib-openvswitch\") pod \"ovnkube-node-bqvt7\" (UID: \"9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqvt7" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 
09:10:07.772578 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5-node-log\") pod \"ovnkube-node-bqvt7\" (UID: \"9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqvt7" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.772603 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5-run-systemd\") pod \"ovnkube-node-bqvt7\" (UID: \"9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqvt7" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.772624 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5-host-cni-bin\") pod \"ovnkube-node-bqvt7\" (UID: \"9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqvt7" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.772754 4732 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-run-ovn\") on node \"crc\" DevicePath \"\"" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.772771 4732 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.772784 4732 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-host-cni-netd\") on node \"crc\" DevicePath \"\"" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.772795 4732 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-host-slash\") on node \"crc\" DevicePath \"\"" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.772809 4732 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.772820 4732 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-host-kubelet\") on node \"crc\" DevicePath \"\"" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.772831 4732 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.772844 4732 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.772855 4732 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-run-openvswitch\") on node 
\"crc\" DevicePath \"\"" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.772866 4732 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.772878 4732 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-host-run-netns\") on node \"crc\" DevicePath \"\"" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.772889 4732 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.772901 4732 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.772914 4732 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-node-log\") on node \"crc\" DevicePath \"\"" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.772951 4732 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-host-cni-bin\") on node \"crc\" DevicePath \"\"" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.772963 4732 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-systemd-units\") on node \"crc\" DevicePath \"\"" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.772973 4732 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-log-socket\") on node \"crc\" DevicePath \"\"" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.776097 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-kube-api-access-jktvz" (OuterVolumeSpecName: "kube-api-access-jktvz") pod "82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8" (UID: "82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8"). InnerVolumeSpecName "kube-api-access-jktvz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.777328 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8" (UID: "82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.785735 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8" (UID: "82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.875461 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrxqq\" (UniqueName: \"kubernetes.io/projected/9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5-kube-api-access-mrxqq\") pod \"ovnkube-node-bqvt7\" (UID: \"9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqvt7" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.875535 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5-host-kubelet\") pod \"ovnkube-node-bqvt7\" (UID: \"9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqvt7" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.875589 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5-run-openvswitch\") pod \"ovnkube-node-bqvt7\" (UID: \"9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqvt7" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.875609 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5-ovn-node-metrics-cert\") pod \"ovnkube-node-bqvt7\" (UID: \"9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqvt7" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.875652 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5-host-cni-netd\") pod \"ovnkube-node-bqvt7\" (UID: \"9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqvt7" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.875683 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5-ovnkube-script-lib\") pod \"ovnkube-node-bqvt7\" (UID: \"9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqvt7" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.875704 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5-etc-openvswitch\") pod \"ovnkube-node-bqvt7\" (UID: \"9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqvt7" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.875730 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5-log-socket\") pod \"ovnkube-node-bqvt7\" (UID: \"9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqvt7" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.875728 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5-host-kubelet\") pod \"ovnkube-node-bqvt7\" (UID: \"9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqvt7" Jan 31 09:10:07 crc 
kubenswrapper[4732]: I0131 09:10:07.875787 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bqvt7\" (UID: \"9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqvt7" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.875748 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bqvt7\" (UID: \"9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqvt7" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.875812 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5-host-cni-netd\") pod \"ovnkube-node-bqvt7\" (UID: \"9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqvt7" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.875830 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5-env-overrides\") pod \"ovnkube-node-bqvt7\" (UID: \"9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqvt7" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.875847 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5-run-openvswitch\") pod \"ovnkube-node-bqvt7\" (UID: \"9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqvt7" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.875862 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5-host-run-ovn-kubernetes\") pod \"ovnkube-node-bqvt7\" (UID: \"9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqvt7" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.875887 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5-run-ovn\") pod \"ovnkube-node-bqvt7\" (UID: \"9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqvt7" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.875915 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5-var-lib-openvswitch\") pod \"ovnkube-node-bqvt7\" (UID: \"9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqvt7" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.875931 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5-node-log\") pod \"ovnkube-node-bqvt7\" (UID: \"9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqvt7" Jan 31 09:10:07 crc 
kubenswrapper[4732]: I0131 09:10:07.875925 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5-etc-openvswitch\") pod \"ovnkube-node-bqvt7\" (UID: \"9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqvt7" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.875963 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5-run-systemd\") pod \"ovnkube-node-bqvt7\" (UID: \"9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqvt7" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.875946 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5-run-systemd\") pod \"ovnkube-node-bqvt7\" (UID: \"9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqvt7" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.875988 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5-log-socket\") pod \"ovnkube-node-bqvt7\" (UID: \"9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqvt7" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.876002 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5-host-cni-bin\") pod \"ovnkube-node-bqvt7\" (UID: \"9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqvt7" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.876036 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5-host-run-netns\") pod \"ovnkube-node-bqvt7\" (UID: \"9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqvt7" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.876065 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5-systemd-units\") pod \"ovnkube-node-bqvt7\" (UID: \"9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqvt7" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.876093 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5-host-slash\") pod \"ovnkube-node-bqvt7\" (UID: \"9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqvt7" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.876120 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5-ovnkube-config\") pod \"ovnkube-node-bqvt7\" (UID: \"9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqvt7" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.876217 4732 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-run-systemd\") on node \"crc\" DevicePath \"\"" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.876230 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jktvz\" (UniqueName: \"kubernetes.io/projected/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-kube-api-access-jktvz\") on node \"crc\" DevicePath \"\"" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.876240 4732 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.876455 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5-run-ovn\") pod \"ovnkube-node-bqvt7\" (UID: \"9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqvt7" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.876494 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5-host-run-ovn-kubernetes\") pod \"ovnkube-node-bqvt7\" (UID: \"9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqvt7" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.876520 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5-host-run-netns\") pod \"ovnkube-node-bqvt7\" (UID: \"9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqvt7" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.876544 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5-var-lib-openvswitch\") pod \"ovnkube-node-bqvt7\" (UID: \"9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqvt7" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.876589 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5-systemd-units\") pod \"ovnkube-node-bqvt7\" (UID: \"9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqvt7" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.876623 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5-host-slash\") pod \"ovnkube-node-bqvt7\" (UID: \"9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqvt7" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.876626 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5-node-log\") pod \"ovnkube-node-bqvt7\" (UID: \"9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqvt7" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.876683 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5-env-overrides\") pod 
\"ovnkube-node-bqvt7\" (UID: \"9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqvt7" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.876709 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5-host-cni-bin\") pod \"ovnkube-node-bqvt7\" (UID: \"9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqvt7" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.876975 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5-ovnkube-config\") pod \"ovnkube-node-bqvt7\" (UID: \"9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqvt7" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.877106 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5-ovnkube-script-lib\") pod \"ovnkube-node-bqvt7\" (UID: \"9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqvt7" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.879136 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5-ovn-node-metrics-cert\") pod \"ovnkube-node-bqvt7\" (UID: \"9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqvt7" Jan 31 09:10:07 crc kubenswrapper[4732]: I0131 09:10:07.889994 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrxqq\" (UniqueName: \"kubernetes.io/projected/9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5-kube-api-access-mrxqq\") pod \"ovnkube-node-bqvt7\" (UID: \"9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqvt7" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.006259 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bqvt7" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.031228 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqvt7" event={"ID":"9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5","Type":"ContainerStarted","Data":"665c5e4eed75a7701e236236d51ecb4c2a2ec042f26ac339fa76cfae9e3def62"} Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.033767 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8mtkt_82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8/ovnkube-controller/3.log" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.037325 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8mtkt_82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8/ovn-acl-logging/0.log" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.037987 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8mtkt_82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8/ovn-controller/0.log" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.038346 4732 generic.go:334] "Generic (PLEG): container finished" podID="82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8" containerID="b950d29df580418b6e7082c9a672624c14f714944ccbf4cd1d3824810d885ac4" exitCode=0 Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.038377 4732 generic.go:334] "Generic (PLEG): container finished" podID="82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8" containerID="4cd519944f0ae0ee9920a6a54de3dd40c73a65d0c907f625c57d7648a6d0a715" exitCode=0 Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.038387 4732 generic.go:334] "Generic (PLEG): container finished" podID="82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8" containerID="6b26417bd7817c984a59723e2231e3a43c554022aef67eacf300adee0166c3db" exitCode=0 Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.038399 4732 generic.go:334] "Generic (PLEG): container finished" podID="82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8" containerID="c79f404e5bc3d6dc83681d68605b1f26565813ccec34c6d36ee9f21686ca4f20" exitCode=0 Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.038408 4732 generic.go:334] "Generic (PLEG): container finished" podID="82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8" containerID="8fde05e46badc11b2174b88e2590402436bb669bcbd017ceb4d11d25066b421b" exitCode=0 Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.038419 4732 generic.go:334] "Generic (PLEG): container finished" podID="82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8" containerID="8521ee1e84f540faa7b093ecd0bb7761ae38487a6c7abf52028fe820ea9a4c1a" exitCode=0 Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.038430 4732 generic.go:334] "Generic (PLEG): container finished" podID="82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8" containerID="075418297d972e23003b3ce2016f7709d138316c9d08837dc484dadf4d7f4a81" exitCode=143 Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.038439 4732 generic.go:334] "Generic (PLEG): container finished" podID="82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8" containerID="792e702a9fa66ea68366e8f88d976f7e280223815c4b70258566c32faaed57a8" exitCode=143 Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.038432 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" event={"ID":"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8","Type":"ContainerDied","Data":"b950d29df580418b6e7082c9a672624c14f714944ccbf4cd1d3824810d885ac4"} Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.038484 4732 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.038514 4732 scope.go:117] "RemoveContainer" containerID="b950d29df580418b6e7082c9a672624c14f714944ccbf4cd1d3824810d885ac4" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.038488 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" event={"ID":"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8","Type":"ContainerDied","Data":"4cd519944f0ae0ee9920a6a54de3dd40c73a65d0c907f625c57d7648a6d0a715"} Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.038597 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" event={"ID":"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8","Type":"ContainerDied","Data":"6b26417bd7817c984a59723e2231e3a43c554022aef67eacf300adee0166c3db"} Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.038615 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" event={"ID":"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8","Type":"ContainerDied","Data":"c79f404e5bc3d6dc83681d68605b1f26565813ccec34c6d36ee9f21686ca4f20"} Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.038627 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" event={"ID":"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8","Type":"ContainerDied","Data":"8fde05e46badc11b2174b88e2590402436bb669bcbd017ceb4d11d25066b421b"} Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.038640 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" event={"ID":"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8","Type":"ContainerDied","Data":"8521ee1e84f540faa7b093ecd0bb7761ae38487a6c7abf52028fe820ea9a4c1a"} Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.038653 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b77508f48f518644f99bdda577737a5ed3b909f97a70c1c297c035609ba46fa1"} Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.038689 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4cd519944f0ae0ee9920a6a54de3dd40c73a65d0c907f625c57d7648a6d0a715"} Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.038697 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6b26417bd7817c984a59723e2231e3a43c554022aef67eacf300adee0166c3db"} Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.038704 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c79f404e5bc3d6dc83681d68605b1f26565813ccec34c6d36ee9f21686ca4f20"} Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.038711 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8fde05e46badc11b2174b88e2590402436bb669bcbd017ceb4d11d25066b421b"} Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.038720 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8521ee1e84f540faa7b093ecd0bb7761ae38487a6c7abf52028fe820ea9a4c1a"} Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.038727 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"075418297d972e23003b3ce2016f7709d138316c9d08837dc484dadf4d7f4a81"} Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.038734 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"792e702a9fa66ea68366e8f88d976f7e280223815c4b70258566c32faaed57a8"} Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.038741 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cfb08df78497d5ae0eb39af9a06378ea80914d9b4d7014dd85d930f54c733193"} Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.038752 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" event={"ID":"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8","Type":"ContainerDied","Data":"075418297d972e23003b3ce2016f7709d138316c9d08837dc484dadf4d7f4a81"} Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.038765 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b950d29df580418b6e7082c9a672624c14f714944ccbf4cd1d3824810d885ac4"} Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.038774 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b77508f48f518644f99bdda577737a5ed3b909f97a70c1c297c035609ba46fa1"} Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.038782 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4cd519944f0ae0ee9920a6a54de3dd40c73a65d0c907f625c57d7648a6d0a715"} Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.038790 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6b26417bd7817c984a59723e2231e3a43c554022aef67eacf300adee0166c3db"} Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.038797 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c79f404e5bc3d6dc83681d68605b1f26565813ccec34c6d36ee9f21686ca4f20"} Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.038804 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8fde05e46badc11b2174b88e2590402436bb669bcbd017ceb4d11d25066b421b"} Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.038811 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8521ee1e84f540faa7b093ecd0bb7761ae38487a6c7abf52028fe820ea9a4c1a"} Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.038818 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"075418297d972e23003b3ce2016f7709d138316c9d08837dc484dadf4d7f4a81"} Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.038824 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"792e702a9fa66ea68366e8f88d976f7e280223815c4b70258566c32faaed57a8"} Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.038832 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cfb08df78497d5ae0eb39af9a06378ea80914d9b4d7014dd85d930f54c733193"} Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.038841 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" event={"ID":"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8","Type":"ContainerDied","Data":"792e702a9fa66ea68366e8f88d976f7e280223815c4b70258566c32faaed57a8"} Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.038853 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b950d29df580418b6e7082c9a672624c14f714944ccbf4cd1d3824810d885ac4"} Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.038863 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b77508f48f518644f99bdda577737a5ed3b909f97a70c1c297c035609ba46fa1"} Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.038870 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4cd519944f0ae0ee9920a6a54de3dd40c73a65d0c907f625c57d7648a6d0a715"} Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.038879 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6b26417bd7817c984a59723e2231e3a43c554022aef67eacf300adee0166c3db"} Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.038886 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c79f404e5bc3d6dc83681d68605b1f26565813ccec34c6d36ee9f21686ca4f20"} Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.038894 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8fde05e46badc11b2174b88e2590402436bb669bcbd017ceb4d11d25066b421b"} Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.038901 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8521ee1e84f540faa7b093ecd0bb7761ae38487a6c7abf52028fe820ea9a4c1a"} Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.038908 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"075418297d972e23003b3ce2016f7709d138316c9d08837dc484dadf4d7f4a81"} Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.038915 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"792e702a9fa66ea68366e8f88d976f7e280223815c4b70258566c32faaed57a8"} Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.038922 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cfb08df78497d5ae0eb39af9a06378ea80914d9b4d7014dd85d930f54c733193"} Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.038932 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8mtkt" event={"ID":"82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8","Type":"ContainerDied","Data":"8ed5be886bc7763adb1d7a0a054a6dd73cde6a707faa32148f1f5ddc889335e4"} Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.038943 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b950d29df580418b6e7082c9a672624c14f714944ccbf4cd1d3824810d885ac4"} Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.038952 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b77508f48f518644f99bdda577737a5ed3b909f97a70c1c297c035609ba46fa1"} Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 
09:10:08.038959 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4cd519944f0ae0ee9920a6a54de3dd40c73a65d0c907f625c57d7648a6d0a715"} Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.038967 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6b26417bd7817c984a59723e2231e3a43c554022aef67eacf300adee0166c3db"} Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.038974 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c79f404e5bc3d6dc83681d68605b1f26565813ccec34c6d36ee9f21686ca4f20"} Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.038981 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8fde05e46badc11b2174b88e2590402436bb669bcbd017ceb4d11d25066b421b"} Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.038988 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8521ee1e84f540faa7b093ecd0bb7761ae38487a6c7abf52028fe820ea9a4c1a"} Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.038994 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"075418297d972e23003b3ce2016f7709d138316c9d08837dc484dadf4d7f4a81"} Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.039001 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"792e702a9fa66ea68366e8f88d976f7e280223815c4b70258566c32faaed57a8"} Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.039009 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cfb08df78497d5ae0eb39af9a06378ea80914d9b4d7014dd85d930f54c733193"} Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.043045 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-4mxsr_8e23192f-14db-41ef-af89-4a76e325d9c1/kube-multus/2.log" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.043610 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-4mxsr_8e23192f-14db-41ef-af89-4a76e325d9c1/kube-multus/1.log" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.043652 4732 generic.go:334] "Generic (PLEG): container finished" podID="8e23192f-14db-41ef-af89-4a76e325d9c1" containerID="98e5c23e9a8bde55626defc76f20f8510954c4ef79d762950e0790a7de4dce4f" exitCode=2 Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.043727 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4mxsr" event={"ID":"8e23192f-14db-41ef-af89-4a76e325d9c1","Type":"ContainerDied","Data":"98e5c23e9a8bde55626defc76f20f8510954c4ef79d762950e0790a7de4dce4f"} Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.043750 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"456969ec8d447fd7f9acd1803b222ebfb16b02c8dee1959dd936eec29aa1d617"} Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.044356 4732 scope.go:117] "RemoveContainer" containerID="98e5c23e9a8bde55626defc76f20f8510954c4ef79d762950e0790a7de4dce4f" Jan 31 09:10:08 crc kubenswrapper[4732]: E0131 09:10:08.044581 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: 
\"back-off 20s restarting failed container=kube-multus pod=multus-4mxsr_openshift-multus(8e23192f-14db-41ef-af89-4a76e325d9c1)\"" pod="openshift-multus/multus-4mxsr" podUID="8e23192f-14db-41ef-af89-4a76e325d9c1" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.090598 4732 scope.go:117] "RemoveContainer" containerID="b77508f48f518644f99bdda577737a5ed3b909f97a70c1c297c035609ba46fa1" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.100662 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-8mtkt"] Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.105452 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-8mtkt"] Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.114140 4732 scope.go:117] "RemoveContainer" containerID="4cd519944f0ae0ee9920a6a54de3dd40c73a65d0c907f625c57d7648a6d0a715" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.125626 4732 scope.go:117] "RemoveContainer" containerID="6b26417bd7817c984a59723e2231e3a43c554022aef67eacf300adee0166c3db" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.136466 4732 scope.go:117] "RemoveContainer" containerID="c79f404e5bc3d6dc83681d68605b1f26565813ccec34c6d36ee9f21686ca4f20" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.146687 4732 scope.go:117] "RemoveContainer" containerID="8fde05e46badc11b2174b88e2590402436bb669bcbd017ceb4d11d25066b421b" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.159220 4732 scope.go:117] "RemoveContainer" containerID="8521ee1e84f540faa7b093ecd0bb7761ae38487a6c7abf52028fe820ea9a4c1a" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.170253 4732 scope.go:117] "RemoveContainer" containerID="075418297d972e23003b3ce2016f7709d138316c9d08837dc484dadf4d7f4a81" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.181203 4732 scope.go:117] "RemoveContainer" containerID="792e702a9fa66ea68366e8f88d976f7e280223815c4b70258566c32faaed57a8" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.192370 4732 scope.go:117] "RemoveContainer" containerID="cfb08df78497d5ae0eb39af9a06378ea80914d9b4d7014dd85d930f54c733193" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.203654 4732 scope.go:117] "RemoveContainer" containerID="b950d29df580418b6e7082c9a672624c14f714944ccbf4cd1d3824810d885ac4" Jan 31 09:10:08 crc kubenswrapper[4732]: E0131 09:10:08.204080 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b950d29df580418b6e7082c9a672624c14f714944ccbf4cd1d3824810d885ac4\": container with ID starting with b950d29df580418b6e7082c9a672624c14f714944ccbf4cd1d3824810d885ac4 not found: ID does not exist" containerID="b950d29df580418b6e7082c9a672624c14f714944ccbf4cd1d3824810d885ac4" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.204126 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b950d29df580418b6e7082c9a672624c14f714944ccbf4cd1d3824810d885ac4"} err="failed to get container status \"b950d29df580418b6e7082c9a672624c14f714944ccbf4cd1d3824810d885ac4\": rpc error: code = NotFound desc = could not find container \"b950d29df580418b6e7082c9a672624c14f714944ccbf4cd1d3824810d885ac4\": container with ID starting with b950d29df580418b6e7082c9a672624c14f714944ccbf4cd1d3824810d885ac4 not found: ID does not exist" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.204157 4732 scope.go:117] "RemoveContainer" 
containerID="b77508f48f518644f99bdda577737a5ed3b909f97a70c1c297c035609ba46fa1" Jan 31 09:10:08 crc kubenswrapper[4732]: E0131 09:10:08.204415 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b77508f48f518644f99bdda577737a5ed3b909f97a70c1c297c035609ba46fa1\": container with ID starting with b77508f48f518644f99bdda577737a5ed3b909f97a70c1c297c035609ba46fa1 not found: ID does not exist" containerID="b77508f48f518644f99bdda577737a5ed3b909f97a70c1c297c035609ba46fa1" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.204465 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b77508f48f518644f99bdda577737a5ed3b909f97a70c1c297c035609ba46fa1"} err="failed to get container status \"b77508f48f518644f99bdda577737a5ed3b909f97a70c1c297c035609ba46fa1\": rpc error: code = NotFound desc = could not find container \"b77508f48f518644f99bdda577737a5ed3b909f97a70c1c297c035609ba46fa1\": container with ID starting with b77508f48f518644f99bdda577737a5ed3b909f97a70c1c297c035609ba46fa1 not found: ID does not exist" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.204503 4732 scope.go:117] "RemoveContainer" containerID="4cd519944f0ae0ee9920a6a54de3dd40c73a65d0c907f625c57d7648a6d0a715" Jan 31 09:10:08 crc kubenswrapper[4732]: E0131 09:10:08.204950 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4cd519944f0ae0ee9920a6a54de3dd40c73a65d0c907f625c57d7648a6d0a715\": container with ID starting with 4cd519944f0ae0ee9920a6a54de3dd40c73a65d0c907f625c57d7648a6d0a715 not found: ID does not exist" containerID="4cd519944f0ae0ee9920a6a54de3dd40c73a65d0c907f625c57d7648a6d0a715" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.204980 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cd519944f0ae0ee9920a6a54de3dd40c73a65d0c907f625c57d7648a6d0a715"} err="failed to get container status \"4cd519944f0ae0ee9920a6a54de3dd40c73a65d0c907f625c57d7648a6d0a715\": rpc error: code = NotFound desc = could not find container \"4cd519944f0ae0ee9920a6a54de3dd40c73a65d0c907f625c57d7648a6d0a715\": container with ID starting with 4cd519944f0ae0ee9920a6a54de3dd40c73a65d0c907f625c57d7648a6d0a715 not found: ID does not exist" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.205001 4732 scope.go:117] "RemoveContainer" containerID="6b26417bd7817c984a59723e2231e3a43c554022aef67eacf300adee0166c3db" Jan 31 09:10:08 crc kubenswrapper[4732]: E0131 09:10:08.205239 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b26417bd7817c984a59723e2231e3a43c554022aef67eacf300adee0166c3db\": container with ID starting with 6b26417bd7817c984a59723e2231e3a43c554022aef67eacf300adee0166c3db not found: ID does not exist" containerID="6b26417bd7817c984a59723e2231e3a43c554022aef67eacf300adee0166c3db" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.205281 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b26417bd7817c984a59723e2231e3a43c554022aef67eacf300adee0166c3db"} err="failed to get container status \"6b26417bd7817c984a59723e2231e3a43c554022aef67eacf300adee0166c3db\": rpc error: code = NotFound desc = could not find container \"6b26417bd7817c984a59723e2231e3a43c554022aef67eacf300adee0166c3db\": container with ID starting with 
6b26417bd7817c984a59723e2231e3a43c554022aef67eacf300adee0166c3db not found: ID does not exist" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.205308 4732 scope.go:117] "RemoveContainer" containerID="c79f404e5bc3d6dc83681d68605b1f26565813ccec34c6d36ee9f21686ca4f20" Jan 31 09:10:08 crc kubenswrapper[4732]: E0131 09:10:08.205589 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c79f404e5bc3d6dc83681d68605b1f26565813ccec34c6d36ee9f21686ca4f20\": container with ID starting with c79f404e5bc3d6dc83681d68605b1f26565813ccec34c6d36ee9f21686ca4f20 not found: ID does not exist" containerID="c79f404e5bc3d6dc83681d68605b1f26565813ccec34c6d36ee9f21686ca4f20" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.205626 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c79f404e5bc3d6dc83681d68605b1f26565813ccec34c6d36ee9f21686ca4f20"} err="failed to get container status \"c79f404e5bc3d6dc83681d68605b1f26565813ccec34c6d36ee9f21686ca4f20\": rpc error: code = NotFound desc = could not find container \"c79f404e5bc3d6dc83681d68605b1f26565813ccec34c6d36ee9f21686ca4f20\": container with ID starting with c79f404e5bc3d6dc83681d68605b1f26565813ccec34c6d36ee9f21686ca4f20 not found: ID does not exist" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.205648 4732 scope.go:117] "RemoveContainer" containerID="8fde05e46badc11b2174b88e2590402436bb669bcbd017ceb4d11d25066b421b" Jan 31 09:10:08 crc kubenswrapper[4732]: E0131 09:10:08.205918 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8fde05e46badc11b2174b88e2590402436bb669bcbd017ceb4d11d25066b421b\": container with ID starting with 8fde05e46badc11b2174b88e2590402436bb669bcbd017ceb4d11d25066b421b not found: ID does not exist" containerID="8fde05e46badc11b2174b88e2590402436bb669bcbd017ceb4d11d25066b421b" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.205941 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8fde05e46badc11b2174b88e2590402436bb669bcbd017ceb4d11d25066b421b"} err="failed to get container status \"8fde05e46badc11b2174b88e2590402436bb669bcbd017ceb4d11d25066b421b\": rpc error: code = NotFound desc = could not find container \"8fde05e46badc11b2174b88e2590402436bb669bcbd017ceb4d11d25066b421b\": container with ID starting with 8fde05e46badc11b2174b88e2590402436bb669bcbd017ceb4d11d25066b421b not found: ID does not exist" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.205954 4732 scope.go:117] "RemoveContainer" containerID="8521ee1e84f540faa7b093ecd0bb7761ae38487a6c7abf52028fe820ea9a4c1a" Jan 31 09:10:08 crc kubenswrapper[4732]: E0131 09:10:08.206150 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8521ee1e84f540faa7b093ecd0bb7761ae38487a6c7abf52028fe820ea9a4c1a\": container with ID starting with 8521ee1e84f540faa7b093ecd0bb7761ae38487a6c7abf52028fe820ea9a4c1a not found: ID does not exist" containerID="8521ee1e84f540faa7b093ecd0bb7761ae38487a6c7abf52028fe820ea9a4c1a" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.206170 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8521ee1e84f540faa7b093ecd0bb7761ae38487a6c7abf52028fe820ea9a4c1a"} err="failed to get container status \"8521ee1e84f540faa7b093ecd0bb7761ae38487a6c7abf52028fe820ea9a4c1a\": rpc 
error: code = NotFound desc = could not find container \"8521ee1e84f540faa7b093ecd0bb7761ae38487a6c7abf52028fe820ea9a4c1a\": container with ID starting with 8521ee1e84f540faa7b093ecd0bb7761ae38487a6c7abf52028fe820ea9a4c1a not found: ID does not exist" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.206189 4732 scope.go:117] "RemoveContainer" containerID="075418297d972e23003b3ce2016f7709d138316c9d08837dc484dadf4d7f4a81" Jan 31 09:10:08 crc kubenswrapper[4732]: E0131 09:10:08.206459 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"075418297d972e23003b3ce2016f7709d138316c9d08837dc484dadf4d7f4a81\": container with ID starting with 075418297d972e23003b3ce2016f7709d138316c9d08837dc484dadf4d7f4a81 not found: ID does not exist" containerID="075418297d972e23003b3ce2016f7709d138316c9d08837dc484dadf4d7f4a81" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.206486 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"075418297d972e23003b3ce2016f7709d138316c9d08837dc484dadf4d7f4a81"} err="failed to get container status \"075418297d972e23003b3ce2016f7709d138316c9d08837dc484dadf4d7f4a81\": rpc error: code = NotFound desc = could not find container \"075418297d972e23003b3ce2016f7709d138316c9d08837dc484dadf4d7f4a81\": container with ID starting with 075418297d972e23003b3ce2016f7709d138316c9d08837dc484dadf4d7f4a81 not found: ID does not exist" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.206502 4732 scope.go:117] "RemoveContainer" containerID="792e702a9fa66ea68366e8f88d976f7e280223815c4b70258566c32faaed57a8" Jan 31 09:10:08 crc kubenswrapper[4732]: E0131 09:10:08.206729 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"792e702a9fa66ea68366e8f88d976f7e280223815c4b70258566c32faaed57a8\": container with ID starting with 792e702a9fa66ea68366e8f88d976f7e280223815c4b70258566c32faaed57a8 not found: ID does not exist" containerID="792e702a9fa66ea68366e8f88d976f7e280223815c4b70258566c32faaed57a8" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.206762 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"792e702a9fa66ea68366e8f88d976f7e280223815c4b70258566c32faaed57a8"} err="failed to get container status \"792e702a9fa66ea68366e8f88d976f7e280223815c4b70258566c32faaed57a8\": rpc error: code = NotFound desc = could not find container \"792e702a9fa66ea68366e8f88d976f7e280223815c4b70258566c32faaed57a8\": container with ID starting with 792e702a9fa66ea68366e8f88d976f7e280223815c4b70258566c32faaed57a8 not found: ID does not exist" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.206782 4732 scope.go:117] "RemoveContainer" containerID="cfb08df78497d5ae0eb39af9a06378ea80914d9b4d7014dd85d930f54c733193" Jan 31 09:10:08 crc kubenswrapper[4732]: E0131 09:10:08.207070 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cfb08df78497d5ae0eb39af9a06378ea80914d9b4d7014dd85d930f54c733193\": container with ID starting with cfb08df78497d5ae0eb39af9a06378ea80914d9b4d7014dd85d930f54c733193 not found: ID does not exist" containerID="cfb08df78497d5ae0eb39af9a06378ea80914d9b4d7014dd85d930f54c733193" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.207095 4732 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"cfb08df78497d5ae0eb39af9a06378ea80914d9b4d7014dd85d930f54c733193"} err="failed to get container status \"cfb08df78497d5ae0eb39af9a06378ea80914d9b4d7014dd85d930f54c733193\": rpc error: code = NotFound desc = could not find container \"cfb08df78497d5ae0eb39af9a06378ea80914d9b4d7014dd85d930f54c733193\": container with ID starting with cfb08df78497d5ae0eb39af9a06378ea80914d9b4d7014dd85d930f54c733193 not found: ID does not exist" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.207111 4732 scope.go:117] "RemoveContainer" containerID="b950d29df580418b6e7082c9a672624c14f714944ccbf4cd1d3824810d885ac4" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.207348 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b950d29df580418b6e7082c9a672624c14f714944ccbf4cd1d3824810d885ac4"} err="failed to get container status \"b950d29df580418b6e7082c9a672624c14f714944ccbf4cd1d3824810d885ac4\": rpc error: code = NotFound desc = could not find container \"b950d29df580418b6e7082c9a672624c14f714944ccbf4cd1d3824810d885ac4\": container with ID starting with b950d29df580418b6e7082c9a672624c14f714944ccbf4cd1d3824810d885ac4 not found: ID does not exist" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.207366 4732 scope.go:117] "RemoveContainer" containerID="b77508f48f518644f99bdda577737a5ed3b909f97a70c1c297c035609ba46fa1" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.207621 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b77508f48f518644f99bdda577737a5ed3b909f97a70c1c297c035609ba46fa1"} err="failed to get container status \"b77508f48f518644f99bdda577737a5ed3b909f97a70c1c297c035609ba46fa1\": rpc error: code = NotFound desc = could not find container \"b77508f48f518644f99bdda577737a5ed3b909f97a70c1c297c035609ba46fa1\": container with ID starting with b77508f48f518644f99bdda577737a5ed3b909f97a70c1c297c035609ba46fa1 not found: ID does not exist" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.207644 4732 scope.go:117] "RemoveContainer" containerID="4cd519944f0ae0ee9920a6a54de3dd40c73a65d0c907f625c57d7648a6d0a715" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.207858 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cd519944f0ae0ee9920a6a54de3dd40c73a65d0c907f625c57d7648a6d0a715"} err="failed to get container status \"4cd519944f0ae0ee9920a6a54de3dd40c73a65d0c907f625c57d7648a6d0a715\": rpc error: code = NotFound desc = could not find container \"4cd519944f0ae0ee9920a6a54de3dd40c73a65d0c907f625c57d7648a6d0a715\": container with ID starting with 4cd519944f0ae0ee9920a6a54de3dd40c73a65d0c907f625c57d7648a6d0a715 not found: ID does not exist" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.207882 4732 scope.go:117] "RemoveContainer" containerID="6b26417bd7817c984a59723e2231e3a43c554022aef67eacf300adee0166c3db" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.208118 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b26417bd7817c984a59723e2231e3a43c554022aef67eacf300adee0166c3db"} err="failed to get container status \"6b26417bd7817c984a59723e2231e3a43c554022aef67eacf300adee0166c3db\": rpc error: code = NotFound desc = could not find container \"6b26417bd7817c984a59723e2231e3a43c554022aef67eacf300adee0166c3db\": container with ID starting with 6b26417bd7817c984a59723e2231e3a43c554022aef67eacf300adee0166c3db not found: ID does not exist" Jan 
31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.208167 4732 scope.go:117] "RemoveContainer" containerID="c79f404e5bc3d6dc83681d68605b1f26565813ccec34c6d36ee9f21686ca4f20" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.208402 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c79f404e5bc3d6dc83681d68605b1f26565813ccec34c6d36ee9f21686ca4f20"} err="failed to get container status \"c79f404e5bc3d6dc83681d68605b1f26565813ccec34c6d36ee9f21686ca4f20\": rpc error: code = NotFound desc = could not find container \"c79f404e5bc3d6dc83681d68605b1f26565813ccec34c6d36ee9f21686ca4f20\": container with ID starting with c79f404e5bc3d6dc83681d68605b1f26565813ccec34c6d36ee9f21686ca4f20 not found: ID does not exist" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.208424 4732 scope.go:117] "RemoveContainer" containerID="8fde05e46badc11b2174b88e2590402436bb669bcbd017ceb4d11d25066b421b" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.208624 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8fde05e46badc11b2174b88e2590402436bb669bcbd017ceb4d11d25066b421b"} err="failed to get container status \"8fde05e46badc11b2174b88e2590402436bb669bcbd017ceb4d11d25066b421b\": rpc error: code = NotFound desc = could not find container \"8fde05e46badc11b2174b88e2590402436bb669bcbd017ceb4d11d25066b421b\": container with ID starting with 8fde05e46badc11b2174b88e2590402436bb669bcbd017ceb4d11d25066b421b not found: ID does not exist" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.208646 4732 scope.go:117] "RemoveContainer" containerID="8521ee1e84f540faa7b093ecd0bb7761ae38487a6c7abf52028fe820ea9a4c1a" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.208859 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8521ee1e84f540faa7b093ecd0bb7761ae38487a6c7abf52028fe820ea9a4c1a"} err="failed to get container status \"8521ee1e84f540faa7b093ecd0bb7761ae38487a6c7abf52028fe820ea9a4c1a\": rpc error: code = NotFound desc = could not find container \"8521ee1e84f540faa7b093ecd0bb7761ae38487a6c7abf52028fe820ea9a4c1a\": container with ID starting with 8521ee1e84f540faa7b093ecd0bb7761ae38487a6c7abf52028fe820ea9a4c1a not found: ID does not exist" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.208881 4732 scope.go:117] "RemoveContainer" containerID="075418297d972e23003b3ce2016f7709d138316c9d08837dc484dadf4d7f4a81" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.209105 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"075418297d972e23003b3ce2016f7709d138316c9d08837dc484dadf4d7f4a81"} err="failed to get container status \"075418297d972e23003b3ce2016f7709d138316c9d08837dc484dadf4d7f4a81\": rpc error: code = NotFound desc = could not find container \"075418297d972e23003b3ce2016f7709d138316c9d08837dc484dadf4d7f4a81\": container with ID starting with 075418297d972e23003b3ce2016f7709d138316c9d08837dc484dadf4d7f4a81 not found: ID does not exist" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.209126 4732 scope.go:117] "RemoveContainer" containerID="792e702a9fa66ea68366e8f88d976f7e280223815c4b70258566c32faaed57a8" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.209405 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"792e702a9fa66ea68366e8f88d976f7e280223815c4b70258566c32faaed57a8"} err="failed to get container status 
\"792e702a9fa66ea68366e8f88d976f7e280223815c4b70258566c32faaed57a8\": rpc error: code = NotFound desc = could not find container \"792e702a9fa66ea68366e8f88d976f7e280223815c4b70258566c32faaed57a8\": container with ID starting with 792e702a9fa66ea68366e8f88d976f7e280223815c4b70258566c32faaed57a8 not found: ID does not exist" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.209439 4732 scope.go:117] "RemoveContainer" containerID="cfb08df78497d5ae0eb39af9a06378ea80914d9b4d7014dd85d930f54c733193" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.209686 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfb08df78497d5ae0eb39af9a06378ea80914d9b4d7014dd85d930f54c733193"} err="failed to get container status \"cfb08df78497d5ae0eb39af9a06378ea80914d9b4d7014dd85d930f54c733193\": rpc error: code = NotFound desc = could not find container \"cfb08df78497d5ae0eb39af9a06378ea80914d9b4d7014dd85d930f54c733193\": container with ID starting with cfb08df78497d5ae0eb39af9a06378ea80914d9b4d7014dd85d930f54c733193 not found: ID does not exist" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.209713 4732 scope.go:117] "RemoveContainer" containerID="b950d29df580418b6e7082c9a672624c14f714944ccbf4cd1d3824810d885ac4" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.210012 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b950d29df580418b6e7082c9a672624c14f714944ccbf4cd1d3824810d885ac4"} err="failed to get container status \"b950d29df580418b6e7082c9a672624c14f714944ccbf4cd1d3824810d885ac4\": rpc error: code = NotFound desc = could not find container \"b950d29df580418b6e7082c9a672624c14f714944ccbf4cd1d3824810d885ac4\": container with ID starting with b950d29df580418b6e7082c9a672624c14f714944ccbf4cd1d3824810d885ac4 not found: ID does not exist" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.210043 4732 scope.go:117] "RemoveContainer" containerID="b77508f48f518644f99bdda577737a5ed3b909f97a70c1c297c035609ba46fa1" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.210297 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b77508f48f518644f99bdda577737a5ed3b909f97a70c1c297c035609ba46fa1"} err="failed to get container status \"b77508f48f518644f99bdda577737a5ed3b909f97a70c1c297c035609ba46fa1\": rpc error: code = NotFound desc = could not find container \"b77508f48f518644f99bdda577737a5ed3b909f97a70c1c297c035609ba46fa1\": container with ID starting with b77508f48f518644f99bdda577737a5ed3b909f97a70c1c297c035609ba46fa1 not found: ID does not exist" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.210321 4732 scope.go:117] "RemoveContainer" containerID="4cd519944f0ae0ee9920a6a54de3dd40c73a65d0c907f625c57d7648a6d0a715" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.210537 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cd519944f0ae0ee9920a6a54de3dd40c73a65d0c907f625c57d7648a6d0a715"} err="failed to get container status \"4cd519944f0ae0ee9920a6a54de3dd40c73a65d0c907f625c57d7648a6d0a715\": rpc error: code = NotFound desc = could not find container \"4cd519944f0ae0ee9920a6a54de3dd40c73a65d0c907f625c57d7648a6d0a715\": container with ID starting with 4cd519944f0ae0ee9920a6a54de3dd40c73a65d0c907f625c57d7648a6d0a715 not found: ID does not exist" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.210564 4732 scope.go:117] "RemoveContainer" 
containerID="6b26417bd7817c984a59723e2231e3a43c554022aef67eacf300adee0166c3db" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.210827 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b26417bd7817c984a59723e2231e3a43c554022aef67eacf300adee0166c3db"} err="failed to get container status \"6b26417bd7817c984a59723e2231e3a43c554022aef67eacf300adee0166c3db\": rpc error: code = NotFound desc = could not find container \"6b26417bd7817c984a59723e2231e3a43c554022aef67eacf300adee0166c3db\": container with ID starting with 6b26417bd7817c984a59723e2231e3a43c554022aef67eacf300adee0166c3db not found: ID does not exist" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.210851 4732 scope.go:117] "RemoveContainer" containerID="c79f404e5bc3d6dc83681d68605b1f26565813ccec34c6d36ee9f21686ca4f20" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.211109 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c79f404e5bc3d6dc83681d68605b1f26565813ccec34c6d36ee9f21686ca4f20"} err="failed to get container status \"c79f404e5bc3d6dc83681d68605b1f26565813ccec34c6d36ee9f21686ca4f20\": rpc error: code = NotFound desc = could not find container \"c79f404e5bc3d6dc83681d68605b1f26565813ccec34c6d36ee9f21686ca4f20\": container with ID starting with c79f404e5bc3d6dc83681d68605b1f26565813ccec34c6d36ee9f21686ca4f20 not found: ID does not exist" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.211138 4732 scope.go:117] "RemoveContainer" containerID="8fde05e46badc11b2174b88e2590402436bb669bcbd017ceb4d11d25066b421b" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.211501 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8fde05e46badc11b2174b88e2590402436bb669bcbd017ceb4d11d25066b421b"} err="failed to get container status \"8fde05e46badc11b2174b88e2590402436bb669bcbd017ceb4d11d25066b421b\": rpc error: code = NotFound desc = could not find container \"8fde05e46badc11b2174b88e2590402436bb669bcbd017ceb4d11d25066b421b\": container with ID starting with 8fde05e46badc11b2174b88e2590402436bb669bcbd017ceb4d11d25066b421b not found: ID does not exist" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.211526 4732 scope.go:117] "RemoveContainer" containerID="8521ee1e84f540faa7b093ecd0bb7761ae38487a6c7abf52028fe820ea9a4c1a" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.211845 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8521ee1e84f540faa7b093ecd0bb7761ae38487a6c7abf52028fe820ea9a4c1a"} err="failed to get container status \"8521ee1e84f540faa7b093ecd0bb7761ae38487a6c7abf52028fe820ea9a4c1a\": rpc error: code = NotFound desc = could not find container \"8521ee1e84f540faa7b093ecd0bb7761ae38487a6c7abf52028fe820ea9a4c1a\": container with ID starting with 8521ee1e84f540faa7b093ecd0bb7761ae38487a6c7abf52028fe820ea9a4c1a not found: ID does not exist" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.211874 4732 scope.go:117] "RemoveContainer" containerID="075418297d972e23003b3ce2016f7709d138316c9d08837dc484dadf4d7f4a81" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.212109 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"075418297d972e23003b3ce2016f7709d138316c9d08837dc484dadf4d7f4a81"} err="failed to get container status \"075418297d972e23003b3ce2016f7709d138316c9d08837dc484dadf4d7f4a81\": rpc error: code = NotFound desc = could not find 
container \"075418297d972e23003b3ce2016f7709d138316c9d08837dc484dadf4d7f4a81\": container with ID starting with 075418297d972e23003b3ce2016f7709d138316c9d08837dc484dadf4d7f4a81 not found: ID does not exist" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.212146 4732 scope.go:117] "RemoveContainer" containerID="792e702a9fa66ea68366e8f88d976f7e280223815c4b70258566c32faaed57a8" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.212454 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"792e702a9fa66ea68366e8f88d976f7e280223815c4b70258566c32faaed57a8"} err="failed to get container status \"792e702a9fa66ea68366e8f88d976f7e280223815c4b70258566c32faaed57a8\": rpc error: code = NotFound desc = could not find container \"792e702a9fa66ea68366e8f88d976f7e280223815c4b70258566c32faaed57a8\": container with ID starting with 792e702a9fa66ea68366e8f88d976f7e280223815c4b70258566c32faaed57a8 not found: ID does not exist" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.212474 4732 scope.go:117] "RemoveContainer" containerID="cfb08df78497d5ae0eb39af9a06378ea80914d9b4d7014dd85d930f54c733193" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.212788 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfb08df78497d5ae0eb39af9a06378ea80914d9b4d7014dd85d930f54c733193"} err="failed to get container status \"cfb08df78497d5ae0eb39af9a06378ea80914d9b4d7014dd85d930f54c733193\": rpc error: code = NotFound desc = could not find container \"cfb08df78497d5ae0eb39af9a06378ea80914d9b4d7014dd85d930f54c733193\": container with ID starting with cfb08df78497d5ae0eb39af9a06378ea80914d9b4d7014dd85d930f54c733193 not found: ID does not exist" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.212827 4732 scope.go:117] "RemoveContainer" containerID="b950d29df580418b6e7082c9a672624c14f714944ccbf4cd1d3824810d885ac4" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.213179 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b950d29df580418b6e7082c9a672624c14f714944ccbf4cd1d3824810d885ac4"} err="failed to get container status \"b950d29df580418b6e7082c9a672624c14f714944ccbf4cd1d3824810d885ac4\": rpc error: code = NotFound desc = could not find container \"b950d29df580418b6e7082c9a672624c14f714944ccbf4cd1d3824810d885ac4\": container with ID starting with b950d29df580418b6e7082c9a672624c14f714944ccbf4cd1d3824810d885ac4 not found: ID does not exist" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.213220 4732 scope.go:117] "RemoveContainer" containerID="b77508f48f518644f99bdda577737a5ed3b909f97a70c1c297c035609ba46fa1" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.213513 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b77508f48f518644f99bdda577737a5ed3b909f97a70c1c297c035609ba46fa1"} err="failed to get container status \"b77508f48f518644f99bdda577737a5ed3b909f97a70c1c297c035609ba46fa1\": rpc error: code = NotFound desc = could not find container \"b77508f48f518644f99bdda577737a5ed3b909f97a70c1c297c035609ba46fa1\": container with ID starting with b77508f48f518644f99bdda577737a5ed3b909f97a70c1c297c035609ba46fa1 not found: ID does not exist" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.213551 4732 scope.go:117] "RemoveContainer" containerID="4cd519944f0ae0ee9920a6a54de3dd40c73a65d0c907f625c57d7648a6d0a715" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.213970 4732 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cd519944f0ae0ee9920a6a54de3dd40c73a65d0c907f625c57d7648a6d0a715"} err="failed to get container status \"4cd519944f0ae0ee9920a6a54de3dd40c73a65d0c907f625c57d7648a6d0a715\": rpc error: code = NotFound desc = could not find container \"4cd519944f0ae0ee9920a6a54de3dd40c73a65d0c907f625c57d7648a6d0a715\": container with ID starting with 4cd519944f0ae0ee9920a6a54de3dd40c73a65d0c907f625c57d7648a6d0a715 not found: ID does not exist" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.214042 4732 scope.go:117] "RemoveContainer" containerID="6b26417bd7817c984a59723e2231e3a43c554022aef67eacf300adee0166c3db" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.214331 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b26417bd7817c984a59723e2231e3a43c554022aef67eacf300adee0166c3db"} err="failed to get container status \"6b26417bd7817c984a59723e2231e3a43c554022aef67eacf300adee0166c3db\": rpc error: code = NotFound desc = could not find container \"6b26417bd7817c984a59723e2231e3a43c554022aef67eacf300adee0166c3db\": container with ID starting with 6b26417bd7817c984a59723e2231e3a43c554022aef67eacf300adee0166c3db not found: ID does not exist" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.214364 4732 scope.go:117] "RemoveContainer" containerID="c79f404e5bc3d6dc83681d68605b1f26565813ccec34c6d36ee9f21686ca4f20" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.214685 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c79f404e5bc3d6dc83681d68605b1f26565813ccec34c6d36ee9f21686ca4f20"} err="failed to get container status \"c79f404e5bc3d6dc83681d68605b1f26565813ccec34c6d36ee9f21686ca4f20\": rpc error: code = NotFound desc = could not find container \"c79f404e5bc3d6dc83681d68605b1f26565813ccec34c6d36ee9f21686ca4f20\": container with ID starting with c79f404e5bc3d6dc83681d68605b1f26565813ccec34c6d36ee9f21686ca4f20 not found: ID does not exist" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.214711 4732 scope.go:117] "RemoveContainer" containerID="8fde05e46badc11b2174b88e2590402436bb669bcbd017ceb4d11d25066b421b" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.214955 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8fde05e46badc11b2174b88e2590402436bb669bcbd017ceb4d11d25066b421b"} err="failed to get container status \"8fde05e46badc11b2174b88e2590402436bb669bcbd017ceb4d11d25066b421b\": rpc error: code = NotFound desc = could not find container \"8fde05e46badc11b2174b88e2590402436bb669bcbd017ceb4d11d25066b421b\": container with ID starting with 8fde05e46badc11b2174b88e2590402436bb669bcbd017ceb4d11d25066b421b not found: ID does not exist" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.214997 4732 scope.go:117] "RemoveContainer" containerID="8521ee1e84f540faa7b093ecd0bb7761ae38487a6c7abf52028fe820ea9a4c1a" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.215214 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8521ee1e84f540faa7b093ecd0bb7761ae38487a6c7abf52028fe820ea9a4c1a"} err="failed to get container status \"8521ee1e84f540faa7b093ecd0bb7761ae38487a6c7abf52028fe820ea9a4c1a\": rpc error: code = NotFound desc = could not find container \"8521ee1e84f540faa7b093ecd0bb7761ae38487a6c7abf52028fe820ea9a4c1a\": container with ID starting with 
8521ee1e84f540faa7b093ecd0bb7761ae38487a6c7abf52028fe820ea9a4c1a not found: ID does not exist" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.215268 4732 scope.go:117] "RemoveContainer" containerID="075418297d972e23003b3ce2016f7709d138316c9d08837dc484dadf4d7f4a81" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.215548 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"075418297d972e23003b3ce2016f7709d138316c9d08837dc484dadf4d7f4a81"} err="failed to get container status \"075418297d972e23003b3ce2016f7709d138316c9d08837dc484dadf4d7f4a81\": rpc error: code = NotFound desc = could not find container \"075418297d972e23003b3ce2016f7709d138316c9d08837dc484dadf4d7f4a81\": container with ID starting with 075418297d972e23003b3ce2016f7709d138316c9d08837dc484dadf4d7f4a81 not found: ID does not exist" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.215574 4732 scope.go:117] "RemoveContainer" containerID="792e702a9fa66ea68366e8f88d976f7e280223815c4b70258566c32faaed57a8" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.215952 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"792e702a9fa66ea68366e8f88d976f7e280223815c4b70258566c32faaed57a8"} err="failed to get container status \"792e702a9fa66ea68366e8f88d976f7e280223815c4b70258566c32faaed57a8\": rpc error: code = NotFound desc = could not find container \"792e702a9fa66ea68366e8f88d976f7e280223815c4b70258566c32faaed57a8\": container with ID starting with 792e702a9fa66ea68366e8f88d976f7e280223815c4b70258566c32faaed57a8 not found: ID does not exist" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.215977 4732 scope.go:117] "RemoveContainer" containerID="cfb08df78497d5ae0eb39af9a06378ea80914d9b4d7014dd85d930f54c733193" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.216318 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfb08df78497d5ae0eb39af9a06378ea80914d9b4d7014dd85d930f54c733193"} err="failed to get container status \"cfb08df78497d5ae0eb39af9a06378ea80914d9b4d7014dd85d930f54c733193\": rpc error: code = NotFound desc = could not find container \"cfb08df78497d5ae0eb39af9a06378ea80914d9b4d7014dd85d930f54c733193\": container with ID starting with cfb08df78497d5ae0eb39af9a06378ea80914d9b4d7014dd85d930f54c733193 not found: ID does not exist" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.216358 4732 scope.go:117] "RemoveContainer" containerID="b950d29df580418b6e7082c9a672624c14f714944ccbf4cd1d3824810d885ac4" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.216714 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b950d29df580418b6e7082c9a672624c14f714944ccbf4cd1d3824810d885ac4"} err="failed to get container status \"b950d29df580418b6e7082c9a672624c14f714944ccbf4cd1d3824810d885ac4\": rpc error: code = NotFound desc = could not find container \"b950d29df580418b6e7082c9a672624c14f714944ccbf4cd1d3824810d885ac4\": container with ID starting with b950d29df580418b6e7082c9a672624c14f714944ccbf4cd1d3824810d885ac4 not found: ID does not exist" Jan 31 09:10:08 crc kubenswrapper[4732]: I0131 09:10:08.554711 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8" path="/var/lib/kubelet/pods/82d07e8c-9b7d-43fb-ac36-f6c87d08e2c8/volumes" Jan 31 09:10:09 crc kubenswrapper[4732]: I0131 09:10:09.050937 4732 generic.go:334] "Generic (PLEG): container finished" 
podID="9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5" containerID="a5ff595b08c6520ee5972e06c7feb56e18375d97f3fb9ac92996609011138b32" exitCode=0 Jan 31 09:10:09 crc kubenswrapper[4732]: I0131 09:10:09.051003 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqvt7" event={"ID":"9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5","Type":"ContainerDied","Data":"a5ff595b08c6520ee5972e06c7feb56e18375d97f3fb9ac92996609011138b32"} Jan 31 09:10:10 crc kubenswrapper[4732]: I0131 09:10:10.059666 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqvt7" event={"ID":"9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5","Type":"ContainerStarted","Data":"60d12f3e0343937b71ab044b3a53e9f8522e1a6557ce42cdc08adf4be81b8603"} Jan 31 09:10:10 crc kubenswrapper[4732]: I0131 09:10:10.060265 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqvt7" event={"ID":"9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5","Type":"ContainerStarted","Data":"48385875afd5dbd56191f18499cecf42addff2323fdceb76f9190a506b6833bd"} Jan 31 09:10:10 crc kubenswrapper[4732]: I0131 09:10:10.060280 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqvt7" event={"ID":"9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5","Type":"ContainerStarted","Data":"0c3c896b657c02178bdb1189e62a629663ed2408a8325116c175852928548768"} Jan 31 09:10:10 crc kubenswrapper[4732]: I0131 09:10:10.060293 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqvt7" event={"ID":"9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5","Type":"ContainerStarted","Data":"fc268c48d5b8f4a59fdf70f765783e66e5230a80929f88f42499217455dff299"} Jan 31 09:10:11 crc kubenswrapper[4732]: I0131 09:10:11.067009 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqvt7" event={"ID":"9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5","Type":"ContainerStarted","Data":"159aeda61e93a43c7aa035ca8ebd5fa16fb985dadbd04937f0281edd763f08fa"} Jan 31 09:10:11 crc kubenswrapper[4732]: I0131 09:10:11.067055 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqvt7" event={"ID":"9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5","Type":"ContainerStarted","Data":"59af69892b2784be32c790368744856b0a20ba343e6c2f5de661a8f6eb56760b"} Jan 31 09:10:13 crc kubenswrapper[4732]: I0131 09:10:13.077834 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqvt7" event={"ID":"9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5","Type":"ContainerStarted","Data":"049127f38ff4519d495e4383644da5148f88578ca5c97ca910a987c39b795f4a"} Jan 31 09:10:15 crc kubenswrapper[4732]: I0131 09:10:15.096340 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqvt7" event={"ID":"9faf3ee0-47c7-4c3e-85cb-d8800f2e26d5","Type":"ContainerStarted","Data":"507caade8ce1849cd8a7ba9cc1874992055408592aa10cb52481ed404e1acd1a"} Jan 31 09:10:15 crc kubenswrapper[4732]: I0131 09:10:15.096953 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bqvt7" Jan 31 09:10:15 crc kubenswrapper[4732]: I0131 09:10:15.097119 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bqvt7" Jan 31 09:10:15 crc kubenswrapper[4732]: I0131 09:10:15.097230 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-ovn-kubernetes/ovnkube-node-bqvt7" Jan 31 09:10:15 crc kubenswrapper[4732]: I0131 09:10:15.122173 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bqvt7" Jan 31 09:10:15 crc kubenswrapper[4732]: I0131 09:10:15.122900 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bqvt7" Jan 31 09:10:15 crc kubenswrapper[4732]: I0131 09:10:15.129799 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-bqvt7" podStartSLOduration=8.129781548 podStartE2EDuration="8.129781548s" podCreationTimestamp="2026-01-31 09:10:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:10:15.124515409 +0000 UTC m=+553.430391623" watchObservedRunningTime="2026-01-31 09:10:15.129781548 +0000 UTC m=+553.435657752" Jan 31 09:10:17 crc kubenswrapper[4732]: I0131 09:10:17.498575 4732 patch_prober.go:28] interesting pod/machine-config-daemon-jnbt8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 09:10:17 crc kubenswrapper[4732]: I0131 09:10:17.498898 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" podUID="7d790207-d357-4b47-87bf-5b505e061820" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 09:10:20 crc kubenswrapper[4732]: I0131 09:10:20.542703 4732 scope.go:117] "RemoveContainer" containerID="98e5c23e9a8bde55626defc76f20f8510954c4ef79d762950e0790a7de4dce4f" Jan 31 09:10:20 crc kubenswrapper[4732]: E0131 09:10:20.543456 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-4mxsr_openshift-multus(8e23192f-14db-41ef-af89-4a76e325d9c1)\"" pod="openshift-multus/multus-4mxsr" podUID="8e23192f-14db-41ef-af89-4a76e325d9c1" Jan 31 09:10:31 crc kubenswrapper[4732]: I0131 09:10:31.544327 4732 scope.go:117] "RemoveContainer" containerID="98e5c23e9a8bde55626defc76f20f8510954c4ef79d762950e0790a7de4dce4f" Jan 31 09:10:32 crc kubenswrapper[4732]: I0131 09:10:32.191282 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-4mxsr_8e23192f-14db-41ef-af89-4a76e325d9c1/kube-multus/2.log" Jan 31 09:10:32 crc kubenswrapper[4732]: I0131 09:10:32.192090 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-4mxsr_8e23192f-14db-41ef-af89-4a76e325d9c1/kube-multus/1.log" Jan 31 09:10:32 crc kubenswrapper[4732]: I0131 09:10:32.192210 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4mxsr" event={"ID":"8e23192f-14db-41ef-af89-4a76e325d9c1","Type":"ContainerStarted","Data":"9a91cfdec82d25573bcc6a3131e5bad59d02bdc0a8b1943a4c0deb55c924fbce"} Jan 31 09:10:32 crc kubenswrapper[4732]: I0131 09:10:32.564293 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcrl29v"] Jan 31 09:10:32 crc kubenswrapper[4732]: I0131 09:10:32.565589 4732 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcrl29v" Jan 31 09:10:32 crc kubenswrapper[4732]: I0131 09:10:32.568298 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 31 09:10:32 crc kubenswrapper[4732]: I0131 09:10:32.576984 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcrl29v"] Jan 31 09:10:32 crc kubenswrapper[4732]: I0131 09:10:32.727316 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/76f99e73-f72c-4026-b43f-dcb9f20b554f-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcrl29v\" (UID: \"76f99e73-f72c-4026-b43f-dcb9f20b554f\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcrl29v" Jan 31 09:10:32 crc kubenswrapper[4732]: I0131 09:10:32.727367 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/76f99e73-f72c-4026-b43f-dcb9f20b554f-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcrl29v\" (UID: \"76f99e73-f72c-4026-b43f-dcb9f20b554f\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcrl29v" Jan 31 09:10:32 crc kubenswrapper[4732]: I0131 09:10:32.727427 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j52vg\" (UniqueName: \"kubernetes.io/projected/76f99e73-f72c-4026-b43f-dcb9f20b554f-kube-api-access-j52vg\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcrl29v\" (UID: \"76f99e73-f72c-4026-b43f-dcb9f20b554f\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcrl29v" Jan 31 09:10:32 crc kubenswrapper[4732]: I0131 09:10:32.828567 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/76f99e73-f72c-4026-b43f-dcb9f20b554f-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcrl29v\" (UID: \"76f99e73-f72c-4026-b43f-dcb9f20b554f\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcrl29v" Jan 31 09:10:32 crc kubenswrapper[4732]: I0131 09:10:32.828606 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/76f99e73-f72c-4026-b43f-dcb9f20b554f-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcrl29v\" (UID: \"76f99e73-f72c-4026-b43f-dcb9f20b554f\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcrl29v" Jan 31 09:10:32 crc kubenswrapper[4732]: I0131 09:10:32.828655 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j52vg\" (UniqueName: \"kubernetes.io/projected/76f99e73-f72c-4026-b43f-dcb9f20b554f-kube-api-access-j52vg\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcrl29v\" (UID: \"76f99e73-f72c-4026-b43f-dcb9f20b554f\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcrl29v" Jan 31 09:10:32 crc kubenswrapper[4732]: I0131 09:10:32.829342 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/76f99e73-f72c-4026-b43f-dcb9f20b554f-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcrl29v\" (UID: \"76f99e73-f72c-4026-b43f-dcb9f20b554f\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcrl29v" Jan 31 09:10:32 crc kubenswrapper[4732]: I0131 09:10:32.829579 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/76f99e73-f72c-4026-b43f-dcb9f20b554f-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcrl29v\" (UID: \"76f99e73-f72c-4026-b43f-dcb9f20b554f\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcrl29v" Jan 31 09:10:32 crc kubenswrapper[4732]: I0131 09:10:32.868873 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j52vg\" (UniqueName: \"kubernetes.io/projected/76f99e73-f72c-4026-b43f-dcb9f20b554f-kube-api-access-j52vg\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcrl29v\" (UID: \"76f99e73-f72c-4026-b43f-dcb9f20b554f\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcrl29v" Jan 31 09:10:32 crc kubenswrapper[4732]: I0131 09:10:32.931749 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcrl29v" Jan 31 09:10:32 crc kubenswrapper[4732]: E0131 09:10:32.961555 4732 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcrl29v_openshift-marketplace_76f99e73-f72c-4026-b43f-dcb9f20b554f_0(24882b4407469c82f2a582f0e24282bc4819fd3a1612e09c4fe2a3dc3ca73be1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 31 09:10:32 crc kubenswrapper[4732]: E0131 09:10:32.961644 4732 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcrl29v_openshift-marketplace_76f99e73-f72c-4026-b43f-dcb9f20b554f_0(24882b4407469c82f2a582f0e24282bc4819fd3a1612e09c4fe2a3dc3ca73be1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcrl29v" Jan 31 09:10:32 crc kubenswrapper[4732]: E0131 09:10:32.961694 4732 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcrl29v_openshift-marketplace_76f99e73-f72c-4026-b43f-dcb9f20b554f_0(24882b4407469c82f2a582f0e24282bc4819fd3a1612e09c4fe2a3dc3ca73be1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcrl29v" Jan 31 09:10:32 crc kubenswrapper[4732]: E0131 09:10:32.961764 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcrl29v_openshift-marketplace(76f99e73-f72c-4026-b43f-dcb9f20b554f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcrl29v_openshift-marketplace(76f99e73-f72c-4026-b43f-dcb9f20b554f)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcrl29v_openshift-marketplace_76f99e73-f72c-4026-b43f-dcb9f20b554f_0(24882b4407469c82f2a582f0e24282bc4819fd3a1612e09c4fe2a3dc3ca73be1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcrl29v" podUID="76f99e73-f72c-4026-b43f-dcb9f20b554f" Jan 31 09:10:33 crc kubenswrapper[4732]: I0131 09:10:33.197647 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcrl29v" Jan 31 09:10:33 crc kubenswrapper[4732]: I0131 09:10:33.198607 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcrl29v" Jan 31 09:10:33 crc kubenswrapper[4732]: I0131 09:10:33.615776 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcrl29v"] Jan 31 09:10:33 crc kubenswrapper[4732]: W0131 09:10:33.620055 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76f99e73_f72c_4026_b43f_dcb9f20b554f.slice/crio-2775342a40bcc83af9f040ed925ff063f4750e23fdebac2a47fe9858393cb91d WatchSource:0}: Error finding container 2775342a40bcc83af9f040ed925ff063f4750e23fdebac2a47fe9858393cb91d: Status 404 returned error can't find the container with id 2775342a40bcc83af9f040ed925ff063f4750e23fdebac2a47fe9858393cb91d Jan 31 09:10:34 crc kubenswrapper[4732]: I0131 09:10:34.207742 4732 generic.go:334] "Generic (PLEG): container finished" podID="76f99e73-f72c-4026-b43f-dcb9f20b554f" containerID="de1621a69fc5d3d3456ecdb4b2b3f0c8f4de38f8b1d1931d24ca267370544f6b" exitCode=0 Jan 31 09:10:34 crc kubenswrapper[4732]: I0131 09:10:34.208079 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcrl29v" event={"ID":"76f99e73-f72c-4026-b43f-dcb9f20b554f","Type":"ContainerDied","Data":"de1621a69fc5d3d3456ecdb4b2b3f0c8f4de38f8b1d1931d24ca267370544f6b"} Jan 31 09:10:34 crc kubenswrapper[4732]: I0131 09:10:34.208125 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcrl29v" event={"ID":"76f99e73-f72c-4026-b43f-dcb9f20b554f","Type":"ContainerStarted","Data":"2775342a40bcc83af9f040ed925ff063f4750e23fdebac2a47fe9858393cb91d"} Jan 31 09:10:34 crc kubenswrapper[4732]: I0131 09:10:34.210622 4732 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 31 09:10:36 crc kubenswrapper[4732]: I0131 09:10:36.222711 4732 generic.go:334] "Generic (PLEG): container finished" 
podID="76f99e73-f72c-4026-b43f-dcb9f20b554f" containerID="a4915083fdb38c4a25660c8d0524c37bb3cd9ac78c2e6fcedac98a56c8c39103" exitCode=0 Jan 31 09:10:36 crc kubenswrapper[4732]: I0131 09:10:36.222825 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcrl29v" event={"ID":"76f99e73-f72c-4026-b43f-dcb9f20b554f","Type":"ContainerDied","Data":"a4915083fdb38c4a25660c8d0524c37bb3cd9ac78c2e6fcedac98a56c8c39103"} Jan 31 09:10:37 crc kubenswrapper[4732]: I0131 09:10:37.230183 4732 generic.go:334] "Generic (PLEG): container finished" podID="76f99e73-f72c-4026-b43f-dcb9f20b554f" containerID="da05d5940c342f32c8a6aacbf8ae5641047149294a9be2777a763f6be35278b7" exitCode=0 Jan 31 09:10:37 crc kubenswrapper[4732]: I0131 09:10:37.230293 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcrl29v" event={"ID":"76f99e73-f72c-4026-b43f-dcb9f20b554f","Type":"ContainerDied","Data":"da05d5940c342f32c8a6aacbf8ae5641047149294a9be2777a763f6be35278b7"} Jan 31 09:10:38 crc kubenswrapper[4732]: I0131 09:10:38.029665 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bqvt7" Jan 31 09:10:38 crc kubenswrapper[4732]: I0131 09:10:38.475763 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcrl29v" Jan 31 09:10:38 crc kubenswrapper[4732]: I0131 09:10:38.596720 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/76f99e73-f72c-4026-b43f-dcb9f20b554f-util\") pod \"76f99e73-f72c-4026-b43f-dcb9f20b554f\" (UID: \"76f99e73-f72c-4026-b43f-dcb9f20b554f\") " Jan 31 09:10:38 crc kubenswrapper[4732]: I0131 09:10:38.596816 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j52vg\" (UniqueName: \"kubernetes.io/projected/76f99e73-f72c-4026-b43f-dcb9f20b554f-kube-api-access-j52vg\") pod \"76f99e73-f72c-4026-b43f-dcb9f20b554f\" (UID: \"76f99e73-f72c-4026-b43f-dcb9f20b554f\") " Jan 31 09:10:38 crc kubenswrapper[4732]: I0131 09:10:38.597122 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/76f99e73-f72c-4026-b43f-dcb9f20b554f-bundle\") pod \"76f99e73-f72c-4026-b43f-dcb9f20b554f\" (UID: \"76f99e73-f72c-4026-b43f-dcb9f20b554f\") " Jan 31 09:10:38 crc kubenswrapper[4732]: I0131 09:10:38.599265 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76f99e73-f72c-4026-b43f-dcb9f20b554f-bundle" (OuterVolumeSpecName: "bundle") pod "76f99e73-f72c-4026-b43f-dcb9f20b554f" (UID: "76f99e73-f72c-4026-b43f-dcb9f20b554f"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:10:38 crc kubenswrapper[4732]: I0131 09:10:38.601868 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76f99e73-f72c-4026-b43f-dcb9f20b554f-kube-api-access-j52vg" (OuterVolumeSpecName: "kube-api-access-j52vg") pod "76f99e73-f72c-4026-b43f-dcb9f20b554f" (UID: "76f99e73-f72c-4026-b43f-dcb9f20b554f"). InnerVolumeSpecName "kube-api-access-j52vg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:10:38 crc kubenswrapper[4732]: I0131 09:10:38.611056 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76f99e73-f72c-4026-b43f-dcb9f20b554f-util" (OuterVolumeSpecName: "util") pod "76f99e73-f72c-4026-b43f-dcb9f20b554f" (UID: "76f99e73-f72c-4026-b43f-dcb9f20b554f"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:10:38 crc kubenswrapper[4732]: I0131 09:10:38.698879 4732 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/76f99e73-f72c-4026-b43f-dcb9f20b554f-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:10:38 crc kubenswrapper[4732]: I0131 09:10:38.698921 4732 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/76f99e73-f72c-4026-b43f-dcb9f20b554f-util\") on node \"crc\" DevicePath \"\"" Jan 31 09:10:38 crc kubenswrapper[4732]: I0131 09:10:38.698930 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j52vg\" (UniqueName: \"kubernetes.io/projected/76f99e73-f72c-4026-b43f-dcb9f20b554f-kube-api-access-j52vg\") on node \"crc\" DevicePath \"\"" Jan 31 09:10:39 crc kubenswrapper[4732]: I0131 09:10:39.243234 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcrl29v" event={"ID":"76f99e73-f72c-4026-b43f-dcb9f20b554f","Type":"ContainerDied","Data":"2775342a40bcc83af9f040ed925ff063f4750e23fdebac2a47fe9858393cb91d"} Jan 31 09:10:39 crc kubenswrapper[4732]: I0131 09:10:39.243279 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2775342a40bcc83af9f040ed925ff063f4750e23fdebac2a47fe9858393cb91d" Jan 31 09:10:39 crc kubenswrapper[4732]: I0131 09:10:39.243287 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcrl29v" Jan 31 09:10:47 crc kubenswrapper[4732]: I0131 09:10:47.497642 4732 patch_prober.go:28] interesting pod/machine-config-daemon-jnbt8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 09:10:47 crc kubenswrapper[4732]: I0131 09:10:47.498230 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" podUID="7d790207-d357-4b47-87bf-5b505e061820" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 09:10:50 crc kubenswrapper[4732]: I0131 09:10:50.638432 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-6d8dc66c8b-8p2mn"] Jan 31 09:10:50 crc kubenswrapper[4732]: E0131 09:10:50.638914 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76f99e73-f72c-4026-b43f-dcb9f20b554f" containerName="util" Jan 31 09:10:50 crc kubenswrapper[4732]: I0131 09:10:50.638934 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="76f99e73-f72c-4026-b43f-dcb9f20b554f" containerName="util" Jan 31 09:10:50 crc kubenswrapper[4732]: E0131 09:10:50.638954 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76f99e73-f72c-4026-b43f-dcb9f20b554f" containerName="pull" Jan 31 09:10:50 crc kubenswrapper[4732]: I0131 09:10:50.638960 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="76f99e73-f72c-4026-b43f-dcb9f20b554f" containerName="pull" Jan 31 09:10:50 crc kubenswrapper[4732]: E0131 09:10:50.638968 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76f99e73-f72c-4026-b43f-dcb9f20b554f" containerName="extract" Jan 31 09:10:50 crc kubenswrapper[4732]: I0131 09:10:50.638976 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="76f99e73-f72c-4026-b43f-dcb9f20b554f" containerName="extract" Jan 31 09:10:50 crc kubenswrapper[4732]: I0131 09:10:50.639073 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="76f99e73-f72c-4026-b43f-dcb9f20b554f" containerName="extract" Jan 31 09:10:50 crc kubenswrapper[4732]: I0131 09:10:50.639502 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6d8dc66c8b-8p2mn" Jan 31 09:10:50 crc kubenswrapper[4732]: I0131 09:10:50.641785 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Jan 31 09:10:50 crc kubenswrapper[4732]: I0131 09:10:50.642755 4732 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Jan 31 09:10:50 crc kubenswrapper[4732]: I0131 09:10:50.642886 4732 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Jan 31 09:10:50 crc kubenswrapper[4732]: I0131 09:10:50.643001 4732 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-lg8wh" Jan 31 09:10:50 crc kubenswrapper[4732]: I0131 09:10:50.643159 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Jan 31 09:10:50 crc kubenswrapper[4732]: I0131 09:10:50.644452 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8ca218dd-0d42-45c8-b4e4-ca638781c915-webhook-cert\") pod \"metallb-operator-controller-manager-6d8dc66c8b-8p2mn\" (UID: \"8ca218dd-0d42-45c8-b4e4-ca638781c915\") " pod="metallb-system/metallb-operator-controller-manager-6d8dc66c8b-8p2mn" Jan 31 09:10:50 crc kubenswrapper[4732]: I0131 09:10:50.644505 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8ca218dd-0d42-45c8-b4e4-ca638781c915-apiservice-cert\") pod \"metallb-operator-controller-manager-6d8dc66c8b-8p2mn\" (UID: \"8ca218dd-0d42-45c8-b4e4-ca638781c915\") " pod="metallb-system/metallb-operator-controller-manager-6d8dc66c8b-8p2mn" Jan 31 09:10:50 crc kubenswrapper[4732]: I0131 09:10:50.644690 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knrcd\" (UniqueName: \"kubernetes.io/projected/8ca218dd-0d42-45c8-b4e4-ca638781c915-kube-api-access-knrcd\") pod \"metallb-operator-controller-manager-6d8dc66c8b-8p2mn\" (UID: \"8ca218dd-0d42-45c8-b4e4-ca638781c915\") " pod="metallb-system/metallb-operator-controller-manager-6d8dc66c8b-8p2mn" Jan 31 09:10:50 crc kubenswrapper[4732]: I0131 09:10:50.680169 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6d8dc66c8b-8p2mn"] Jan 31 09:10:50 crc kubenswrapper[4732]: I0131 09:10:50.745867 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knrcd\" (UniqueName: \"kubernetes.io/projected/8ca218dd-0d42-45c8-b4e4-ca638781c915-kube-api-access-knrcd\") pod \"metallb-operator-controller-manager-6d8dc66c8b-8p2mn\" (UID: \"8ca218dd-0d42-45c8-b4e4-ca638781c915\") " pod="metallb-system/metallb-operator-controller-manager-6d8dc66c8b-8p2mn" Jan 31 09:10:50 crc kubenswrapper[4732]: I0131 09:10:50.745935 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8ca218dd-0d42-45c8-b4e4-ca638781c915-webhook-cert\") pod \"metallb-operator-controller-manager-6d8dc66c8b-8p2mn\" (UID: \"8ca218dd-0d42-45c8-b4e4-ca638781c915\") " pod="metallb-system/metallb-operator-controller-manager-6d8dc66c8b-8p2mn" Jan 31 09:10:50 crc kubenswrapper[4732]: I0131 09:10:50.745985 4732 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8ca218dd-0d42-45c8-b4e4-ca638781c915-apiservice-cert\") pod \"metallb-operator-controller-manager-6d8dc66c8b-8p2mn\" (UID: \"8ca218dd-0d42-45c8-b4e4-ca638781c915\") " pod="metallb-system/metallb-operator-controller-manager-6d8dc66c8b-8p2mn" Jan 31 09:10:50 crc kubenswrapper[4732]: I0131 09:10:50.752476 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8ca218dd-0d42-45c8-b4e4-ca638781c915-webhook-cert\") pod \"metallb-operator-controller-manager-6d8dc66c8b-8p2mn\" (UID: \"8ca218dd-0d42-45c8-b4e4-ca638781c915\") " pod="metallb-system/metallb-operator-controller-manager-6d8dc66c8b-8p2mn" Jan 31 09:10:50 crc kubenswrapper[4732]: I0131 09:10:50.752510 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8ca218dd-0d42-45c8-b4e4-ca638781c915-apiservice-cert\") pod \"metallb-operator-controller-manager-6d8dc66c8b-8p2mn\" (UID: \"8ca218dd-0d42-45c8-b4e4-ca638781c915\") " pod="metallb-system/metallb-operator-controller-manager-6d8dc66c8b-8p2mn" Jan 31 09:10:50 crc kubenswrapper[4732]: I0131 09:10:50.763433 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knrcd\" (UniqueName: \"kubernetes.io/projected/8ca218dd-0d42-45c8-b4e4-ca638781c915-kube-api-access-knrcd\") pod \"metallb-operator-controller-manager-6d8dc66c8b-8p2mn\" (UID: \"8ca218dd-0d42-45c8-b4e4-ca638781c915\") " pod="metallb-system/metallb-operator-controller-manager-6d8dc66c8b-8p2mn" Jan 31 09:10:50 crc kubenswrapper[4732]: I0131 09:10:50.957702 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6d8dc66c8b-8p2mn" Jan 31 09:10:51 crc kubenswrapper[4732]: I0131 09:10:51.034679 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-c98699f55-6bzdf"] Jan 31 09:10:51 crc kubenswrapper[4732]: I0131 09:10:51.035512 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-c98699f55-6bzdf" Jan 31 09:10:51 crc kubenswrapper[4732]: I0131 09:10:51.038699 4732 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Jan 31 09:10:51 crc kubenswrapper[4732]: I0131 09:10:51.039194 4732 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Jan 31 09:10:51 crc kubenswrapper[4732]: I0131 09:10:51.039878 4732 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-xkf4b" Jan 31 09:10:51 crc kubenswrapper[4732]: I0131 09:10:51.049183 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vq8qj\" (UniqueName: \"kubernetes.io/projected/62f950f6-2a18-4ca6-8cdb-75f47437053a-kube-api-access-vq8qj\") pod \"metallb-operator-webhook-server-c98699f55-6bzdf\" (UID: \"62f950f6-2a18-4ca6-8cdb-75f47437053a\") " pod="metallb-system/metallb-operator-webhook-server-c98699f55-6bzdf" Jan 31 09:10:51 crc kubenswrapper[4732]: I0131 09:10:51.049284 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/62f950f6-2a18-4ca6-8cdb-75f47437053a-apiservice-cert\") pod \"metallb-operator-webhook-server-c98699f55-6bzdf\" (UID: \"62f950f6-2a18-4ca6-8cdb-75f47437053a\") " pod="metallb-system/metallb-operator-webhook-server-c98699f55-6bzdf" Jan 31 09:10:51 crc kubenswrapper[4732]: I0131 09:10:51.049319 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/62f950f6-2a18-4ca6-8cdb-75f47437053a-webhook-cert\") pod \"metallb-operator-webhook-server-c98699f55-6bzdf\" (UID: \"62f950f6-2a18-4ca6-8cdb-75f47437053a\") " pod="metallb-system/metallb-operator-webhook-server-c98699f55-6bzdf" Jan 31 09:10:51 crc kubenswrapper[4732]: I0131 09:10:51.066484 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-c98699f55-6bzdf"] Jan 31 09:10:51 crc kubenswrapper[4732]: I0131 09:10:51.150137 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/62f950f6-2a18-4ca6-8cdb-75f47437053a-webhook-cert\") pod \"metallb-operator-webhook-server-c98699f55-6bzdf\" (UID: \"62f950f6-2a18-4ca6-8cdb-75f47437053a\") " pod="metallb-system/metallb-operator-webhook-server-c98699f55-6bzdf" Jan 31 09:10:51 crc kubenswrapper[4732]: I0131 09:10:51.150242 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vq8qj\" (UniqueName: \"kubernetes.io/projected/62f950f6-2a18-4ca6-8cdb-75f47437053a-kube-api-access-vq8qj\") pod \"metallb-operator-webhook-server-c98699f55-6bzdf\" (UID: \"62f950f6-2a18-4ca6-8cdb-75f47437053a\") " pod="metallb-system/metallb-operator-webhook-server-c98699f55-6bzdf" Jan 31 09:10:51 crc kubenswrapper[4732]: I0131 09:10:51.150299 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/62f950f6-2a18-4ca6-8cdb-75f47437053a-apiservice-cert\") pod \"metallb-operator-webhook-server-c98699f55-6bzdf\" (UID: \"62f950f6-2a18-4ca6-8cdb-75f47437053a\") " pod="metallb-system/metallb-operator-webhook-server-c98699f55-6bzdf" Jan 31 09:10:51 crc kubenswrapper[4732]: I0131 09:10:51.160644 4732 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/62f950f6-2a18-4ca6-8cdb-75f47437053a-webhook-cert\") pod \"metallb-operator-webhook-server-c98699f55-6bzdf\" (UID: \"62f950f6-2a18-4ca6-8cdb-75f47437053a\") " pod="metallb-system/metallb-operator-webhook-server-c98699f55-6bzdf" Jan 31 09:10:51 crc kubenswrapper[4732]: I0131 09:10:51.160653 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/62f950f6-2a18-4ca6-8cdb-75f47437053a-apiservice-cert\") pod \"metallb-operator-webhook-server-c98699f55-6bzdf\" (UID: \"62f950f6-2a18-4ca6-8cdb-75f47437053a\") " pod="metallb-system/metallb-operator-webhook-server-c98699f55-6bzdf" Jan 31 09:10:51 crc kubenswrapper[4732]: I0131 09:10:51.176747 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vq8qj\" (UniqueName: \"kubernetes.io/projected/62f950f6-2a18-4ca6-8cdb-75f47437053a-kube-api-access-vq8qj\") pod \"metallb-operator-webhook-server-c98699f55-6bzdf\" (UID: \"62f950f6-2a18-4ca6-8cdb-75f47437053a\") " pod="metallb-system/metallb-operator-webhook-server-c98699f55-6bzdf" Jan 31 09:10:51 crc kubenswrapper[4732]: I0131 09:10:51.294367 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6d8dc66c8b-8p2mn"] Jan 31 09:10:51 crc kubenswrapper[4732]: W0131 09:10:51.296864 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ca218dd_0d42_45c8_b4e4_ca638781c915.slice/crio-600452c3ec84f04be30346cf95724591d530cd41a06688178f841081efd3d084 WatchSource:0}: Error finding container 600452c3ec84f04be30346cf95724591d530cd41a06688178f841081efd3d084: Status 404 returned error can't find the container with id 600452c3ec84f04be30346cf95724591d530cd41a06688178f841081efd3d084 Jan 31 09:10:51 crc kubenswrapper[4732]: I0131 09:10:51.311213 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6d8dc66c8b-8p2mn" event={"ID":"8ca218dd-0d42-45c8-b4e4-ca638781c915","Type":"ContainerStarted","Data":"600452c3ec84f04be30346cf95724591d530cd41a06688178f841081efd3d084"} Jan 31 09:10:51 crc kubenswrapper[4732]: I0131 09:10:51.355031 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-c98699f55-6bzdf" Jan 31 09:10:51 crc kubenswrapper[4732]: I0131 09:10:51.593499 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-c98699f55-6bzdf"] Jan 31 09:10:51 crc kubenswrapper[4732]: W0131 09:10:51.601323 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod62f950f6_2a18_4ca6_8cdb_75f47437053a.slice/crio-fc52324de537b12f15e20ce968015c137d45ab24e843e12eeba34172ad28a9ec WatchSource:0}: Error finding container fc52324de537b12f15e20ce968015c137d45ab24e843e12eeba34172ad28a9ec: Status 404 returned error can't find the container with id fc52324de537b12f15e20ce968015c137d45ab24e843e12eeba34172ad28a9ec Jan 31 09:10:52 crc kubenswrapper[4732]: I0131 09:10:52.317121 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-c98699f55-6bzdf" event={"ID":"62f950f6-2a18-4ca6-8cdb-75f47437053a","Type":"ContainerStarted","Data":"fc52324de537b12f15e20ce968015c137d45ab24e843e12eeba34172ad28a9ec"} Jan 31 09:10:56 crc kubenswrapper[4732]: I0131 09:10:56.341781 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-c98699f55-6bzdf" event={"ID":"62f950f6-2a18-4ca6-8cdb-75f47437053a","Type":"ContainerStarted","Data":"309e650ae87aac0d5de670a1b9e803ac09384be7cbc1b77ae2aa2c5a6f6c2a7a"} Jan 31 09:10:56 crc kubenswrapper[4732]: I0131 09:10:56.342212 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-c98699f55-6bzdf" Jan 31 09:10:56 crc kubenswrapper[4732]: I0131 09:10:56.343144 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6d8dc66c8b-8p2mn" event={"ID":"8ca218dd-0d42-45c8-b4e4-ca638781c915","Type":"ContainerStarted","Data":"bce5120da08ada6151297c322ec5c219501f9d3a4bdafdfe8e7be38cf56b0e5a"} Jan 31 09:10:56 crc kubenswrapper[4732]: I0131 09:10:56.343500 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-6d8dc66c8b-8p2mn" Jan 31 09:10:56 crc kubenswrapper[4732]: I0131 09:10:56.373336 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-c98699f55-6bzdf" podStartSLOduration=1.551037713 podStartE2EDuration="5.37331397s" podCreationTimestamp="2026-01-31 09:10:51 +0000 UTC" firstStartedPulling="2026-01-31 09:10:51.605829627 +0000 UTC m=+589.911705831" lastFinishedPulling="2026-01-31 09:10:55.428105884 +0000 UTC m=+593.733982088" observedRunningTime="2026-01-31 09:10:56.368053831 +0000 UTC m=+594.673930065" watchObservedRunningTime="2026-01-31 09:10:56.37331397 +0000 UTC m=+594.679190184" Jan 31 09:10:56 crc kubenswrapper[4732]: I0131 09:10:56.389538 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-6d8dc66c8b-8p2mn" podStartSLOduration=2.355171877 podStartE2EDuration="6.389513378s" podCreationTimestamp="2026-01-31 09:10:50 +0000 UTC" firstStartedPulling="2026-01-31 09:10:51.299471417 +0000 UTC m=+589.605347621" lastFinishedPulling="2026-01-31 09:10:55.333812918 +0000 UTC m=+593.639689122" observedRunningTime="2026-01-31 09:10:56.386686257 +0000 UTC m=+594.692562471" watchObservedRunningTime="2026-01-31 09:10:56.389513378 +0000 UTC m=+594.695389582" Jan 31 
09:11:02 crc kubenswrapper[4732]: I0131 09:11:02.905831 4732 scope.go:117] "RemoveContainer" containerID="456969ec8d447fd7f9acd1803b222ebfb16b02c8dee1959dd936eec29aa1d617" Jan 31 09:11:03 crc kubenswrapper[4732]: I0131 09:11:03.377238 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-4mxsr_8e23192f-14db-41ef-af89-4a76e325d9c1/kube-multus/2.log" Jan 31 09:11:11 crc kubenswrapper[4732]: I0131 09:11:11.360526 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-c98699f55-6bzdf" Jan 31 09:11:17 crc kubenswrapper[4732]: I0131 09:11:17.498132 4732 patch_prober.go:28] interesting pod/machine-config-daemon-jnbt8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 09:11:17 crc kubenswrapper[4732]: I0131 09:11:17.498660 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" podUID="7d790207-d357-4b47-87bf-5b505e061820" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 09:11:17 crc kubenswrapper[4732]: I0131 09:11:17.498725 4732 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" Jan 31 09:11:17 crc kubenswrapper[4732]: I0131 09:11:17.499288 4732 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"942a11834ff55816d19ec94b72706370701e25dcee37029bb97b73b2e3078f9b"} pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 31 09:11:17 crc kubenswrapper[4732]: I0131 09:11:17.499350 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" podUID="7d790207-d357-4b47-87bf-5b505e061820" containerName="machine-config-daemon" containerID="cri-o://942a11834ff55816d19ec94b72706370701e25dcee37029bb97b73b2e3078f9b" gracePeriod=600 Jan 31 09:11:18 crc kubenswrapper[4732]: I0131 09:11:18.457208 4732 generic.go:334] "Generic (PLEG): container finished" podID="7d790207-d357-4b47-87bf-5b505e061820" containerID="942a11834ff55816d19ec94b72706370701e25dcee37029bb97b73b2e3078f9b" exitCode=0 Jan 31 09:11:18 crc kubenswrapper[4732]: I0131 09:11:18.457585 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" event={"ID":"7d790207-d357-4b47-87bf-5b505e061820","Type":"ContainerDied","Data":"942a11834ff55816d19ec94b72706370701e25dcee37029bb97b73b2e3078f9b"} Jan 31 09:11:18 crc kubenswrapper[4732]: I0131 09:11:18.457626 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" event={"ID":"7d790207-d357-4b47-87bf-5b505e061820","Type":"ContainerStarted","Data":"e8d3fd1eb561cfd678a2a0df1de54d984c12dc8e05f74e816b693d4b18b74a20"} Jan 31 09:11:18 crc kubenswrapper[4732]: I0131 09:11:18.457652 4732 scope.go:117] "RemoveContainer" containerID="1a1af67f6e9c90030eed50fdab77c62259e76a7813864bb504390768e9501756" Jan 31 09:11:30 crc kubenswrapper[4732]: I0131 09:11:30.960317 4732 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-6d8dc66c8b-8p2mn" Jan 31 09:11:31 crc kubenswrapper[4732]: I0131 09:11:31.634531 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-6xvqw"] Jan 31 09:11:31 crc kubenswrapper[4732]: I0131 09:11:31.637260 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-6xvqw" Jan 31 09:11:31 crc kubenswrapper[4732]: I0131 09:11:31.640061 4732 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-vsvzk" Jan 31 09:11:31 crc kubenswrapper[4732]: I0131 09:11:31.640263 4732 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Jan 31 09:11:31 crc kubenswrapper[4732]: I0131 09:11:31.640385 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Jan 31 09:11:31 crc kubenswrapper[4732]: I0131 09:11:31.644260 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-5l2kt"] Jan 31 09:11:31 crc kubenswrapper[4732]: I0131 09:11:31.645085 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-5l2kt" Jan 31 09:11:31 crc kubenswrapper[4732]: I0131 09:11:31.647929 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-5l2kt"] Jan 31 09:11:31 crc kubenswrapper[4732]: I0131 09:11:31.656864 4732 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Jan 31 09:11:31 crc kubenswrapper[4732]: I0131 09:11:31.716624 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-gcmq2"] Jan 31 09:11:31 crc kubenswrapper[4732]: I0131 09:11:31.717482 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-gcmq2" Jan 31 09:11:31 crc kubenswrapper[4732]: I0131 09:11:31.719974 4732 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Jan 31 09:11:31 crc kubenswrapper[4732]: I0131 09:11:31.720337 4732 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Jan 31 09:11:31 crc kubenswrapper[4732]: I0131 09:11:31.720906 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Jan 31 09:11:31 crc kubenswrapper[4732]: I0131 09:11:31.722162 4732 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-qlvvw" Jan 31 09:11:31 crc kubenswrapper[4732]: I0131 09:11:31.748010 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-6968d8fdc4-jq8g8"] Jan 31 09:11:31 crc kubenswrapper[4732]: I0131 09:11:31.749369 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-6968d8fdc4-jq8g8" Jan 31 09:11:31 crc kubenswrapper[4732]: I0131 09:11:31.752913 4732 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Jan 31 09:11:31 crc kubenswrapper[4732]: I0131 09:11:31.762192 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6968d8fdc4-jq8g8"] Jan 31 09:11:31 crc kubenswrapper[4732]: I0131 09:11:31.764631 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/66e27417-1fb4-4ca9-b104-d3d335370f0d-frr-conf\") pod \"frr-k8s-6xvqw\" (UID: \"66e27417-1fb4-4ca9-b104-d3d335370f0d\") " pod="metallb-system/frr-k8s-6xvqw" Jan 31 09:11:31 crc kubenswrapper[4732]: I0131 09:11:31.764725 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/66e27417-1fb4-4ca9-b104-d3d335370f0d-frr-sockets\") pod \"frr-k8s-6xvqw\" (UID: \"66e27417-1fb4-4ca9-b104-d3d335370f0d\") " pod="metallb-system/frr-k8s-6xvqw" Jan 31 09:11:31 crc kubenswrapper[4732]: I0131 09:11:31.764760 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/66e27417-1fb4-4ca9-b104-d3d335370f0d-metrics-certs\") pod \"frr-k8s-6xvqw\" (UID: \"66e27417-1fb4-4ca9-b104-d3d335370f0d\") " pod="metallb-system/frr-k8s-6xvqw" Jan 31 09:11:31 crc kubenswrapper[4732]: I0131 09:11:31.764863 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4b09b4ac-95c1-4c31-99a0-12b38c3412ae-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-5l2kt\" (UID: \"4b09b4ac-95c1-4c31-99a0-12b38c3412ae\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-5l2kt" Jan 31 09:11:31 crc kubenswrapper[4732]: I0131 09:11:31.764919 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/66e27417-1fb4-4ca9-b104-d3d335370f0d-reloader\") pod \"frr-k8s-6xvqw\" (UID: \"66e27417-1fb4-4ca9-b104-d3d335370f0d\") " pod="metallb-system/frr-k8s-6xvqw" Jan 31 09:11:31 crc kubenswrapper[4732]: I0131 09:11:31.764945 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ln9xj\" (UniqueName: \"kubernetes.io/projected/66e27417-1fb4-4ca9-b104-d3d335370f0d-kube-api-access-ln9xj\") pod \"frr-k8s-6xvqw\" (UID: \"66e27417-1fb4-4ca9-b104-d3d335370f0d\") " pod="metallb-system/frr-k8s-6xvqw" Jan 31 09:11:31 crc kubenswrapper[4732]: I0131 09:11:31.764987 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/66e27417-1fb4-4ca9-b104-d3d335370f0d-frr-startup\") pod \"frr-k8s-6xvqw\" (UID: \"66e27417-1fb4-4ca9-b104-d3d335370f0d\") " pod="metallb-system/frr-k8s-6xvqw" Jan 31 09:11:31 crc kubenswrapper[4732]: I0131 09:11:31.765006 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/66e27417-1fb4-4ca9-b104-d3d335370f0d-metrics\") pod \"frr-k8s-6xvqw\" (UID: \"66e27417-1fb4-4ca9-b104-d3d335370f0d\") " pod="metallb-system/frr-k8s-6xvqw" Jan 31 09:11:31 crc kubenswrapper[4732]: I0131 09:11:31.765022 4732 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57sff\" (UniqueName: \"kubernetes.io/projected/4b09b4ac-95c1-4c31-99a0-12b38c3412ae-kube-api-access-57sff\") pod \"frr-k8s-webhook-server-7df86c4f6c-5l2kt\" (UID: \"4b09b4ac-95c1-4c31-99a0-12b38c3412ae\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-5l2kt" Jan 31 09:11:31 crc kubenswrapper[4732]: I0131 09:11:31.866723 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3fbb0c82-6b72-4313-94e2-3e71d27cf75f-metrics-certs\") pod \"speaker-gcmq2\" (UID: \"3fbb0c82-6b72-4313-94e2-3e71d27cf75f\") " pod="metallb-system/speaker-gcmq2" Jan 31 09:11:31 crc kubenswrapper[4732]: I0131 09:11:31.866792 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4b09b4ac-95c1-4c31-99a0-12b38c3412ae-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-5l2kt\" (UID: \"4b09b4ac-95c1-4c31-99a0-12b38c3412ae\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-5l2kt" Jan 31 09:11:31 crc kubenswrapper[4732]: I0131 09:11:31.866823 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/3fbb0c82-6b72-4313-94e2-3e71d27cf75f-metallb-excludel2\") pod \"speaker-gcmq2\" (UID: \"3fbb0c82-6b72-4313-94e2-3e71d27cf75f\") " pod="metallb-system/speaker-gcmq2" Jan 31 09:11:31 crc kubenswrapper[4732]: I0131 09:11:31.866853 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/3fbb0c82-6b72-4313-94e2-3e71d27cf75f-memberlist\") pod \"speaker-gcmq2\" (UID: \"3fbb0c82-6b72-4313-94e2-3e71d27cf75f\") " pod="metallb-system/speaker-gcmq2" Jan 31 09:11:31 crc kubenswrapper[4732]: I0131 09:11:31.866878 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/66e27417-1fb4-4ca9-b104-d3d335370f0d-reloader\") pod \"frr-k8s-6xvqw\" (UID: \"66e27417-1fb4-4ca9-b104-d3d335370f0d\") " pod="metallb-system/frr-k8s-6xvqw" Jan 31 09:11:31 crc kubenswrapper[4732]: I0131 09:11:31.866902 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/53b5272f-ac5c-4616-a427-28fc830d7392-cert\") pod \"controller-6968d8fdc4-jq8g8\" (UID: \"53b5272f-ac5c-4616-a427-28fc830d7392\") " pod="metallb-system/controller-6968d8fdc4-jq8g8" Jan 31 09:11:31 crc kubenswrapper[4732]: I0131 09:11:31.866930 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ln9xj\" (UniqueName: \"kubernetes.io/projected/66e27417-1fb4-4ca9-b104-d3d335370f0d-kube-api-access-ln9xj\") pod \"frr-k8s-6xvqw\" (UID: \"66e27417-1fb4-4ca9-b104-d3d335370f0d\") " pod="metallb-system/frr-k8s-6xvqw" Jan 31 09:11:31 crc kubenswrapper[4732]: I0131 09:11:31.866968 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57sff\" (UniqueName: \"kubernetes.io/projected/4b09b4ac-95c1-4c31-99a0-12b38c3412ae-kube-api-access-57sff\") pod \"frr-k8s-webhook-server-7df86c4f6c-5l2kt\" (UID: \"4b09b4ac-95c1-4c31-99a0-12b38c3412ae\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-5l2kt" Jan 31 09:11:31 crc kubenswrapper[4732]: I0131 09:11:31.867004 4732 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/66e27417-1fb4-4ca9-b104-d3d335370f0d-frr-startup\") pod \"frr-k8s-6xvqw\" (UID: \"66e27417-1fb4-4ca9-b104-d3d335370f0d\") " pod="metallb-system/frr-k8s-6xvqw" Jan 31 09:11:31 crc kubenswrapper[4732]: I0131 09:11:31.867024 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/66e27417-1fb4-4ca9-b104-d3d335370f0d-metrics\") pod \"frr-k8s-6xvqw\" (UID: \"66e27417-1fb4-4ca9-b104-d3d335370f0d\") " pod="metallb-system/frr-k8s-6xvqw" Jan 31 09:11:31 crc kubenswrapper[4732]: I0131 09:11:31.867059 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/53b5272f-ac5c-4616-a427-28fc830d7392-metrics-certs\") pod \"controller-6968d8fdc4-jq8g8\" (UID: \"53b5272f-ac5c-4616-a427-28fc830d7392\") " pod="metallb-system/controller-6968d8fdc4-jq8g8" Jan 31 09:11:31 crc kubenswrapper[4732]: I0131 09:11:31.867085 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/66e27417-1fb4-4ca9-b104-d3d335370f0d-frr-conf\") pod \"frr-k8s-6xvqw\" (UID: \"66e27417-1fb4-4ca9-b104-d3d335370f0d\") " pod="metallb-system/frr-k8s-6xvqw" Jan 31 09:11:31 crc kubenswrapper[4732]: I0131 09:11:31.867115 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8wlp\" (UniqueName: \"kubernetes.io/projected/3fbb0c82-6b72-4313-94e2-3e71d27cf75f-kube-api-access-c8wlp\") pod \"speaker-gcmq2\" (UID: \"3fbb0c82-6b72-4313-94e2-3e71d27cf75f\") " pod="metallb-system/speaker-gcmq2" Jan 31 09:11:31 crc kubenswrapper[4732]: I0131 09:11:31.867145 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58snc\" (UniqueName: \"kubernetes.io/projected/53b5272f-ac5c-4616-a427-28fc830d7392-kube-api-access-58snc\") pod \"controller-6968d8fdc4-jq8g8\" (UID: \"53b5272f-ac5c-4616-a427-28fc830d7392\") " pod="metallb-system/controller-6968d8fdc4-jq8g8" Jan 31 09:11:31 crc kubenswrapper[4732]: I0131 09:11:31.867171 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/66e27417-1fb4-4ca9-b104-d3d335370f0d-frr-sockets\") pod \"frr-k8s-6xvqw\" (UID: \"66e27417-1fb4-4ca9-b104-d3d335370f0d\") " pod="metallb-system/frr-k8s-6xvqw" Jan 31 09:11:31 crc kubenswrapper[4732]: I0131 09:11:31.867200 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/66e27417-1fb4-4ca9-b104-d3d335370f0d-metrics-certs\") pod \"frr-k8s-6xvqw\" (UID: \"66e27417-1fb4-4ca9-b104-d3d335370f0d\") " pod="metallb-system/frr-k8s-6xvqw" Jan 31 09:11:31 crc kubenswrapper[4732]: E0131 09:11:31.867930 4732 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Jan 31 09:11:31 crc kubenswrapper[4732]: E0131 09:11:31.868103 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4b09b4ac-95c1-4c31-99a0-12b38c3412ae-cert podName:4b09b4ac-95c1-4c31-99a0-12b38c3412ae nodeName:}" failed. No retries permitted until 2026-01-31 09:11:32.368080103 +0000 UTC m=+630.673956397 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4b09b4ac-95c1-4c31-99a0-12b38c3412ae-cert") pod "frr-k8s-webhook-server-7df86c4f6c-5l2kt" (UID: "4b09b4ac-95c1-4c31-99a0-12b38c3412ae") : secret "frr-k8s-webhook-server-cert" not found Jan 31 09:11:31 crc kubenswrapper[4732]: I0131 09:11:31.869132 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/66e27417-1fb4-4ca9-b104-d3d335370f0d-metrics\") pod \"frr-k8s-6xvqw\" (UID: \"66e27417-1fb4-4ca9-b104-d3d335370f0d\") " pod="metallb-system/frr-k8s-6xvqw" Jan 31 09:11:31 crc kubenswrapper[4732]: I0131 09:11:31.869152 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/66e27417-1fb4-4ca9-b104-d3d335370f0d-frr-sockets\") pod \"frr-k8s-6xvqw\" (UID: \"66e27417-1fb4-4ca9-b104-d3d335370f0d\") " pod="metallb-system/frr-k8s-6xvqw" Jan 31 09:11:31 crc kubenswrapper[4732]: I0131 09:11:31.869346 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/66e27417-1fb4-4ca9-b104-d3d335370f0d-frr-conf\") pod \"frr-k8s-6xvqw\" (UID: \"66e27417-1fb4-4ca9-b104-d3d335370f0d\") " pod="metallb-system/frr-k8s-6xvqw" Jan 31 09:11:31 crc kubenswrapper[4732]: I0131 09:11:31.869595 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/66e27417-1fb4-4ca9-b104-d3d335370f0d-reloader\") pod \"frr-k8s-6xvqw\" (UID: \"66e27417-1fb4-4ca9-b104-d3d335370f0d\") " pod="metallb-system/frr-k8s-6xvqw" Jan 31 09:11:31 crc kubenswrapper[4732]: I0131 09:11:31.869921 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/66e27417-1fb4-4ca9-b104-d3d335370f0d-frr-startup\") pod \"frr-k8s-6xvqw\" (UID: \"66e27417-1fb4-4ca9-b104-d3d335370f0d\") " pod="metallb-system/frr-k8s-6xvqw" Jan 31 09:11:31 crc kubenswrapper[4732]: I0131 09:11:31.887921 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/66e27417-1fb4-4ca9-b104-d3d335370f0d-metrics-certs\") pod \"frr-k8s-6xvqw\" (UID: \"66e27417-1fb4-4ca9-b104-d3d335370f0d\") " pod="metallb-system/frr-k8s-6xvqw" Jan 31 09:11:31 crc kubenswrapper[4732]: I0131 09:11:31.890802 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ln9xj\" (UniqueName: \"kubernetes.io/projected/66e27417-1fb4-4ca9-b104-d3d335370f0d-kube-api-access-ln9xj\") pod \"frr-k8s-6xvqw\" (UID: \"66e27417-1fb4-4ca9-b104-d3d335370f0d\") " pod="metallb-system/frr-k8s-6xvqw" Jan 31 09:11:31 crc kubenswrapper[4732]: I0131 09:11:31.891699 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57sff\" (UniqueName: \"kubernetes.io/projected/4b09b4ac-95c1-4c31-99a0-12b38c3412ae-kube-api-access-57sff\") pod \"frr-k8s-webhook-server-7df86c4f6c-5l2kt\" (UID: \"4b09b4ac-95c1-4c31-99a0-12b38c3412ae\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-5l2kt" Jan 31 09:11:31 crc kubenswrapper[4732]: I0131 09:11:31.961524 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-6xvqw" Jan 31 09:11:31 crc kubenswrapper[4732]: I0131 09:11:31.968719 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/53b5272f-ac5c-4616-a427-28fc830d7392-metrics-certs\") pod \"controller-6968d8fdc4-jq8g8\" (UID: \"53b5272f-ac5c-4616-a427-28fc830d7392\") " pod="metallb-system/controller-6968d8fdc4-jq8g8" Jan 31 09:11:31 crc kubenswrapper[4732]: I0131 09:11:31.968767 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8wlp\" (UniqueName: \"kubernetes.io/projected/3fbb0c82-6b72-4313-94e2-3e71d27cf75f-kube-api-access-c8wlp\") pod \"speaker-gcmq2\" (UID: \"3fbb0c82-6b72-4313-94e2-3e71d27cf75f\") " pod="metallb-system/speaker-gcmq2" Jan 31 09:11:31 crc kubenswrapper[4732]: I0131 09:11:31.968795 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58snc\" (UniqueName: \"kubernetes.io/projected/53b5272f-ac5c-4616-a427-28fc830d7392-kube-api-access-58snc\") pod \"controller-6968d8fdc4-jq8g8\" (UID: \"53b5272f-ac5c-4616-a427-28fc830d7392\") " pod="metallb-system/controller-6968d8fdc4-jq8g8" Jan 31 09:11:31 crc kubenswrapper[4732]: I0131 09:11:31.968843 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3fbb0c82-6b72-4313-94e2-3e71d27cf75f-metrics-certs\") pod \"speaker-gcmq2\" (UID: \"3fbb0c82-6b72-4313-94e2-3e71d27cf75f\") " pod="metallb-system/speaker-gcmq2" Jan 31 09:11:31 crc kubenswrapper[4732]: I0131 09:11:31.968878 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/3fbb0c82-6b72-4313-94e2-3e71d27cf75f-metallb-excludel2\") pod \"speaker-gcmq2\" (UID: \"3fbb0c82-6b72-4313-94e2-3e71d27cf75f\") " pod="metallb-system/speaker-gcmq2" Jan 31 09:11:31 crc kubenswrapper[4732]: I0131 09:11:31.968897 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/3fbb0c82-6b72-4313-94e2-3e71d27cf75f-memberlist\") pod \"speaker-gcmq2\" (UID: \"3fbb0c82-6b72-4313-94e2-3e71d27cf75f\") " pod="metallb-system/speaker-gcmq2" Jan 31 09:11:31 crc kubenswrapper[4732]: I0131 09:11:31.968914 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/53b5272f-ac5c-4616-a427-28fc830d7392-cert\") pod \"controller-6968d8fdc4-jq8g8\" (UID: \"53b5272f-ac5c-4616-a427-28fc830d7392\") " pod="metallb-system/controller-6968d8fdc4-jq8g8" Jan 31 09:11:31 crc kubenswrapper[4732]: E0131 09:11:31.969540 4732 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Jan 31 09:11:31 crc kubenswrapper[4732]: E0131 09:11:31.969842 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3fbb0c82-6b72-4313-94e2-3e71d27cf75f-memberlist podName:3fbb0c82-6b72-4313-94e2-3e71d27cf75f nodeName:}" failed. No retries permitted until 2026-01-31 09:11:32.469809507 +0000 UTC m=+630.775685711 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/3fbb0c82-6b72-4313-94e2-3e71d27cf75f-memberlist") pod "speaker-gcmq2" (UID: "3fbb0c82-6b72-4313-94e2-3e71d27cf75f") : secret "metallb-memberlist" not found Jan 31 09:11:31 crc kubenswrapper[4732]: E0131 09:11:31.969927 4732 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Jan 31 09:11:31 crc kubenswrapper[4732]: E0131 09:11:31.970123 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53b5272f-ac5c-4616-a427-28fc830d7392-metrics-certs podName:53b5272f-ac5c-4616-a427-28fc830d7392 nodeName:}" failed. No retries permitted until 2026-01-31 09:11:32.470108837 +0000 UTC m=+630.775985121 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/53b5272f-ac5c-4616-a427-28fc830d7392-metrics-certs") pod "controller-6968d8fdc4-jq8g8" (UID: "53b5272f-ac5c-4616-a427-28fc830d7392") : secret "controller-certs-secret" not found Jan 31 09:11:31 crc kubenswrapper[4732]: I0131 09:11:31.970272 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/3fbb0c82-6b72-4313-94e2-3e71d27cf75f-metallb-excludel2\") pod \"speaker-gcmq2\" (UID: \"3fbb0c82-6b72-4313-94e2-3e71d27cf75f\") " pod="metallb-system/speaker-gcmq2" Jan 31 09:11:31 crc kubenswrapper[4732]: I0131 09:11:31.972018 4732 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Jan 31 09:11:31 crc kubenswrapper[4732]: I0131 09:11:31.973555 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3fbb0c82-6b72-4313-94e2-3e71d27cf75f-metrics-certs\") pod \"speaker-gcmq2\" (UID: \"3fbb0c82-6b72-4313-94e2-3e71d27cf75f\") " pod="metallb-system/speaker-gcmq2" Jan 31 09:11:31 crc kubenswrapper[4732]: I0131 09:11:31.987841 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/53b5272f-ac5c-4616-a427-28fc830d7392-cert\") pod \"controller-6968d8fdc4-jq8g8\" (UID: \"53b5272f-ac5c-4616-a427-28fc830d7392\") " pod="metallb-system/controller-6968d8fdc4-jq8g8" Jan 31 09:11:31 crc kubenswrapper[4732]: I0131 09:11:31.990804 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8wlp\" (UniqueName: \"kubernetes.io/projected/3fbb0c82-6b72-4313-94e2-3e71d27cf75f-kube-api-access-c8wlp\") pod \"speaker-gcmq2\" (UID: \"3fbb0c82-6b72-4313-94e2-3e71d27cf75f\") " pod="metallb-system/speaker-gcmq2" Jan 31 09:11:31 crc kubenswrapper[4732]: I0131 09:11:31.992425 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58snc\" (UniqueName: \"kubernetes.io/projected/53b5272f-ac5c-4616-a427-28fc830d7392-kube-api-access-58snc\") pod \"controller-6968d8fdc4-jq8g8\" (UID: \"53b5272f-ac5c-4616-a427-28fc830d7392\") " pod="metallb-system/controller-6968d8fdc4-jq8g8" Jan 31 09:11:32 crc kubenswrapper[4732]: I0131 09:11:32.373974 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4b09b4ac-95c1-4c31-99a0-12b38c3412ae-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-5l2kt\" (UID: \"4b09b4ac-95c1-4c31-99a0-12b38c3412ae\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-5l2kt" Jan 31 09:11:32 crc kubenswrapper[4732]: I0131 09:11:32.383528 4732 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4b09b4ac-95c1-4c31-99a0-12b38c3412ae-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-5l2kt\" (UID: \"4b09b4ac-95c1-4c31-99a0-12b38c3412ae\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-5l2kt" Jan 31 09:11:32 crc kubenswrapper[4732]: I0131 09:11:32.476149 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/3fbb0c82-6b72-4313-94e2-3e71d27cf75f-memberlist\") pod \"speaker-gcmq2\" (UID: \"3fbb0c82-6b72-4313-94e2-3e71d27cf75f\") " pod="metallb-system/speaker-gcmq2" Jan 31 09:11:32 crc kubenswrapper[4732]: I0131 09:11:32.476246 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/53b5272f-ac5c-4616-a427-28fc830d7392-metrics-certs\") pod \"controller-6968d8fdc4-jq8g8\" (UID: \"53b5272f-ac5c-4616-a427-28fc830d7392\") " pod="metallb-system/controller-6968d8fdc4-jq8g8" Jan 31 09:11:32 crc kubenswrapper[4732]: E0131 09:11:32.476313 4732 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Jan 31 09:11:32 crc kubenswrapper[4732]: E0131 09:11:32.476397 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3fbb0c82-6b72-4313-94e2-3e71d27cf75f-memberlist podName:3fbb0c82-6b72-4313-94e2-3e71d27cf75f nodeName:}" failed. No retries permitted until 2026-01-31 09:11:33.476376451 +0000 UTC m=+631.782252665 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/3fbb0c82-6b72-4313-94e2-3e71d27cf75f-memberlist") pod "speaker-gcmq2" (UID: "3fbb0c82-6b72-4313-94e2-3e71d27cf75f") : secret "metallb-memberlist" not found Jan 31 09:11:32 crc kubenswrapper[4732]: I0131 09:11:32.480840 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/53b5272f-ac5c-4616-a427-28fc830d7392-metrics-certs\") pod \"controller-6968d8fdc4-jq8g8\" (UID: \"53b5272f-ac5c-4616-a427-28fc830d7392\") " pod="metallb-system/controller-6968d8fdc4-jq8g8" Jan 31 09:11:32 crc kubenswrapper[4732]: I0131 09:11:32.528292 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-6xvqw" event={"ID":"66e27417-1fb4-4ca9-b104-d3d335370f0d","Type":"ContainerStarted","Data":"877b0278a4b6e1b30b2daf925c6dcdeea13d341f02b35472839810c10daea2cd"} Jan 31 09:11:32 crc kubenswrapper[4732]: I0131 09:11:32.569781 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-5l2kt" Jan 31 09:11:32 crc kubenswrapper[4732]: I0131 09:11:32.665266 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-6968d8fdc4-jq8g8" Jan 31 09:11:32 crc kubenswrapper[4732]: I0131 09:11:32.845201 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6968d8fdc4-jq8g8"] Jan 31 09:11:32 crc kubenswrapper[4732]: W0131 09:11:32.857338 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53b5272f_ac5c_4616_a427_28fc830d7392.slice/crio-9aba267b95b4b970dccb5a2b68ba16102c31510aaf08510acef1b3802eb834b1 WatchSource:0}: Error finding container 9aba267b95b4b970dccb5a2b68ba16102c31510aaf08510acef1b3802eb834b1: Status 404 returned error can't find the container with id 9aba267b95b4b970dccb5a2b68ba16102c31510aaf08510acef1b3802eb834b1 Jan 31 09:11:32 crc kubenswrapper[4732]: I0131 09:11:32.987748 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-5l2kt"] Jan 31 09:11:33 crc kubenswrapper[4732]: I0131 09:11:33.487842 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/3fbb0c82-6b72-4313-94e2-3e71d27cf75f-memberlist\") pod \"speaker-gcmq2\" (UID: \"3fbb0c82-6b72-4313-94e2-3e71d27cf75f\") " pod="metallb-system/speaker-gcmq2" Jan 31 09:11:33 crc kubenswrapper[4732]: I0131 09:11:33.508263 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/3fbb0c82-6b72-4313-94e2-3e71d27cf75f-memberlist\") pod \"speaker-gcmq2\" (UID: \"3fbb0c82-6b72-4313-94e2-3e71d27cf75f\") " pod="metallb-system/speaker-gcmq2" Jan 31 09:11:33 crc kubenswrapper[4732]: I0131 09:11:33.532217 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-gcmq2" Jan 31 09:11:33 crc kubenswrapper[4732]: I0131 09:11:33.534820 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-jq8g8" event={"ID":"53b5272f-ac5c-4616-a427-28fc830d7392","Type":"ContainerStarted","Data":"9b64846d479ae0c930962568ce3d86480c7ebfbad126f6da8f0410d10f039d73"} Jan 31 09:11:33 crc kubenswrapper[4732]: I0131 09:11:33.534856 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-jq8g8" event={"ID":"53b5272f-ac5c-4616-a427-28fc830d7392","Type":"ContainerStarted","Data":"9aba267b95b4b970dccb5a2b68ba16102c31510aaf08510acef1b3802eb834b1"} Jan 31 09:11:33 crc kubenswrapper[4732]: I0131 09:11:33.538922 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-5l2kt" event={"ID":"4b09b4ac-95c1-4c31-99a0-12b38c3412ae","Type":"ContainerStarted","Data":"664da7da90ef74957fafe8fc535de0f58058c92cc3d30fbbb8064763faa14df3"} Jan 31 09:11:33 crc kubenswrapper[4732]: W0131 09:11:33.572250 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3fbb0c82_6b72_4313_94e2_3e71d27cf75f.slice/crio-43bfef5c0d5360e9f8ca58a5e973bf4c974077f1d5a0239fb1fe38a2f1214a36 WatchSource:0}: Error finding container 43bfef5c0d5360e9f8ca58a5e973bf4c974077f1d5a0239fb1fe38a2f1214a36: Status 404 returned error can't find the container with id 43bfef5c0d5360e9f8ca58a5e973bf4c974077f1d5a0239fb1fe38a2f1214a36 Jan 31 09:11:34 crc kubenswrapper[4732]: I0131 09:11:34.557732 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-gcmq2" 
event={"ID":"3fbb0c82-6b72-4313-94e2-3e71d27cf75f","Type":"ContainerStarted","Data":"24175cffda77a64be4617731ed7ada49bd086d0d60b0c4fb51dd9840bc67cd32"} Jan 31 09:11:34 crc kubenswrapper[4732]: I0131 09:11:34.558069 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-gcmq2" event={"ID":"3fbb0c82-6b72-4313-94e2-3e71d27cf75f","Type":"ContainerStarted","Data":"43bfef5c0d5360e9f8ca58a5e973bf4c974077f1d5a0239fb1fe38a2f1214a36"} Jan 31 09:11:37 crc kubenswrapper[4732]: I0131 09:11:37.576476 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-gcmq2" event={"ID":"3fbb0c82-6b72-4313-94e2-3e71d27cf75f","Type":"ContainerStarted","Data":"85398a4d05a049926254268f96773bb7ec09fbba31a8f09acd0c7e2e6186e5a5"} Jan 31 09:11:37 crc kubenswrapper[4732]: I0131 09:11:37.578008 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-gcmq2" Jan 31 09:11:37 crc kubenswrapper[4732]: I0131 09:11:37.581597 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-jq8g8" event={"ID":"53b5272f-ac5c-4616-a427-28fc830d7392","Type":"ContainerStarted","Data":"ce64b6957d9463469d5ae7f256979aae906fbdcc5f67db85e42734c490543530"} Jan 31 09:11:37 crc kubenswrapper[4732]: I0131 09:11:37.582332 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-6968d8fdc4-jq8g8" Jan 31 09:11:37 crc kubenswrapper[4732]: I0131 09:11:37.603381 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-gcmq2" podStartSLOduration=3.76279723 podStartE2EDuration="6.603355935s" podCreationTimestamp="2026-01-31 09:11:31 +0000 UTC" firstStartedPulling="2026-01-31 09:11:33.828208324 +0000 UTC m=+632.134084528" lastFinishedPulling="2026-01-31 09:11:36.668767029 +0000 UTC m=+634.974643233" observedRunningTime="2026-01-31 09:11:37.598226051 +0000 UTC m=+635.904102295" watchObservedRunningTime="2026-01-31 09:11:37.603355935 +0000 UTC m=+635.909232139" Jan 31 09:11:37 crc kubenswrapper[4732]: I0131 09:11:37.621103 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-6968d8fdc4-jq8g8" podStartSLOduration=2.975426903 podStartE2EDuration="6.621088863s" podCreationTimestamp="2026-01-31 09:11:31 +0000 UTC" firstStartedPulling="2026-01-31 09:11:33.015829437 +0000 UTC m=+631.321705641" lastFinishedPulling="2026-01-31 09:11:36.661491397 +0000 UTC m=+634.967367601" observedRunningTime="2026-01-31 09:11:37.616434134 +0000 UTC m=+635.922310338" watchObservedRunningTime="2026-01-31 09:11:37.621088863 +0000 UTC m=+635.926965067" Jan 31 09:11:40 crc kubenswrapper[4732]: I0131 09:11:40.597603 4732 generic.go:334] "Generic (PLEG): container finished" podID="66e27417-1fb4-4ca9-b104-d3d335370f0d" containerID="17bbc9855a2544b97a95d8f2b0fdd53c711d64083c9abcabe91ce4e910412695" exitCode=0 Jan 31 09:11:40 crc kubenswrapper[4732]: I0131 09:11:40.597716 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-6xvqw" event={"ID":"66e27417-1fb4-4ca9-b104-d3d335370f0d","Type":"ContainerDied","Data":"17bbc9855a2544b97a95d8f2b0fdd53c711d64083c9abcabe91ce4e910412695"} Jan 31 09:11:40 crc kubenswrapper[4732]: I0131 09:11:40.599753 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-5l2kt" event={"ID":"4b09b4ac-95c1-4c31-99a0-12b38c3412ae","Type":"ContainerStarted","Data":"ff6b38e11ad90eab008de94627816f6752ead6b47a4056f58a8636e7cbc5d5bb"} 
Jan 31 09:11:40 crc kubenswrapper[4732]: I0131 09:11:40.599938 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-5l2kt" Jan 31 09:11:40 crc kubenswrapper[4732]: I0131 09:11:40.639653 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-5l2kt" podStartSLOduration=2.9852749579999998 podStartE2EDuration="9.639626381s" podCreationTimestamp="2026-01-31 09:11:31 +0000 UTC" firstStartedPulling="2026-01-31 09:11:32.99684151 +0000 UTC m=+631.302717734" lastFinishedPulling="2026-01-31 09:11:39.651192953 +0000 UTC m=+637.957069157" observedRunningTime="2026-01-31 09:11:40.636492561 +0000 UTC m=+638.942368785" watchObservedRunningTime="2026-01-31 09:11:40.639626381 +0000 UTC m=+638.945502625" Jan 31 09:11:41 crc kubenswrapper[4732]: I0131 09:11:41.608966 4732 generic.go:334] "Generic (PLEG): container finished" podID="66e27417-1fb4-4ca9-b104-d3d335370f0d" containerID="f6b1222c50e8dc0c520cfd01300316178bbf984c000355ff97ab109558e3e51b" exitCode=0 Jan 31 09:11:41 crc kubenswrapper[4732]: I0131 09:11:41.609035 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-6xvqw" event={"ID":"66e27417-1fb4-4ca9-b104-d3d335370f0d","Type":"ContainerDied","Data":"f6b1222c50e8dc0c520cfd01300316178bbf984c000355ff97ab109558e3e51b"} Jan 31 09:11:42 crc kubenswrapper[4732]: I0131 09:11:42.618367 4732 generic.go:334] "Generic (PLEG): container finished" podID="66e27417-1fb4-4ca9-b104-d3d335370f0d" containerID="d53fa6cf9963c28e888cd5cc8e0ae0167cc35089e4775e6e47dd17b931d7e185" exitCode=0 Jan 31 09:11:42 crc kubenswrapper[4732]: I0131 09:11:42.618468 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-6xvqw" event={"ID":"66e27417-1fb4-4ca9-b104-d3d335370f0d","Type":"ContainerDied","Data":"d53fa6cf9963c28e888cd5cc8e0ae0167cc35089e4775e6e47dd17b931d7e185"} Jan 31 09:11:43 crc kubenswrapper[4732]: I0131 09:11:43.535927 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-gcmq2" Jan 31 09:11:43 crc kubenswrapper[4732]: I0131 09:11:43.628271 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-6xvqw" event={"ID":"66e27417-1fb4-4ca9-b104-d3d335370f0d","Type":"ContainerStarted","Data":"8de5087c7410064cf47cb82bff42f314320a14609cbf14becd65f8b953e1d6cb"} Jan 31 09:11:43 crc kubenswrapper[4732]: I0131 09:11:43.628307 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-6xvqw" event={"ID":"66e27417-1fb4-4ca9-b104-d3d335370f0d","Type":"ContainerStarted","Data":"cf031ce87d3a9e971494476229ce8e2facfdb4482fa5e2ba8680ee58a83243c6"} Jan 31 09:11:43 crc kubenswrapper[4732]: I0131 09:11:43.628318 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-6xvqw" event={"ID":"66e27417-1fb4-4ca9-b104-d3d335370f0d","Type":"ContainerStarted","Data":"ada6df0ddf7708e4acce2218c2f10c69c07e46a95fe1694346941044582d8ced"} Jan 31 09:11:43 crc kubenswrapper[4732]: I0131 09:11:43.628326 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-6xvqw" event={"ID":"66e27417-1fb4-4ca9-b104-d3d335370f0d","Type":"ContainerStarted","Data":"fa4c2e2585a2e15ab8246d8bd459938c11f8cd61eed1485893cfcf919a42705b"} Jan 31 09:11:43 crc kubenswrapper[4732]: I0131 09:11:43.628333 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-6xvqw" 
event={"ID":"66e27417-1fb4-4ca9-b104-d3d335370f0d","Type":"ContainerStarted","Data":"2f01cccd3bb714c4daceba533455375c146aa0f9592138bcc1240b752970604a"} Jan 31 09:11:43 crc kubenswrapper[4732]: I0131 09:11:43.628341 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-6xvqw" event={"ID":"66e27417-1fb4-4ca9-b104-d3d335370f0d","Type":"ContainerStarted","Data":"19352b10d242364a4618e968d8a1321ec35b8a7b8c4123b00617e9ecf000c3db"} Jan 31 09:11:43 crc kubenswrapper[4732]: I0131 09:11:43.628438 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-6xvqw" Jan 31 09:11:43 crc kubenswrapper[4732]: I0131 09:11:43.657624 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-6xvqw" podStartSLOduration=5.150623854 podStartE2EDuration="12.657607491s" podCreationTimestamp="2026-01-31 09:11:31 +0000 UTC" firstStartedPulling="2026-01-31 09:11:32.126798019 +0000 UTC m=+630.432674213" lastFinishedPulling="2026-01-31 09:11:39.633781646 +0000 UTC m=+637.939657850" observedRunningTime="2026-01-31 09:11:43.656152644 +0000 UTC m=+641.962028908" watchObservedRunningTime="2026-01-31 09:11:43.657607491 +0000 UTC m=+641.963483705" Jan 31 09:11:46 crc kubenswrapper[4732]: I0131 09:11:46.962733 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-6xvqw" Jan 31 09:11:47 crc kubenswrapper[4732]: I0131 09:11:47.034498 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-6xvqw" Jan 31 09:11:49 crc kubenswrapper[4732]: I0131 09:11:49.220954 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-index-rdb96"] Jan 31 09:11:49 crc kubenswrapper[4732]: I0131 09:11:49.221635 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-index-rdb96" Jan 31 09:11:49 crc kubenswrapper[4732]: I0131 09:11:49.224965 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-index-dockercfg-7wfqk" Jan 31 09:11:49 crc kubenswrapper[4732]: I0131 09:11:49.225387 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Jan 31 09:11:49 crc kubenswrapper[4732]: I0131 09:11:49.226587 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Jan 31 09:11:49 crc kubenswrapper[4732]: I0131 09:11:49.248997 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-rdb96"] Jan 31 09:11:49 crc kubenswrapper[4732]: I0131 09:11:49.315753 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qn9b6\" (UniqueName: \"kubernetes.io/projected/58ea6948-d466-40a4-8953-23c4043c1f38-kube-api-access-qn9b6\") pod \"mariadb-operator-index-rdb96\" (UID: \"58ea6948-d466-40a4-8953-23c4043c1f38\") " pod="openstack-operators/mariadb-operator-index-rdb96" Jan 31 09:11:49 crc kubenswrapper[4732]: I0131 09:11:49.417184 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qn9b6\" (UniqueName: \"kubernetes.io/projected/58ea6948-d466-40a4-8953-23c4043c1f38-kube-api-access-qn9b6\") pod \"mariadb-operator-index-rdb96\" (UID: \"58ea6948-d466-40a4-8953-23c4043c1f38\") " pod="openstack-operators/mariadb-operator-index-rdb96" Jan 31 09:11:49 crc kubenswrapper[4732]: I0131 09:11:49.439619 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qn9b6\" (UniqueName: \"kubernetes.io/projected/58ea6948-d466-40a4-8953-23c4043c1f38-kube-api-access-qn9b6\") pod \"mariadb-operator-index-rdb96\" (UID: \"58ea6948-d466-40a4-8953-23c4043c1f38\") " pod="openstack-operators/mariadb-operator-index-rdb96" Jan 31 09:11:49 crc kubenswrapper[4732]: I0131 09:11:49.543823 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-index-rdb96" Jan 31 09:11:49 crc kubenswrapper[4732]: I0131 09:11:49.974476 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-rdb96"] Jan 31 09:11:49 crc kubenswrapper[4732]: W0131 09:11:49.982762 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58ea6948_d466_40a4_8953_23c4043c1f38.slice/crio-01973f9f12fcaf815714cf7f987e491baba8870f0389c2831a780a93f9a000ee WatchSource:0}: Error finding container 01973f9f12fcaf815714cf7f987e491baba8870f0389c2831a780a93f9a000ee: Status 404 returned error can't find the container with id 01973f9f12fcaf815714cf7f987e491baba8870f0389c2831a780a93f9a000ee Jan 31 09:11:50 crc kubenswrapper[4732]: I0131 09:11:50.677371 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-rdb96" event={"ID":"58ea6948-d466-40a4-8953-23c4043c1f38","Type":"ContainerStarted","Data":"01973f9f12fcaf815714cf7f987e491baba8870f0389c2831a780a93f9a000ee"} Jan 31 09:11:51 crc kubenswrapper[4732]: I0131 09:11:51.684777 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-rdb96" event={"ID":"58ea6948-d466-40a4-8953-23c4043c1f38","Type":"ContainerStarted","Data":"0b71175788c7ad27d984a970d1b34e5416424e41f381a1e72319c94a101faeb6"} Jan 31 09:11:51 crc kubenswrapper[4732]: I0131 09:11:51.701701 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-index-rdb96" podStartSLOduration=1.173812748 podStartE2EDuration="2.701632763s" podCreationTimestamp="2026-01-31 09:11:49 +0000 UTC" firstStartedPulling="2026-01-31 09:11:49.984692453 +0000 UTC m=+648.290568657" lastFinishedPulling="2026-01-31 09:11:51.512512418 +0000 UTC m=+649.818388672" observedRunningTime="2026-01-31 09:11:51.700042252 +0000 UTC m=+650.005918466" watchObservedRunningTime="2026-01-31 09:11:51.701632763 +0000 UTC m=+650.007508967" Jan 31 09:11:52 crc kubenswrapper[4732]: I0131 09:11:52.584585 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-5l2kt" Jan 31 09:11:52 crc kubenswrapper[4732]: I0131 09:11:52.602876 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-rdb96"] Jan 31 09:11:52 crc kubenswrapper[4732]: I0131 09:11:52.672391 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-6968d8fdc4-jq8g8" Jan 31 09:11:53 crc kubenswrapper[4732]: I0131 09:11:53.206735 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-index-b545g"] Jan 31 09:11:53 crc kubenswrapper[4732]: I0131 09:11:53.208255 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-index-b545g" Jan 31 09:11:53 crc kubenswrapper[4732]: I0131 09:11:53.211133 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-b545g"] Jan 31 09:11:53 crc kubenswrapper[4732]: I0131 09:11:53.266891 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jl5c8\" (UniqueName: \"kubernetes.io/projected/e502d767-ed18-4540-8d2c-ffd993e4822d-kube-api-access-jl5c8\") pod \"mariadb-operator-index-b545g\" (UID: \"e502d767-ed18-4540-8d2c-ffd993e4822d\") " pod="openstack-operators/mariadb-operator-index-b545g" Jan 31 09:11:53 crc kubenswrapper[4732]: I0131 09:11:53.368357 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jl5c8\" (UniqueName: \"kubernetes.io/projected/e502d767-ed18-4540-8d2c-ffd993e4822d-kube-api-access-jl5c8\") pod \"mariadb-operator-index-b545g\" (UID: \"e502d767-ed18-4540-8d2c-ffd993e4822d\") " pod="openstack-operators/mariadb-operator-index-b545g" Jan 31 09:11:53 crc kubenswrapper[4732]: I0131 09:11:53.385355 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jl5c8\" (UniqueName: \"kubernetes.io/projected/e502d767-ed18-4540-8d2c-ffd993e4822d-kube-api-access-jl5c8\") pod \"mariadb-operator-index-b545g\" (UID: \"e502d767-ed18-4540-8d2c-ffd993e4822d\") " pod="openstack-operators/mariadb-operator-index-b545g" Jan 31 09:11:53 crc kubenswrapper[4732]: I0131 09:11:53.576183 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-b545g" Jan 31 09:11:53 crc kubenswrapper[4732]: I0131 09:11:53.699493 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/mariadb-operator-index-rdb96" podUID="58ea6948-d466-40a4-8953-23c4043c1f38" containerName="registry-server" containerID="cri-o://0b71175788c7ad27d984a970d1b34e5416424e41f381a1e72319c94a101faeb6" gracePeriod=2 Jan 31 09:11:54 crc kubenswrapper[4732]: I0131 09:11:54.001731 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-b545g"] Jan 31 09:11:54 crc kubenswrapper[4732]: I0131 09:11:54.710021 4732 generic.go:334] "Generic (PLEG): container finished" podID="58ea6948-d466-40a4-8953-23c4043c1f38" containerID="0b71175788c7ad27d984a970d1b34e5416424e41f381a1e72319c94a101faeb6" exitCode=0 Jan 31 09:11:54 crc kubenswrapper[4732]: I0131 09:11:54.710113 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-rdb96" event={"ID":"58ea6948-d466-40a4-8953-23c4043c1f38","Type":"ContainerDied","Data":"0b71175788c7ad27d984a970d1b34e5416424e41f381a1e72319c94a101faeb6"} Jan 31 09:11:54 crc kubenswrapper[4732]: I0131 09:11:54.711684 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-b545g" event={"ID":"e502d767-ed18-4540-8d2c-ffd993e4822d","Type":"ContainerStarted","Data":"e336ec36e7b92b98ffa9cd023af81299e0d2a21836b58f3096392afd27981f20"} Jan 31 09:11:54 crc kubenswrapper[4732]: I0131 09:11:54.937344 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-index-rdb96" Jan 31 09:11:54 crc kubenswrapper[4732]: I0131 09:11:54.990092 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qn9b6\" (UniqueName: \"kubernetes.io/projected/58ea6948-d466-40a4-8953-23c4043c1f38-kube-api-access-qn9b6\") pod \"58ea6948-d466-40a4-8953-23c4043c1f38\" (UID: \"58ea6948-d466-40a4-8953-23c4043c1f38\") " Jan 31 09:11:54 crc kubenswrapper[4732]: I0131 09:11:54.996122 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58ea6948-d466-40a4-8953-23c4043c1f38-kube-api-access-qn9b6" (OuterVolumeSpecName: "kube-api-access-qn9b6") pod "58ea6948-d466-40a4-8953-23c4043c1f38" (UID: "58ea6948-d466-40a4-8953-23c4043c1f38"). InnerVolumeSpecName "kube-api-access-qn9b6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:11:55 crc kubenswrapper[4732]: I0131 09:11:55.091737 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qn9b6\" (UniqueName: \"kubernetes.io/projected/58ea6948-d466-40a4-8953-23c4043c1f38-kube-api-access-qn9b6\") on node \"crc\" DevicePath \"\"" Jan 31 09:11:55 crc kubenswrapper[4732]: I0131 09:11:55.723494 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-rdb96" event={"ID":"58ea6948-d466-40a4-8953-23c4043c1f38","Type":"ContainerDied","Data":"01973f9f12fcaf815714cf7f987e491baba8870f0389c2831a780a93f9a000ee"} Jan 31 09:11:55 crc kubenswrapper[4732]: I0131 09:11:55.723565 4732 scope.go:117] "RemoveContainer" containerID="0b71175788c7ad27d984a970d1b34e5416424e41f381a1e72319c94a101faeb6" Jan 31 09:11:55 crc kubenswrapper[4732]: I0131 09:11:55.723573 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-index-rdb96" Jan 31 09:11:55 crc kubenswrapper[4732]: I0131 09:11:55.729430 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-b545g" event={"ID":"e502d767-ed18-4540-8d2c-ffd993e4822d","Type":"ContainerStarted","Data":"e0a2d107fb0ab5da52d563b9ed97ddc9f7ce9c51dfb332362a03b85f4c2acfe7"} Jan 31 09:11:55 crc kubenswrapper[4732]: I0131 09:11:55.766296 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-index-b545g" podStartSLOduration=2.081128973 podStartE2EDuration="2.766273931s" podCreationTimestamp="2026-01-31 09:11:53 +0000 UTC" firstStartedPulling="2026-01-31 09:11:54.012191594 +0000 UTC m=+652.318067828" lastFinishedPulling="2026-01-31 09:11:54.697336542 +0000 UTC m=+653.003212786" observedRunningTime="2026-01-31 09:11:55.751336268 +0000 UTC m=+654.057212482" watchObservedRunningTime="2026-01-31 09:11:55.766273931 +0000 UTC m=+654.072150145" Jan 31 09:11:55 crc kubenswrapper[4732]: I0131 09:11:55.771008 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-rdb96"] Jan 31 09:11:55 crc kubenswrapper[4732]: I0131 09:11:55.774311 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/mariadb-operator-index-rdb96"] Jan 31 09:11:56 crc kubenswrapper[4732]: I0131 09:11:56.550135 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58ea6948-d466-40a4-8953-23c4043c1f38" path="/var/lib/kubelet/pods/58ea6948-d466-40a4-8953-23c4043c1f38/volumes" Jan 31 09:12:01 crc kubenswrapper[4732]: I0131 09:12:01.964957 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-6xvqw" Jan 31 09:12:03 crc kubenswrapper[4732]: I0131 09:12:03.577292 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/mariadb-operator-index-b545g" Jan 31 09:12:03 crc kubenswrapper[4732]: I0131 09:12:03.577778 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-index-b545g" Jan 31 09:12:03 crc kubenswrapper[4732]: I0131 09:12:03.615631 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/mariadb-operator-index-b545g" Jan 31 09:12:03 crc kubenswrapper[4732]: I0131 09:12:03.814032 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-index-b545g" Jan 31 09:12:10 crc kubenswrapper[4732]: I0131 09:12:10.645680 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f40gl664"] Jan 31 09:12:10 crc kubenswrapper[4732]: E0131 09:12:10.646417 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58ea6948-d466-40a4-8953-23c4043c1f38" containerName="registry-server" Jan 31 09:12:10 crc kubenswrapper[4732]: I0131 09:12:10.646439 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="58ea6948-d466-40a4-8953-23c4043c1f38" containerName="registry-server" Jan 31 09:12:10 crc kubenswrapper[4732]: I0131 09:12:10.646590 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="58ea6948-d466-40a4-8953-23c4043c1f38" containerName="registry-server" Jan 31 09:12:10 crc kubenswrapper[4732]: I0131 09:12:10.648505 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f40gl664" Jan 31 09:12:10 crc kubenswrapper[4732]: I0131 09:12:10.653315 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-tnztr" Jan 31 09:12:10 crc kubenswrapper[4732]: I0131 09:12:10.658684 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f40gl664"] Jan 31 09:12:10 crc kubenswrapper[4732]: I0131 09:12:10.697852 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25s6h\" (UniqueName: \"kubernetes.io/projected/2996ae7b-aedb-4a67-a98e-b1a466347be0-kube-api-access-25s6h\") pod \"f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f40gl664\" (UID: \"2996ae7b-aedb-4a67-a98e-b1a466347be0\") " pod="openstack-operators/f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f40gl664" Jan 31 09:12:10 crc kubenswrapper[4732]: I0131 09:12:10.697922 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2996ae7b-aedb-4a67-a98e-b1a466347be0-bundle\") pod \"f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f40gl664\" (UID: \"2996ae7b-aedb-4a67-a98e-b1a466347be0\") " pod="openstack-operators/f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f40gl664" Jan 31 09:12:10 crc kubenswrapper[4732]: I0131 09:12:10.698009 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2996ae7b-aedb-4a67-a98e-b1a466347be0-util\") pod \"f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f40gl664\" (UID: \"2996ae7b-aedb-4a67-a98e-b1a466347be0\") " pod="openstack-operators/f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f40gl664" Jan 31 09:12:10 crc kubenswrapper[4732]: I0131 09:12:10.798975 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2996ae7b-aedb-4a67-a98e-b1a466347be0-bundle\") pod \"f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f40gl664\" (UID: \"2996ae7b-aedb-4a67-a98e-b1a466347be0\") " pod="openstack-operators/f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f40gl664" Jan 31 09:12:10 crc kubenswrapper[4732]: I0131 09:12:10.799099 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2996ae7b-aedb-4a67-a98e-b1a466347be0-util\") pod \"f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f40gl664\" (UID: \"2996ae7b-aedb-4a67-a98e-b1a466347be0\") " pod="openstack-operators/f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f40gl664" Jan 31 09:12:10 crc kubenswrapper[4732]: I0131 09:12:10.799170 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25s6h\" (UniqueName: \"kubernetes.io/projected/2996ae7b-aedb-4a67-a98e-b1a466347be0-kube-api-access-25s6h\") pod \"f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f40gl664\" (UID: \"2996ae7b-aedb-4a67-a98e-b1a466347be0\") " pod="openstack-operators/f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f40gl664" Jan 31 09:12:10 crc kubenswrapper[4732]: I0131 09:12:10.799734 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/2996ae7b-aedb-4a67-a98e-b1a466347be0-bundle\") pod \"f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f40gl664\" (UID: \"2996ae7b-aedb-4a67-a98e-b1a466347be0\") " pod="openstack-operators/f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f40gl664" Jan 31 09:12:10 crc kubenswrapper[4732]: I0131 09:12:10.799776 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2996ae7b-aedb-4a67-a98e-b1a466347be0-util\") pod \"f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f40gl664\" (UID: \"2996ae7b-aedb-4a67-a98e-b1a466347be0\") " pod="openstack-operators/f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f40gl664" Jan 31 09:12:10 crc kubenswrapper[4732]: I0131 09:12:10.824650 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25s6h\" (UniqueName: \"kubernetes.io/projected/2996ae7b-aedb-4a67-a98e-b1a466347be0-kube-api-access-25s6h\") pod \"f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f40gl664\" (UID: \"2996ae7b-aedb-4a67-a98e-b1a466347be0\") " pod="openstack-operators/f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f40gl664" Jan 31 09:12:10 crc kubenswrapper[4732]: I0131 09:12:10.971943 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f40gl664" Jan 31 09:12:11 crc kubenswrapper[4732]: I0131 09:12:11.406112 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f40gl664"] Jan 31 09:12:11 crc kubenswrapper[4732]: I0131 09:12:11.859525 4732 generic.go:334] "Generic (PLEG): container finished" podID="2996ae7b-aedb-4a67-a98e-b1a466347be0" containerID="381b993ee77c54308c8a7301a323a8aafc300c3e038f7b063863ce23feb4f43f" exitCode=0 Jan 31 09:12:11 crc kubenswrapper[4732]: I0131 09:12:11.859572 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f40gl664" event={"ID":"2996ae7b-aedb-4a67-a98e-b1a466347be0","Type":"ContainerDied","Data":"381b993ee77c54308c8a7301a323a8aafc300c3e038f7b063863ce23feb4f43f"} Jan 31 09:12:11 crc kubenswrapper[4732]: I0131 09:12:11.859599 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f40gl664" event={"ID":"2996ae7b-aedb-4a67-a98e-b1a466347be0","Type":"ContainerStarted","Data":"2ea7265b66893cb5dccecbcec37675792afbaf1dad165c06f341da7a7262a932"} Jan 31 09:12:12 crc kubenswrapper[4732]: I0131 09:12:12.871414 4732 generic.go:334] "Generic (PLEG): container finished" podID="2996ae7b-aedb-4a67-a98e-b1a466347be0" containerID="4b3511c94c5ac6d2254541268344c305851e16ace80321bcfccadaad9af571e5" exitCode=0 Jan 31 09:12:12 crc kubenswrapper[4732]: I0131 09:12:12.871462 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f40gl664" event={"ID":"2996ae7b-aedb-4a67-a98e-b1a466347be0","Type":"ContainerDied","Data":"4b3511c94c5ac6d2254541268344c305851e16ace80321bcfccadaad9af571e5"} Jan 31 09:12:13 crc kubenswrapper[4732]: I0131 09:12:13.880091 4732 generic.go:334] "Generic (PLEG): container finished" podID="2996ae7b-aedb-4a67-a98e-b1a466347be0" containerID="d5192da51c40249f5f8c71160cc71f8db81535a29bd24997e64ba0998bfe2e66" exitCode=0 Jan 31 09:12:13 crc kubenswrapper[4732]: I0131 09:12:13.880137 4732 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f40gl664" event={"ID":"2996ae7b-aedb-4a67-a98e-b1a466347be0","Type":"ContainerDied","Data":"d5192da51c40249f5f8c71160cc71f8db81535a29bd24997e64ba0998bfe2e66"} Jan 31 09:12:15 crc kubenswrapper[4732]: I0131 09:12:15.119118 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f40gl664" Jan 31 09:12:15 crc kubenswrapper[4732]: I0131 09:12:15.262632 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2996ae7b-aedb-4a67-a98e-b1a466347be0-bundle\") pod \"2996ae7b-aedb-4a67-a98e-b1a466347be0\" (UID: \"2996ae7b-aedb-4a67-a98e-b1a466347be0\") " Jan 31 09:12:15 crc kubenswrapper[4732]: I0131 09:12:15.262810 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-25s6h\" (UniqueName: \"kubernetes.io/projected/2996ae7b-aedb-4a67-a98e-b1a466347be0-kube-api-access-25s6h\") pod \"2996ae7b-aedb-4a67-a98e-b1a466347be0\" (UID: \"2996ae7b-aedb-4a67-a98e-b1a466347be0\") " Jan 31 09:12:15 crc kubenswrapper[4732]: I0131 09:12:15.262896 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2996ae7b-aedb-4a67-a98e-b1a466347be0-util\") pod \"2996ae7b-aedb-4a67-a98e-b1a466347be0\" (UID: \"2996ae7b-aedb-4a67-a98e-b1a466347be0\") " Jan 31 09:12:15 crc kubenswrapper[4732]: I0131 09:12:15.263742 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2996ae7b-aedb-4a67-a98e-b1a466347be0-bundle" (OuterVolumeSpecName: "bundle") pod "2996ae7b-aedb-4a67-a98e-b1a466347be0" (UID: "2996ae7b-aedb-4a67-a98e-b1a466347be0"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:12:15 crc kubenswrapper[4732]: I0131 09:12:15.268524 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2996ae7b-aedb-4a67-a98e-b1a466347be0-kube-api-access-25s6h" (OuterVolumeSpecName: "kube-api-access-25s6h") pod "2996ae7b-aedb-4a67-a98e-b1a466347be0" (UID: "2996ae7b-aedb-4a67-a98e-b1a466347be0"). InnerVolumeSpecName "kube-api-access-25s6h". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:12:15 crc kubenswrapper[4732]: I0131 09:12:15.279923 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2996ae7b-aedb-4a67-a98e-b1a466347be0-util" (OuterVolumeSpecName: "util") pod "2996ae7b-aedb-4a67-a98e-b1a466347be0" (UID: "2996ae7b-aedb-4a67-a98e-b1a466347be0"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:12:15 crc kubenswrapper[4732]: I0131 09:12:15.364189 4732 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2996ae7b-aedb-4a67-a98e-b1a466347be0-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:12:15 crc kubenswrapper[4732]: I0131 09:12:15.364219 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-25s6h\" (UniqueName: \"kubernetes.io/projected/2996ae7b-aedb-4a67-a98e-b1a466347be0-kube-api-access-25s6h\") on node \"crc\" DevicePath \"\"" Jan 31 09:12:15 crc kubenswrapper[4732]: I0131 09:12:15.364227 4732 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2996ae7b-aedb-4a67-a98e-b1a466347be0-util\") on node \"crc\" DevicePath \"\"" Jan 31 09:12:15 crc kubenswrapper[4732]: I0131 09:12:15.893484 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f40gl664" event={"ID":"2996ae7b-aedb-4a67-a98e-b1a466347be0","Type":"ContainerDied","Data":"2ea7265b66893cb5dccecbcec37675792afbaf1dad165c06f341da7a7262a932"} Jan 31 09:12:15 crc kubenswrapper[4732]: I0131 09:12:15.893539 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ea7265b66893cb5dccecbcec37675792afbaf1dad165c06f341da7a7262a932" Jan 31 09:12:15 crc kubenswrapper[4732]: I0131 09:12:15.893555 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f40gl664" Jan 31 09:12:23 crc kubenswrapper[4732]: I0131 09:12:23.992744 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-665d897fbd-rcqsq"] Jan 31 09:12:23 crc kubenswrapper[4732]: E0131 09:12:23.993890 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2996ae7b-aedb-4a67-a98e-b1a466347be0" containerName="util" Jan 31 09:12:23 crc kubenswrapper[4732]: I0131 09:12:23.993921 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="2996ae7b-aedb-4a67-a98e-b1a466347be0" containerName="util" Jan 31 09:12:23 crc kubenswrapper[4732]: E0131 09:12:23.993946 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2996ae7b-aedb-4a67-a98e-b1a466347be0" containerName="extract" Jan 31 09:12:23 crc kubenswrapper[4732]: I0131 09:12:23.993963 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="2996ae7b-aedb-4a67-a98e-b1a466347be0" containerName="extract" Jan 31 09:12:23 crc kubenswrapper[4732]: E0131 09:12:23.994030 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2996ae7b-aedb-4a67-a98e-b1a466347be0" containerName="pull" Jan 31 09:12:23 crc kubenswrapper[4732]: I0131 09:12:23.994053 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="2996ae7b-aedb-4a67-a98e-b1a466347be0" containerName="pull" Jan 31 09:12:23 crc kubenswrapper[4732]: I0131 09:12:23.994321 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="2996ae7b-aedb-4a67-a98e-b1a466347be0" containerName="extract" Jan 31 09:12:23 crc kubenswrapper[4732]: I0131 09:12:23.995255 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-665d897fbd-rcqsq" Jan 31 09:12:23 crc kubenswrapper[4732]: I0131 09:12:23.997325 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Jan 31 09:12:23 crc kubenswrapper[4732]: I0131 09:12:23.997577 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-vzqj8" Jan 31 09:12:24 crc kubenswrapper[4732]: I0131 09:12:24.000247 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-service-cert" Jan 31 09:12:24 crc kubenswrapper[4732]: I0131 09:12:24.001532 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-665d897fbd-rcqsq"] Jan 31 09:12:24 crc kubenswrapper[4732]: I0131 09:12:24.178550 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/088a3743-a071-4b0e-9cd8-66271eaeafdb-webhook-cert\") pod \"mariadb-operator-controller-manager-665d897fbd-rcqsq\" (UID: \"088a3743-a071-4b0e-9cd8-66271eaeafdb\") " pod="openstack-operators/mariadb-operator-controller-manager-665d897fbd-rcqsq" Jan 31 09:12:24 crc kubenswrapper[4732]: I0131 09:12:24.178989 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jb9w\" (UniqueName: \"kubernetes.io/projected/088a3743-a071-4b0e-9cd8-66271eaeafdb-kube-api-access-9jb9w\") pod \"mariadb-operator-controller-manager-665d897fbd-rcqsq\" (UID: \"088a3743-a071-4b0e-9cd8-66271eaeafdb\") " pod="openstack-operators/mariadb-operator-controller-manager-665d897fbd-rcqsq" Jan 31 09:12:24 crc kubenswrapper[4732]: I0131 09:12:24.179242 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/088a3743-a071-4b0e-9cd8-66271eaeafdb-apiservice-cert\") pod \"mariadb-operator-controller-manager-665d897fbd-rcqsq\" (UID: \"088a3743-a071-4b0e-9cd8-66271eaeafdb\") " pod="openstack-operators/mariadb-operator-controller-manager-665d897fbd-rcqsq" Jan 31 09:12:24 crc kubenswrapper[4732]: I0131 09:12:24.280032 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jb9w\" (UniqueName: \"kubernetes.io/projected/088a3743-a071-4b0e-9cd8-66271eaeafdb-kube-api-access-9jb9w\") pod \"mariadb-operator-controller-manager-665d897fbd-rcqsq\" (UID: \"088a3743-a071-4b0e-9cd8-66271eaeafdb\") " pod="openstack-operators/mariadb-operator-controller-manager-665d897fbd-rcqsq" Jan 31 09:12:24 crc kubenswrapper[4732]: I0131 09:12:24.280473 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/088a3743-a071-4b0e-9cd8-66271eaeafdb-apiservice-cert\") pod \"mariadb-operator-controller-manager-665d897fbd-rcqsq\" (UID: \"088a3743-a071-4b0e-9cd8-66271eaeafdb\") " pod="openstack-operators/mariadb-operator-controller-manager-665d897fbd-rcqsq" Jan 31 09:12:24 crc kubenswrapper[4732]: I0131 09:12:24.280529 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/088a3743-a071-4b0e-9cd8-66271eaeafdb-webhook-cert\") pod \"mariadb-operator-controller-manager-665d897fbd-rcqsq\" (UID: \"088a3743-a071-4b0e-9cd8-66271eaeafdb\") 
" pod="openstack-operators/mariadb-operator-controller-manager-665d897fbd-rcqsq" Jan 31 09:12:24 crc kubenswrapper[4732]: I0131 09:12:24.286787 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/088a3743-a071-4b0e-9cd8-66271eaeafdb-webhook-cert\") pod \"mariadb-operator-controller-manager-665d897fbd-rcqsq\" (UID: \"088a3743-a071-4b0e-9cd8-66271eaeafdb\") " pod="openstack-operators/mariadb-operator-controller-manager-665d897fbd-rcqsq" Jan 31 09:12:24 crc kubenswrapper[4732]: I0131 09:12:24.287057 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/088a3743-a071-4b0e-9cd8-66271eaeafdb-apiservice-cert\") pod \"mariadb-operator-controller-manager-665d897fbd-rcqsq\" (UID: \"088a3743-a071-4b0e-9cd8-66271eaeafdb\") " pod="openstack-operators/mariadb-operator-controller-manager-665d897fbd-rcqsq" Jan 31 09:12:24 crc kubenswrapper[4732]: I0131 09:12:24.331427 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jb9w\" (UniqueName: \"kubernetes.io/projected/088a3743-a071-4b0e-9cd8-66271eaeafdb-kube-api-access-9jb9w\") pod \"mariadb-operator-controller-manager-665d897fbd-rcqsq\" (UID: \"088a3743-a071-4b0e-9cd8-66271eaeafdb\") " pod="openstack-operators/mariadb-operator-controller-manager-665d897fbd-rcqsq" Jan 31 09:12:24 crc kubenswrapper[4732]: I0131 09:12:24.615153 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-665d897fbd-rcqsq" Jan 31 09:12:25 crc kubenswrapper[4732]: I0131 09:12:25.012361 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-665d897fbd-rcqsq"] Jan 31 09:12:25 crc kubenswrapper[4732]: W0131 09:12:25.021148 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod088a3743_a071_4b0e_9cd8_66271eaeafdb.slice/crio-afc983ab3b957756283c86e705571f21643e9da88c0bed95193b050d0c42bb03 WatchSource:0}: Error finding container afc983ab3b957756283c86e705571f21643e9da88c0bed95193b050d0c42bb03: Status 404 returned error can't find the container with id afc983ab3b957756283c86e705571f21643e9da88c0bed95193b050d0c42bb03 Jan 31 09:12:25 crc kubenswrapper[4732]: I0131 09:12:25.962487 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-665d897fbd-rcqsq" event={"ID":"088a3743-a071-4b0e-9cd8-66271eaeafdb","Type":"ContainerStarted","Data":"afc983ab3b957756283c86e705571f21643e9da88c0bed95193b050d0c42bb03"} Jan 31 09:12:28 crc kubenswrapper[4732]: I0131 09:12:28.980450 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-665d897fbd-rcqsq" event={"ID":"088a3743-a071-4b0e-9cd8-66271eaeafdb","Type":"ContainerStarted","Data":"8d293f0ee732a310f5bc8795228ce580921819546f87301342b14a14ba38ea62"} Jan 31 09:12:28 crc kubenswrapper[4732]: I0131 09:12:28.980892 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-665d897fbd-rcqsq" Jan 31 09:12:29 crc kubenswrapper[4732]: I0131 09:12:29.002378 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-665d897fbd-rcqsq" podStartSLOduration=2.405245742 podStartE2EDuration="6.002353908s" 
podCreationTimestamp="2026-01-31 09:12:23 +0000 UTC" firstStartedPulling="2026-01-31 09:12:25.023328958 +0000 UTC m=+683.329205162" lastFinishedPulling="2026-01-31 09:12:28.620437124 +0000 UTC m=+686.926313328" observedRunningTime="2026-01-31 09:12:28.996854574 +0000 UTC m=+687.302730788" watchObservedRunningTime="2026-01-31 09:12:29.002353908 +0000 UTC m=+687.308230132" Jan 31 09:12:34 crc kubenswrapper[4732]: I0131 09:12:34.619591 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-665d897fbd-rcqsq" Jan 31 09:12:37 crc kubenswrapper[4732]: I0131 09:12:37.042066 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-index-bhhdv"] Jan 31 09:12:37 crc kubenswrapper[4732]: I0131 09:12:37.043120 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-bhhdv" Jan 31 09:12:37 crc kubenswrapper[4732]: I0131 09:12:37.048590 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-index-dockercfg-dxwnj" Jan 31 09:12:37 crc kubenswrapper[4732]: I0131 09:12:37.068741 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jh5dj\" (UniqueName: \"kubernetes.io/projected/5357a934-2f68-4ec3-9ede-f29748dfe8ad-kube-api-access-jh5dj\") pod \"infra-operator-index-bhhdv\" (UID: \"5357a934-2f68-4ec3-9ede-f29748dfe8ad\") " pod="openstack-operators/infra-operator-index-bhhdv" Jan 31 09:12:37 crc kubenswrapper[4732]: I0131 09:12:37.070677 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-bhhdv"] Jan 31 09:12:37 crc kubenswrapper[4732]: I0131 09:12:37.169516 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jh5dj\" (UniqueName: \"kubernetes.io/projected/5357a934-2f68-4ec3-9ede-f29748dfe8ad-kube-api-access-jh5dj\") pod \"infra-operator-index-bhhdv\" (UID: \"5357a934-2f68-4ec3-9ede-f29748dfe8ad\") " pod="openstack-operators/infra-operator-index-bhhdv" Jan 31 09:12:37 crc kubenswrapper[4732]: I0131 09:12:37.223313 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jh5dj\" (UniqueName: \"kubernetes.io/projected/5357a934-2f68-4ec3-9ede-f29748dfe8ad-kube-api-access-jh5dj\") pod \"infra-operator-index-bhhdv\" (UID: \"5357a934-2f68-4ec3-9ede-f29748dfe8ad\") " pod="openstack-operators/infra-operator-index-bhhdv" Jan 31 09:12:37 crc kubenswrapper[4732]: I0131 09:12:37.359705 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-index-bhhdv" Jan 31 09:12:37 crc kubenswrapper[4732]: I0131 09:12:37.605417 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-bhhdv"] Jan 31 09:12:38 crc kubenswrapper[4732]: I0131 09:12:38.048209 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-bhhdv" event={"ID":"5357a934-2f68-4ec3-9ede-f29748dfe8ad","Type":"ContainerStarted","Data":"ff9087630cdd2cde2bf4883162d6628bdcdde4567cbcf522fa76d6568b74b558"} Jan 31 09:12:39 crc kubenswrapper[4732]: I0131 09:12:39.056286 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-bhhdv" event={"ID":"5357a934-2f68-4ec3-9ede-f29748dfe8ad","Type":"ContainerStarted","Data":"f2a48331f1b98db1f366e16fe90fc6e8caf17a9947d5cdc7bafb09547d877d65"} Jan 31 09:12:39 crc kubenswrapper[4732]: I0131 09:12:39.076728 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-index-bhhdv" podStartSLOduration=1.145731429 podStartE2EDuration="2.076698833s" podCreationTimestamp="2026-01-31 09:12:37 +0000 UTC" firstStartedPulling="2026-01-31 09:12:37.608932826 +0000 UTC m=+695.914809020" lastFinishedPulling="2026-01-31 09:12:38.53990022 +0000 UTC m=+696.845776424" observedRunningTime="2026-01-31 09:12:39.068776183 +0000 UTC m=+697.374652427" watchObservedRunningTime="2026-01-31 09:12:39.076698833 +0000 UTC m=+697.382575077" Jan 31 09:12:40 crc kubenswrapper[4732]: I0131 09:12:40.014104 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-index-bhhdv"] Jan 31 09:12:40 crc kubenswrapper[4732]: I0131 09:12:40.611732 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-index-lbfxz"] Jan 31 09:12:40 crc kubenswrapper[4732]: I0131 09:12:40.612338 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-index-lbfxz" Jan 31 09:12:40 crc kubenswrapper[4732]: I0131 09:12:40.623289 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-lbfxz"] Jan 31 09:12:40 crc kubenswrapper[4732]: I0131 09:12:40.737202 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgj7k\" (UniqueName: \"kubernetes.io/projected/0a4e6dfd-8108-4ccb-8f6b-0d81bb1adebf-kube-api-access-jgj7k\") pod \"infra-operator-index-lbfxz\" (UID: \"0a4e6dfd-8108-4ccb-8f6b-0d81bb1adebf\") " pod="openstack-operators/infra-operator-index-lbfxz" Jan 31 09:12:40 crc kubenswrapper[4732]: I0131 09:12:40.838601 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgj7k\" (UniqueName: \"kubernetes.io/projected/0a4e6dfd-8108-4ccb-8f6b-0d81bb1adebf-kube-api-access-jgj7k\") pod \"infra-operator-index-lbfxz\" (UID: \"0a4e6dfd-8108-4ccb-8f6b-0d81bb1adebf\") " pod="openstack-operators/infra-operator-index-lbfxz" Jan 31 09:12:40 crc kubenswrapper[4732]: I0131 09:12:40.867118 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgj7k\" (UniqueName: \"kubernetes.io/projected/0a4e6dfd-8108-4ccb-8f6b-0d81bb1adebf-kube-api-access-jgj7k\") pod \"infra-operator-index-lbfxz\" (UID: \"0a4e6dfd-8108-4ccb-8f6b-0d81bb1adebf\") " pod="openstack-operators/infra-operator-index-lbfxz" Jan 31 09:12:40 crc kubenswrapper[4732]: I0131 09:12:40.927308 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-lbfxz" Jan 31 09:12:41 crc kubenswrapper[4732]: I0131 09:12:41.068328 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/infra-operator-index-bhhdv" podUID="5357a934-2f68-4ec3-9ede-f29748dfe8ad" containerName="registry-server" containerID="cri-o://f2a48331f1b98db1f366e16fe90fc6e8caf17a9947d5cdc7bafb09547d877d65" gracePeriod=2 Jan 31 09:12:41 crc kubenswrapper[4732]: I0131 09:12:41.153367 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-lbfxz"] Jan 31 09:12:41 crc kubenswrapper[4732]: I0131 09:12:41.374684 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-bhhdv" Jan 31 09:12:41 crc kubenswrapper[4732]: I0131 09:12:41.453278 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jh5dj\" (UniqueName: \"kubernetes.io/projected/5357a934-2f68-4ec3-9ede-f29748dfe8ad-kube-api-access-jh5dj\") pod \"5357a934-2f68-4ec3-9ede-f29748dfe8ad\" (UID: \"5357a934-2f68-4ec3-9ede-f29748dfe8ad\") " Jan 31 09:12:41 crc kubenswrapper[4732]: I0131 09:12:41.458162 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5357a934-2f68-4ec3-9ede-f29748dfe8ad-kube-api-access-jh5dj" (OuterVolumeSpecName: "kube-api-access-jh5dj") pod "5357a934-2f68-4ec3-9ede-f29748dfe8ad" (UID: "5357a934-2f68-4ec3-9ede-f29748dfe8ad"). InnerVolumeSpecName "kube-api-access-jh5dj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:12:41 crc kubenswrapper[4732]: I0131 09:12:41.554356 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jh5dj\" (UniqueName: \"kubernetes.io/projected/5357a934-2f68-4ec3-9ede-f29748dfe8ad-kube-api-access-jh5dj\") on node \"crc\" DevicePath \"\"" Jan 31 09:12:42 crc kubenswrapper[4732]: I0131 09:12:42.078009 4732 generic.go:334] "Generic (PLEG): container finished" podID="5357a934-2f68-4ec3-9ede-f29748dfe8ad" containerID="f2a48331f1b98db1f366e16fe90fc6e8caf17a9947d5cdc7bafb09547d877d65" exitCode=0 Jan 31 09:12:42 crc kubenswrapper[4732]: I0131 09:12:42.078087 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-bhhdv" event={"ID":"5357a934-2f68-4ec3-9ede-f29748dfe8ad","Type":"ContainerDied","Data":"f2a48331f1b98db1f366e16fe90fc6e8caf17a9947d5cdc7bafb09547d877d65"} Jan 31 09:12:42 crc kubenswrapper[4732]: I0131 09:12:42.078086 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-bhhdv" Jan 31 09:12:42 crc kubenswrapper[4732]: I0131 09:12:42.078118 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-bhhdv" event={"ID":"5357a934-2f68-4ec3-9ede-f29748dfe8ad","Type":"ContainerDied","Data":"ff9087630cdd2cde2bf4883162d6628bdcdde4567cbcf522fa76d6568b74b558"} Jan 31 09:12:42 crc kubenswrapper[4732]: I0131 09:12:42.078140 4732 scope.go:117] "RemoveContainer" containerID="f2a48331f1b98db1f366e16fe90fc6e8caf17a9947d5cdc7bafb09547d877d65" Jan 31 09:12:42 crc kubenswrapper[4732]: I0131 09:12:42.095997 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-lbfxz" event={"ID":"0a4e6dfd-8108-4ccb-8f6b-0d81bb1adebf","Type":"ContainerStarted","Data":"910c1830295490a33aa5bee975620416b3ee2fad21058cc6bdfc3765079fd37f"} Jan 31 09:12:42 crc kubenswrapper[4732]: I0131 09:12:42.096080 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-lbfxz" event={"ID":"0a4e6dfd-8108-4ccb-8f6b-0d81bb1adebf","Type":"ContainerStarted","Data":"28d499c0e0c3ca32837e0d3a623958320049a25bd1e23f61b0a33ef2f6ce6116"} Jan 31 09:12:42 crc kubenswrapper[4732]: I0131 09:12:42.113387 4732 scope.go:117] "RemoveContainer" containerID="f2a48331f1b98db1f366e16fe90fc6e8caf17a9947d5cdc7bafb09547d877d65" Jan 31 09:12:42 crc kubenswrapper[4732]: E0131 09:12:42.114274 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2a48331f1b98db1f366e16fe90fc6e8caf17a9947d5cdc7bafb09547d877d65\": container with ID starting with f2a48331f1b98db1f366e16fe90fc6e8caf17a9947d5cdc7bafb09547d877d65 not found: ID does not exist" containerID="f2a48331f1b98db1f366e16fe90fc6e8caf17a9947d5cdc7bafb09547d877d65" Jan 31 09:12:42 crc kubenswrapper[4732]: I0131 09:12:42.114350 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2a48331f1b98db1f366e16fe90fc6e8caf17a9947d5cdc7bafb09547d877d65"} err="failed to get container status \"f2a48331f1b98db1f366e16fe90fc6e8caf17a9947d5cdc7bafb09547d877d65\": rpc error: code = NotFound desc = could not find container \"f2a48331f1b98db1f366e16fe90fc6e8caf17a9947d5cdc7bafb09547d877d65\": container with ID starting with f2a48331f1b98db1f366e16fe90fc6e8caf17a9947d5cdc7bafb09547d877d65 not found: ID does not exist" Jan 31 09:12:42 crc kubenswrapper[4732]: I0131 
09:12:42.124745 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-index-lbfxz" podStartSLOduration=1.728920207 podStartE2EDuration="2.124729169s" podCreationTimestamp="2026-01-31 09:12:40 +0000 UTC" firstStartedPulling="2026-01-31 09:12:41.166597315 +0000 UTC m=+699.472473519" lastFinishedPulling="2026-01-31 09:12:41.562406277 +0000 UTC m=+699.868282481" observedRunningTime="2026-01-31 09:12:42.120292188 +0000 UTC m=+700.426168392" watchObservedRunningTime="2026-01-31 09:12:42.124729169 +0000 UTC m=+700.430605373" Jan 31 09:12:42 crc kubenswrapper[4732]: I0131 09:12:42.140057 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-index-bhhdv"] Jan 31 09:12:42 crc kubenswrapper[4732]: I0131 09:12:42.144275 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/infra-operator-index-bhhdv"] Jan 31 09:12:42 crc kubenswrapper[4732]: I0131 09:12:42.552763 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5357a934-2f68-4ec3-9ede-f29748dfe8ad" path="/var/lib/kubelet/pods/5357a934-2f68-4ec3-9ede-f29748dfe8ad/volumes" Jan 31 09:12:50 crc kubenswrapper[4732]: I0131 09:12:50.928311 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-index-lbfxz" Jan 31 09:12:50 crc kubenswrapper[4732]: I0131 09:12:50.929276 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/infra-operator-index-lbfxz" Jan 31 09:12:50 crc kubenswrapper[4732]: I0131 09:12:50.973859 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/infra-operator-index-lbfxz" Jan 31 09:12:51 crc kubenswrapper[4732]: I0131 09:12:51.194413 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-index-lbfxz" Jan 31 09:12:52 crc kubenswrapper[4732]: I0131 09:12:52.473395 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576bchhg"] Jan 31 09:12:52 crc kubenswrapper[4732]: E0131 09:12:52.473635 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5357a934-2f68-4ec3-9ede-f29748dfe8ad" containerName="registry-server" Jan 31 09:12:52 crc kubenswrapper[4732]: I0131 09:12:52.473650 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="5357a934-2f68-4ec3-9ede-f29748dfe8ad" containerName="registry-server" Jan 31 09:12:52 crc kubenswrapper[4732]: I0131 09:12:52.473822 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="5357a934-2f68-4ec3-9ede-f29748dfe8ad" containerName="registry-server" Jan 31 09:12:52 crc kubenswrapper[4732]: I0131 09:12:52.474755 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576bchhg" Jan 31 09:12:52 crc kubenswrapper[4732]: I0131 09:12:52.477058 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-tnztr" Jan 31 09:12:52 crc kubenswrapper[4732]: I0131 09:12:52.497605 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576bchhg"] Jan 31 09:12:52 crc kubenswrapper[4732]: I0131 09:12:52.613557 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f5ae42fb-86af-4a2d-9570-b3be9f3f8f4a-bundle\") pod \"d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576bchhg\" (UID: \"f5ae42fb-86af-4a2d-9570-b3be9f3f8f4a\") " pod="openstack-operators/d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576bchhg" Jan 31 09:12:52 crc kubenswrapper[4732]: I0131 09:12:52.613643 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f5ae42fb-86af-4a2d-9570-b3be9f3f8f4a-util\") pod \"d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576bchhg\" (UID: \"f5ae42fb-86af-4a2d-9570-b3be9f3f8f4a\") " pod="openstack-operators/d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576bchhg" Jan 31 09:12:52 crc kubenswrapper[4732]: I0131 09:12:52.613700 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6d5k6\" (UniqueName: \"kubernetes.io/projected/f5ae42fb-86af-4a2d-9570-b3be9f3f8f4a-kube-api-access-6d5k6\") pod \"d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576bchhg\" (UID: \"f5ae42fb-86af-4a2d-9570-b3be9f3f8f4a\") " pod="openstack-operators/d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576bchhg" Jan 31 09:12:52 crc kubenswrapper[4732]: I0131 09:12:52.714861 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f5ae42fb-86af-4a2d-9570-b3be9f3f8f4a-bundle\") pod \"d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576bchhg\" (UID: \"f5ae42fb-86af-4a2d-9570-b3be9f3f8f4a\") " pod="openstack-operators/d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576bchhg" Jan 31 09:12:52 crc kubenswrapper[4732]: I0131 09:12:52.714929 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f5ae42fb-86af-4a2d-9570-b3be9f3f8f4a-util\") pod \"d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576bchhg\" (UID: \"f5ae42fb-86af-4a2d-9570-b3be9f3f8f4a\") " pod="openstack-operators/d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576bchhg" Jan 31 09:12:52 crc kubenswrapper[4732]: I0131 09:12:52.714954 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6d5k6\" (UniqueName: \"kubernetes.io/projected/f5ae42fb-86af-4a2d-9570-b3be9f3f8f4a-kube-api-access-6d5k6\") pod \"d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576bchhg\" (UID: \"f5ae42fb-86af-4a2d-9570-b3be9f3f8f4a\") " pod="openstack-operators/d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576bchhg" Jan 31 09:12:52 crc kubenswrapper[4732]: I0131 09:12:52.715891 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/f5ae42fb-86af-4a2d-9570-b3be9f3f8f4a-bundle\") pod \"d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576bchhg\" (UID: \"f5ae42fb-86af-4a2d-9570-b3be9f3f8f4a\") " pod="openstack-operators/d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576bchhg" Jan 31 09:12:52 crc kubenswrapper[4732]: I0131 09:12:52.715975 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f5ae42fb-86af-4a2d-9570-b3be9f3f8f4a-util\") pod \"d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576bchhg\" (UID: \"f5ae42fb-86af-4a2d-9570-b3be9f3f8f4a\") " pod="openstack-operators/d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576bchhg" Jan 31 09:12:52 crc kubenswrapper[4732]: I0131 09:12:52.737015 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6d5k6\" (UniqueName: \"kubernetes.io/projected/f5ae42fb-86af-4a2d-9570-b3be9f3f8f4a-kube-api-access-6d5k6\") pod \"d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576bchhg\" (UID: \"f5ae42fb-86af-4a2d-9570-b3be9f3f8f4a\") " pod="openstack-operators/d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576bchhg" Jan 31 09:12:52 crc kubenswrapper[4732]: I0131 09:12:52.804016 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576bchhg" Jan 31 09:12:53 crc kubenswrapper[4732]: I0131 09:12:53.223096 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576bchhg"] Jan 31 09:12:53 crc kubenswrapper[4732]: W0131 09:12:53.229883 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5ae42fb_86af_4a2d_9570_b3be9f3f8f4a.slice/crio-ef05ff4b2356807e27538526c9ae1fe636dcc9b9ff7b169b5df1c7510b87eaa5 WatchSource:0}: Error finding container ef05ff4b2356807e27538526c9ae1fe636dcc9b9ff7b169b5df1c7510b87eaa5: Status 404 returned error can't find the container with id ef05ff4b2356807e27538526c9ae1fe636dcc9b9ff7b169b5df1c7510b87eaa5 Jan 31 09:12:54 crc kubenswrapper[4732]: I0131 09:12:54.184007 4732 generic.go:334] "Generic (PLEG): container finished" podID="f5ae42fb-86af-4a2d-9570-b3be9f3f8f4a" containerID="05dc77d92e7403bddf9c4216d3f9e9927975c83db475a4592b0fe670adbee115" exitCode=0 Jan 31 09:12:54 crc kubenswrapper[4732]: I0131 09:12:54.184173 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576bchhg" event={"ID":"f5ae42fb-86af-4a2d-9570-b3be9f3f8f4a","Type":"ContainerDied","Data":"05dc77d92e7403bddf9c4216d3f9e9927975c83db475a4592b0fe670adbee115"} Jan 31 09:12:54 crc kubenswrapper[4732]: I0131 09:12:54.184330 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576bchhg" event={"ID":"f5ae42fb-86af-4a2d-9570-b3be9f3f8f4a","Type":"ContainerStarted","Data":"ef05ff4b2356807e27538526c9ae1fe636dcc9b9ff7b169b5df1c7510b87eaa5"} Jan 31 09:12:55 crc kubenswrapper[4732]: I0131 09:12:55.194389 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576bchhg" event={"ID":"f5ae42fb-86af-4a2d-9570-b3be9f3f8f4a","Type":"ContainerStarted","Data":"6e1beb997853403436be77394dbddcd82093917a762178f04e455efdc50a1f83"} Jan 31 09:12:56 crc kubenswrapper[4732]: 
I0131 09:12:56.205334 4732 generic.go:334] "Generic (PLEG): container finished" podID="f5ae42fb-86af-4a2d-9570-b3be9f3f8f4a" containerID="6e1beb997853403436be77394dbddcd82093917a762178f04e455efdc50a1f83" exitCode=0 Jan 31 09:12:56 crc kubenswrapper[4732]: I0131 09:12:56.205388 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576bchhg" event={"ID":"f5ae42fb-86af-4a2d-9570-b3be9f3f8f4a","Type":"ContainerDied","Data":"6e1beb997853403436be77394dbddcd82093917a762178f04e455efdc50a1f83"} Jan 31 09:12:57 crc kubenswrapper[4732]: I0131 09:12:57.217708 4732 generic.go:334] "Generic (PLEG): container finished" podID="f5ae42fb-86af-4a2d-9570-b3be9f3f8f4a" containerID="a890dd33402e340e3c5622818af3229b97a815440f243204f47c951187c50dbe" exitCode=0 Jan 31 09:12:57 crc kubenswrapper[4732]: I0131 09:12:57.217828 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576bchhg" event={"ID":"f5ae42fb-86af-4a2d-9570-b3be9f3f8f4a","Type":"ContainerDied","Data":"a890dd33402e340e3c5622818af3229b97a815440f243204f47c951187c50dbe"} Jan 31 09:12:58 crc kubenswrapper[4732]: I0131 09:12:58.493545 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576bchhg" Jan 31 09:12:58 crc kubenswrapper[4732]: I0131 09:12:58.604087 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f5ae42fb-86af-4a2d-9570-b3be9f3f8f4a-util\") pod \"f5ae42fb-86af-4a2d-9570-b3be9f3f8f4a\" (UID: \"f5ae42fb-86af-4a2d-9570-b3be9f3f8f4a\") " Jan 31 09:12:58 crc kubenswrapper[4732]: I0131 09:12:58.604173 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6d5k6\" (UniqueName: \"kubernetes.io/projected/f5ae42fb-86af-4a2d-9570-b3be9f3f8f4a-kube-api-access-6d5k6\") pod \"f5ae42fb-86af-4a2d-9570-b3be9f3f8f4a\" (UID: \"f5ae42fb-86af-4a2d-9570-b3be9f3f8f4a\") " Jan 31 09:12:58 crc kubenswrapper[4732]: I0131 09:12:58.604238 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f5ae42fb-86af-4a2d-9570-b3be9f3f8f4a-bundle\") pod \"f5ae42fb-86af-4a2d-9570-b3be9f3f8f4a\" (UID: \"f5ae42fb-86af-4a2d-9570-b3be9f3f8f4a\") " Jan 31 09:12:58 crc kubenswrapper[4732]: I0131 09:12:58.606503 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5ae42fb-86af-4a2d-9570-b3be9f3f8f4a-bundle" (OuterVolumeSpecName: "bundle") pod "f5ae42fb-86af-4a2d-9570-b3be9f3f8f4a" (UID: "f5ae42fb-86af-4a2d-9570-b3be9f3f8f4a"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:12:58 crc kubenswrapper[4732]: I0131 09:12:58.618757 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5ae42fb-86af-4a2d-9570-b3be9f3f8f4a-util" (OuterVolumeSpecName: "util") pod "f5ae42fb-86af-4a2d-9570-b3be9f3f8f4a" (UID: "f5ae42fb-86af-4a2d-9570-b3be9f3f8f4a"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:12:58 crc kubenswrapper[4732]: I0131 09:12:58.706196 4732 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f5ae42fb-86af-4a2d-9570-b3be9f3f8f4a-util\") on node \"crc\" DevicePath \"\"" Jan 31 09:12:58 crc kubenswrapper[4732]: I0131 09:12:58.706233 4732 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f5ae42fb-86af-4a2d-9570-b3be9f3f8f4a-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:12:59 crc kubenswrapper[4732]: I0131 09:12:59.233524 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576bchhg" event={"ID":"f5ae42fb-86af-4a2d-9570-b3be9f3f8f4a","Type":"ContainerDied","Data":"ef05ff4b2356807e27538526c9ae1fe636dcc9b9ff7b169b5df1c7510b87eaa5"} Jan 31 09:12:59 crc kubenswrapper[4732]: I0131 09:12:59.233582 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ef05ff4b2356807e27538526c9ae1fe636dcc9b9ff7b169b5df1c7510b87eaa5" Jan 31 09:12:59 crc kubenswrapper[4732]: I0131 09:12:59.233697 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576bchhg" Jan 31 09:12:59 crc kubenswrapper[4732]: I0131 09:12:59.630197 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5ae42fb-86af-4a2d-9570-b3be9f3f8f4a-kube-api-access-6d5k6" (OuterVolumeSpecName: "kube-api-access-6d5k6") pod "f5ae42fb-86af-4a2d-9570-b3be9f3f8f4a" (UID: "f5ae42fb-86af-4a2d-9570-b3be9f3f8f4a"). InnerVolumeSpecName "kube-api-access-6d5k6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:12:59 crc kubenswrapper[4732]: I0131 09:12:59.718896 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6d5k6\" (UniqueName: \"kubernetes.io/projected/f5ae42fb-86af-4a2d-9570-b3be9f3f8f4a-kube-api-access-6d5k6\") on node \"crc\" DevicePath \"\"" Jan 31 09:13:10 crc kubenswrapper[4732]: I0131 09:13:10.328247 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/openstack-galera-0"] Jan 31 09:13:10 crc kubenswrapper[4732]: E0131 09:13:10.329163 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5ae42fb-86af-4a2d-9570-b3be9f3f8f4a" containerName="util" Jan 31 09:13:10 crc kubenswrapper[4732]: I0131 09:13:10.329180 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5ae42fb-86af-4a2d-9570-b3be9f3f8f4a" containerName="util" Jan 31 09:13:10 crc kubenswrapper[4732]: E0131 09:13:10.329203 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5ae42fb-86af-4a2d-9570-b3be9f3f8f4a" containerName="pull" Jan 31 09:13:10 crc kubenswrapper[4732]: I0131 09:13:10.329211 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5ae42fb-86af-4a2d-9570-b3be9f3f8f4a" containerName="pull" Jan 31 09:13:10 crc kubenswrapper[4732]: E0131 09:13:10.329367 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5ae42fb-86af-4a2d-9570-b3be9f3f8f4a" containerName="extract" Jan 31 09:13:10 crc kubenswrapper[4732]: I0131 09:13:10.329380 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5ae42fb-86af-4a2d-9570-b3be9f3f8f4a" containerName="extract" Jan 31 09:13:10 crc kubenswrapper[4732]: I0131 09:13:10.329500 4732 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f5ae42fb-86af-4a2d-9570-b3be9f3f8f4a" containerName="extract" Jan 31 09:13:10 crc kubenswrapper[4732]: I0131 09:13:10.330852 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/openstack-galera-0" Jan 31 09:13:10 crc kubenswrapper[4732]: I0131 09:13:10.336496 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"openstack-scripts" Jan 31 09:13:10 crc kubenswrapper[4732]: I0131 09:13:10.340151 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"kube-root-ca.crt" Jan 31 09:13:10 crc kubenswrapper[4732]: I0131 09:13:10.340472 4732 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"galera-openstack-dockercfg-4btb4" Jan 31 09:13:10 crc kubenswrapper[4732]: I0131 09:13:10.340579 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"openstack-config-data" Jan 31 09:13:10 crc kubenswrapper[4732]: I0131 09:13:10.340691 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"openshift-service-ca.crt" Jan 31 09:13:10 crc kubenswrapper[4732]: I0131 09:13:10.343255 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/openstack-galera-1"] Jan 31 09:13:10 crc kubenswrapper[4732]: I0131 09:13:10.344245 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/openstack-galera-1" Jan 31 09:13:10 crc kubenswrapper[4732]: I0131 09:13:10.349362 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/openstack-galera-0"] Jan 31 09:13:10 crc kubenswrapper[4732]: I0131 09:13:10.364139 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/openstack-galera-2"] Jan 31 09:13:10 crc kubenswrapper[4732]: I0131 09:13:10.365623 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/openstack-galera-2" Jan 31 09:13:10 crc kubenswrapper[4732]: I0131 09:13:10.374355 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/openstack-galera-2"] Jan 31 09:13:10 crc kubenswrapper[4732]: I0131 09:13:10.385639 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/openstack-galera-1"] Jan 31 09:13:10 crc kubenswrapper[4732]: I0131 09:13:10.475801 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnkrx\" (UniqueName: \"kubernetes.io/projected/616eedfe-830a-4ca8-9c42-a2cfd9352312-kube-api-access-jnkrx\") pod \"openstack-galera-0\" (UID: \"616eedfe-830a-4ca8-9c42-a2cfd9352312\") " pod="swift-kuttl-tests/openstack-galera-0" Jan 31 09:13:10 crc kubenswrapper[4732]: I0131 09:13:10.475837 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0682a582-79d6-4286-9a43-e4a258dde73f-config-data-default\") pod \"openstack-galera-1\" (UID: \"0682a582-79d6-4286-9a43-e4a258dde73f\") " pod="swift-kuttl-tests/openstack-galera-1" Jan 31 09:13:10 crc kubenswrapper[4732]: I0131 09:13:10.475862 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f7eb0179-b292-4a09-a07d-3d9bfe7978f3-kolla-config\") pod \"openstack-galera-2\" (UID: \"f7eb0179-b292-4a09-a07d-3d9bfe7978f3\") " pod="swift-kuttl-tests/openstack-galera-2" Jan 31 09:13:10 crc kubenswrapper[4732]: I0131 09:13:10.475881 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0682a582-79d6-4286-9a43-e4a258dde73f-operator-scripts\") pod \"openstack-galera-1\" (UID: \"0682a582-79d6-4286-9a43-e4a258dde73f\") " pod="swift-kuttl-tests/openstack-galera-1" Jan 31 09:13:10 crc kubenswrapper[4732]: I0131 09:13:10.475906 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/616eedfe-830a-4ca8-9c42-a2cfd9352312-kolla-config\") pod \"openstack-galera-0\" (UID: \"616eedfe-830a-4ca8-9c42-a2cfd9352312\") " pod="swift-kuttl-tests/openstack-galera-0" Jan 31 09:13:10 crc kubenswrapper[4732]: I0131 09:13:10.475923 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-2\" (UID: \"f7eb0179-b292-4a09-a07d-3d9bfe7978f3\") " pod="swift-kuttl-tests/openstack-galera-2" Jan 31 09:13:10 crc kubenswrapper[4732]: I0131 09:13:10.476004 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-1\" (UID: \"0682a582-79d6-4286-9a43-e4a258dde73f\") " pod="swift-kuttl-tests/openstack-galera-1" Jan 31 09:13:10 crc kubenswrapper[4732]: I0131 09:13:10.476057 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjncs\" (UniqueName: \"kubernetes.io/projected/f7eb0179-b292-4a09-a07d-3d9bfe7978f3-kube-api-access-gjncs\") pod \"openstack-galera-2\" (UID: \"f7eb0179-b292-4a09-a07d-3d9bfe7978f3\") " pod="swift-kuttl-tests/openstack-galera-2" Jan 31 
09:13:10 crc kubenswrapper[4732]: I0131 09:13:10.476095 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/616eedfe-830a-4ca8-9c42-a2cfd9352312-operator-scripts\") pod \"openstack-galera-0\" (UID: \"616eedfe-830a-4ca8-9c42-a2cfd9352312\") " pod="swift-kuttl-tests/openstack-galera-0" Jan 31 09:13:10 crc kubenswrapper[4732]: I0131 09:13:10.476134 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0682a582-79d6-4286-9a43-e4a258dde73f-config-data-generated\") pod \"openstack-galera-1\" (UID: \"0682a582-79d6-4286-9a43-e4a258dde73f\") " pod="swift-kuttl-tests/openstack-galera-1" Jan 31 09:13:10 crc kubenswrapper[4732]: I0131 09:13:10.476160 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"616eedfe-830a-4ca8-9c42-a2cfd9352312\") " pod="swift-kuttl-tests/openstack-galera-0" Jan 31 09:13:10 crc kubenswrapper[4732]: I0131 09:13:10.476195 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f7eb0179-b292-4a09-a07d-3d9bfe7978f3-config-data-generated\") pod \"openstack-galera-2\" (UID: \"f7eb0179-b292-4a09-a07d-3d9bfe7978f3\") " pod="swift-kuttl-tests/openstack-galera-2" Jan 31 09:13:10 crc kubenswrapper[4732]: I0131 09:13:10.476239 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/616eedfe-830a-4ca8-9c42-a2cfd9352312-config-data-generated\") pod \"openstack-galera-0\" (UID: \"616eedfe-830a-4ca8-9c42-a2cfd9352312\") " pod="swift-kuttl-tests/openstack-galera-0" Jan 31 09:13:10 crc kubenswrapper[4732]: I0131 09:13:10.476288 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xg64\" (UniqueName: \"kubernetes.io/projected/0682a582-79d6-4286-9a43-e4a258dde73f-kube-api-access-9xg64\") pod \"openstack-galera-1\" (UID: \"0682a582-79d6-4286-9a43-e4a258dde73f\") " pod="swift-kuttl-tests/openstack-galera-1" Jan 31 09:13:10 crc kubenswrapper[4732]: I0131 09:13:10.476363 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7eb0179-b292-4a09-a07d-3d9bfe7978f3-operator-scripts\") pod \"openstack-galera-2\" (UID: \"f7eb0179-b292-4a09-a07d-3d9bfe7978f3\") " pod="swift-kuttl-tests/openstack-galera-2" Jan 31 09:13:10 crc kubenswrapper[4732]: I0131 09:13:10.476392 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0682a582-79d6-4286-9a43-e4a258dde73f-kolla-config\") pod \"openstack-galera-1\" (UID: \"0682a582-79d6-4286-9a43-e4a258dde73f\") " pod="swift-kuttl-tests/openstack-galera-1" Jan 31 09:13:10 crc kubenswrapper[4732]: I0131 09:13:10.476446 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/616eedfe-830a-4ca8-9c42-a2cfd9352312-config-data-default\") pod \"openstack-galera-0\" (UID: 
\"616eedfe-830a-4ca8-9c42-a2cfd9352312\") " pod="swift-kuttl-tests/openstack-galera-0" Jan 31 09:13:10 crc kubenswrapper[4732]: I0131 09:13:10.476471 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f7eb0179-b292-4a09-a07d-3d9bfe7978f3-config-data-default\") pod \"openstack-galera-2\" (UID: \"f7eb0179-b292-4a09-a07d-3d9bfe7978f3\") " pod="swift-kuttl-tests/openstack-galera-2" Jan 31 09:13:10 crc kubenswrapper[4732]: I0131 09:13:10.578149 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xg64\" (UniqueName: \"kubernetes.io/projected/0682a582-79d6-4286-9a43-e4a258dde73f-kube-api-access-9xg64\") pod \"openstack-galera-1\" (UID: \"0682a582-79d6-4286-9a43-e4a258dde73f\") " pod="swift-kuttl-tests/openstack-galera-1" Jan 31 09:13:10 crc kubenswrapper[4732]: I0131 09:13:10.578214 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7eb0179-b292-4a09-a07d-3d9bfe7978f3-operator-scripts\") pod \"openstack-galera-2\" (UID: \"f7eb0179-b292-4a09-a07d-3d9bfe7978f3\") " pod="swift-kuttl-tests/openstack-galera-2" Jan 31 09:13:10 crc kubenswrapper[4732]: I0131 09:13:10.578236 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0682a582-79d6-4286-9a43-e4a258dde73f-kolla-config\") pod \"openstack-galera-1\" (UID: \"0682a582-79d6-4286-9a43-e4a258dde73f\") " pod="swift-kuttl-tests/openstack-galera-1" Jan 31 09:13:10 crc kubenswrapper[4732]: I0131 09:13:10.578270 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/616eedfe-830a-4ca8-9c42-a2cfd9352312-config-data-default\") pod \"openstack-galera-0\" (UID: \"616eedfe-830a-4ca8-9c42-a2cfd9352312\") " pod="swift-kuttl-tests/openstack-galera-0" Jan 31 09:13:10 crc kubenswrapper[4732]: I0131 09:13:10.578295 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f7eb0179-b292-4a09-a07d-3d9bfe7978f3-config-data-default\") pod \"openstack-galera-2\" (UID: \"f7eb0179-b292-4a09-a07d-3d9bfe7978f3\") " pod="swift-kuttl-tests/openstack-galera-2" Jan 31 09:13:10 crc kubenswrapper[4732]: I0131 09:13:10.578336 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnkrx\" (UniqueName: \"kubernetes.io/projected/616eedfe-830a-4ca8-9c42-a2cfd9352312-kube-api-access-jnkrx\") pod \"openstack-galera-0\" (UID: \"616eedfe-830a-4ca8-9c42-a2cfd9352312\") " pod="swift-kuttl-tests/openstack-galera-0" Jan 31 09:13:10 crc kubenswrapper[4732]: I0131 09:13:10.578359 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0682a582-79d6-4286-9a43-e4a258dde73f-config-data-default\") pod \"openstack-galera-1\" (UID: \"0682a582-79d6-4286-9a43-e4a258dde73f\") " pod="swift-kuttl-tests/openstack-galera-1" Jan 31 09:13:10 crc kubenswrapper[4732]: I0131 09:13:10.578381 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f7eb0179-b292-4a09-a07d-3d9bfe7978f3-kolla-config\") pod \"openstack-galera-2\" (UID: \"f7eb0179-b292-4a09-a07d-3d9bfe7978f3\") " 
pod="swift-kuttl-tests/openstack-galera-2" Jan 31 09:13:10 crc kubenswrapper[4732]: I0131 09:13:10.578407 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0682a582-79d6-4286-9a43-e4a258dde73f-operator-scripts\") pod \"openstack-galera-1\" (UID: \"0682a582-79d6-4286-9a43-e4a258dde73f\") " pod="swift-kuttl-tests/openstack-galera-1" Jan 31 09:13:10 crc kubenswrapper[4732]: I0131 09:13:10.578439 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/616eedfe-830a-4ca8-9c42-a2cfd9352312-kolla-config\") pod \"openstack-galera-0\" (UID: \"616eedfe-830a-4ca8-9c42-a2cfd9352312\") " pod="swift-kuttl-tests/openstack-galera-0" Jan 31 09:13:10 crc kubenswrapper[4732]: I0131 09:13:10.578464 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-2\" (UID: \"f7eb0179-b292-4a09-a07d-3d9bfe7978f3\") " pod="swift-kuttl-tests/openstack-galera-2" Jan 31 09:13:10 crc kubenswrapper[4732]: I0131 09:13:10.578488 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-1\" (UID: \"0682a582-79d6-4286-9a43-e4a258dde73f\") " pod="swift-kuttl-tests/openstack-galera-1" Jan 31 09:13:10 crc kubenswrapper[4732]: I0131 09:13:10.578509 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjncs\" (UniqueName: \"kubernetes.io/projected/f7eb0179-b292-4a09-a07d-3d9bfe7978f3-kube-api-access-gjncs\") pod \"openstack-galera-2\" (UID: \"f7eb0179-b292-4a09-a07d-3d9bfe7978f3\") " pod="swift-kuttl-tests/openstack-galera-2" Jan 31 09:13:10 crc kubenswrapper[4732]: I0131 09:13:10.578532 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/616eedfe-830a-4ca8-9c42-a2cfd9352312-operator-scripts\") pod \"openstack-galera-0\" (UID: \"616eedfe-830a-4ca8-9c42-a2cfd9352312\") " pod="swift-kuttl-tests/openstack-galera-0" Jan 31 09:13:10 crc kubenswrapper[4732]: I0131 09:13:10.578555 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0682a582-79d6-4286-9a43-e4a258dde73f-config-data-generated\") pod \"openstack-galera-1\" (UID: \"0682a582-79d6-4286-9a43-e4a258dde73f\") " pod="swift-kuttl-tests/openstack-galera-1" Jan 31 09:13:10 crc kubenswrapper[4732]: I0131 09:13:10.578576 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"616eedfe-830a-4ca8-9c42-a2cfd9352312\") " pod="swift-kuttl-tests/openstack-galera-0" Jan 31 09:13:10 crc kubenswrapper[4732]: I0131 09:13:10.578606 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f7eb0179-b292-4a09-a07d-3d9bfe7978f3-config-data-generated\") pod \"openstack-galera-2\" (UID: \"f7eb0179-b292-4a09-a07d-3d9bfe7978f3\") " pod="swift-kuttl-tests/openstack-galera-2" Jan 31 09:13:10 crc kubenswrapper[4732]: I0131 09:13:10.578644 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/616eedfe-830a-4ca8-9c42-a2cfd9352312-config-data-generated\") pod \"openstack-galera-0\" (UID: \"616eedfe-830a-4ca8-9c42-a2cfd9352312\") " pod="swift-kuttl-tests/openstack-galera-0" Jan 31 09:13:10 crc kubenswrapper[4732]: I0131 09:13:10.579108 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0682a582-79d6-4286-9a43-e4a258dde73f-kolla-config\") pod \"openstack-galera-1\" (UID: \"0682a582-79d6-4286-9a43-e4a258dde73f\") " pod="swift-kuttl-tests/openstack-galera-1" Jan 31 09:13:10 crc kubenswrapper[4732]: I0131 09:13:10.579148 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/616eedfe-830a-4ca8-9c42-a2cfd9352312-config-data-generated\") pod \"openstack-galera-0\" (UID: \"616eedfe-830a-4ca8-9c42-a2cfd9352312\") " pod="swift-kuttl-tests/openstack-galera-0" Jan 31 09:13:10 crc kubenswrapper[4732]: I0131 09:13:10.579249 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0682a582-79d6-4286-9a43-e4a258dde73f-config-data-default\") pod \"openstack-galera-1\" (UID: \"0682a582-79d6-4286-9a43-e4a258dde73f\") " pod="swift-kuttl-tests/openstack-galera-1" Jan 31 09:13:10 crc kubenswrapper[4732]: I0131 09:13:10.579449 4732 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-2\" (UID: \"f7eb0179-b292-4a09-a07d-3d9bfe7978f3\") device mount path \"/mnt/openstack/pv01\"" pod="swift-kuttl-tests/openstack-galera-2" Jan 31 09:13:10 crc kubenswrapper[4732]: I0131 09:13:10.579809 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0682a582-79d6-4286-9a43-e4a258dde73f-config-data-generated\") pod \"openstack-galera-1\" (UID: \"0682a582-79d6-4286-9a43-e4a258dde73f\") " pod="swift-kuttl-tests/openstack-galera-1" Jan 31 09:13:10 crc kubenswrapper[4732]: I0131 09:13:10.580243 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/616eedfe-830a-4ca8-9c42-a2cfd9352312-config-data-default\") pod \"openstack-galera-0\" (UID: \"616eedfe-830a-4ca8-9c42-a2cfd9352312\") " pod="swift-kuttl-tests/openstack-galera-0" Jan 31 09:13:10 crc kubenswrapper[4732]: I0131 09:13:10.580273 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/616eedfe-830a-4ca8-9c42-a2cfd9352312-kolla-config\") pod \"openstack-galera-0\" (UID: \"616eedfe-830a-4ca8-9c42-a2cfd9352312\") " pod="swift-kuttl-tests/openstack-galera-0" Jan 31 09:13:10 crc kubenswrapper[4732]: I0131 09:13:10.580395 4732 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"616eedfe-830a-4ca8-9c42-a2cfd9352312\") device mount path \"/mnt/openstack/pv05\"" pod="swift-kuttl-tests/openstack-galera-0" Jan 31 09:13:10 crc kubenswrapper[4732]: I0131 09:13:10.580576 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7eb0179-b292-4a09-a07d-3d9bfe7978f3-operator-scripts\") pod 
\"openstack-galera-2\" (UID: \"f7eb0179-b292-4a09-a07d-3d9bfe7978f3\") " pod="swift-kuttl-tests/openstack-galera-2" Jan 31 09:13:10 crc kubenswrapper[4732]: I0131 09:13:10.580685 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f7eb0179-b292-4a09-a07d-3d9bfe7978f3-kolla-config\") pod \"openstack-galera-2\" (UID: \"f7eb0179-b292-4a09-a07d-3d9bfe7978f3\") " pod="swift-kuttl-tests/openstack-galera-2" Jan 31 09:13:10 crc kubenswrapper[4732]: I0131 09:13:10.580845 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f7eb0179-b292-4a09-a07d-3d9bfe7978f3-config-data-default\") pod \"openstack-galera-2\" (UID: \"f7eb0179-b292-4a09-a07d-3d9bfe7978f3\") " pod="swift-kuttl-tests/openstack-galera-2" Jan 31 09:13:10 crc kubenswrapper[4732]: I0131 09:13:10.581006 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f7eb0179-b292-4a09-a07d-3d9bfe7978f3-config-data-generated\") pod \"openstack-galera-2\" (UID: \"f7eb0179-b292-4a09-a07d-3d9bfe7978f3\") " pod="swift-kuttl-tests/openstack-galera-2" Jan 31 09:13:10 crc kubenswrapper[4732]: I0131 09:13:10.581008 4732 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-1\" (UID: \"0682a582-79d6-4286-9a43-e4a258dde73f\") device mount path \"/mnt/openstack/pv04\"" pod="swift-kuttl-tests/openstack-galera-1" Jan 31 09:13:10 crc kubenswrapper[4732]: I0131 09:13:10.581974 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/616eedfe-830a-4ca8-9c42-a2cfd9352312-operator-scripts\") pod \"openstack-galera-0\" (UID: \"616eedfe-830a-4ca8-9c42-a2cfd9352312\") " pod="swift-kuttl-tests/openstack-galera-0" Jan 31 09:13:10 crc kubenswrapper[4732]: I0131 09:13:10.582867 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0682a582-79d6-4286-9a43-e4a258dde73f-operator-scripts\") pod \"openstack-galera-1\" (UID: \"0682a582-79d6-4286-9a43-e4a258dde73f\") " pod="swift-kuttl-tests/openstack-galera-1" Jan 31 09:13:10 crc kubenswrapper[4732]: I0131 09:13:10.599278 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"616eedfe-830a-4ca8-9c42-a2cfd9352312\") " pod="swift-kuttl-tests/openstack-galera-0" Jan 31 09:13:10 crc kubenswrapper[4732]: I0131 09:13:10.600037 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-2\" (UID: \"f7eb0179-b292-4a09-a07d-3d9bfe7978f3\") " pod="swift-kuttl-tests/openstack-galera-2" Jan 31 09:13:10 crc kubenswrapper[4732]: I0131 09:13:10.606120 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xg64\" (UniqueName: \"kubernetes.io/projected/0682a582-79d6-4286-9a43-e4a258dde73f-kube-api-access-9xg64\") pod \"openstack-galera-1\" (UID: \"0682a582-79d6-4286-9a43-e4a258dde73f\") " pod="swift-kuttl-tests/openstack-galera-1" Jan 31 09:13:10 crc kubenswrapper[4732]: I0131 09:13:10.606171 4732 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-gjncs\" (UniqueName: \"kubernetes.io/projected/f7eb0179-b292-4a09-a07d-3d9bfe7978f3-kube-api-access-gjncs\") pod \"openstack-galera-2\" (UID: \"f7eb0179-b292-4a09-a07d-3d9bfe7978f3\") " pod="swift-kuttl-tests/openstack-galera-2" Jan 31 09:13:10 crc kubenswrapper[4732]: I0131 09:13:10.609401 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-1\" (UID: \"0682a582-79d6-4286-9a43-e4a258dde73f\") " pod="swift-kuttl-tests/openstack-galera-1" Jan 31 09:13:10 crc kubenswrapper[4732]: I0131 09:13:10.610975 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnkrx\" (UniqueName: \"kubernetes.io/projected/616eedfe-830a-4ca8-9c42-a2cfd9352312-kube-api-access-jnkrx\") pod \"openstack-galera-0\" (UID: \"616eedfe-830a-4ca8-9c42-a2cfd9352312\") " pod="swift-kuttl-tests/openstack-galera-0" Jan 31 09:13:10 crc kubenswrapper[4732]: I0131 09:13:10.659755 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/openstack-galera-0" Jan 31 09:13:10 crc kubenswrapper[4732]: I0131 09:13:10.665351 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/openstack-galera-1" Jan 31 09:13:10 crc kubenswrapper[4732]: I0131 09:13:10.685371 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/openstack-galera-2" Jan 31 09:13:10 crc kubenswrapper[4732]: I0131 09:13:10.890567 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-7f868546f6-qkfms"] Jan 31 09:13:10 crc kubenswrapper[4732]: I0131 09:13:10.892072 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7f868546f6-qkfms" Jan 31 09:13:10 crc kubenswrapper[4732]: I0131 09:13:10.895220 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-service-cert" Jan 31 09:13:10 crc kubenswrapper[4732]: I0131 09:13:10.895571 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-8kdwd" Jan 31 09:13:10 crc kubenswrapper[4732]: I0131 09:13:10.916057 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7f868546f6-qkfms"] Jan 31 09:13:11 crc kubenswrapper[4732]: I0131 09:13:11.018997 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vp7fg\" (UniqueName: \"kubernetes.io/projected/415e013e-ff2f-47b6-a17d-c0ba8f80071a-kube-api-access-vp7fg\") pod \"infra-operator-controller-manager-7f868546f6-qkfms\" (UID: \"415e013e-ff2f-47b6-a17d-c0ba8f80071a\") " pod="openstack-operators/infra-operator-controller-manager-7f868546f6-qkfms" Jan 31 09:13:11 crc kubenswrapper[4732]: I0131 09:13:11.019055 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/415e013e-ff2f-47b6-a17d-c0ba8f80071a-webhook-cert\") pod \"infra-operator-controller-manager-7f868546f6-qkfms\" (UID: \"415e013e-ff2f-47b6-a17d-c0ba8f80071a\") " pod="openstack-operators/infra-operator-controller-manager-7f868546f6-qkfms" Jan 31 09:13:11 crc kubenswrapper[4732]: I0131 09:13:11.019087 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/415e013e-ff2f-47b6-a17d-c0ba8f80071a-apiservice-cert\") pod \"infra-operator-controller-manager-7f868546f6-qkfms\" (UID: \"415e013e-ff2f-47b6-a17d-c0ba8f80071a\") " pod="openstack-operators/infra-operator-controller-manager-7f868546f6-qkfms" Jan 31 09:13:11 crc kubenswrapper[4732]: I0131 09:13:11.093391 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/openstack-galera-2"] Jan 31 09:13:11 crc kubenswrapper[4732]: W0131 09:13:11.099478 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf7eb0179_b292_4a09_a07d_3d9bfe7978f3.slice/crio-a51f6b605ece49b9fb83731b6d6e01366d3ebfefee0aa6fe01b529f67148475c WatchSource:0}: Error finding container a51f6b605ece49b9fb83731b6d6e01366d3ebfefee0aa6fe01b529f67148475c: Status 404 returned error can't find the container with id a51f6b605ece49b9fb83731b6d6e01366d3ebfefee0aa6fe01b529f67148475c Jan 31 09:13:11 crc kubenswrapper[4732]: I0131 09:13:11.120756 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/415e013e-ff2f-47b6-a17d-c0ba8f80071a-webhook-cert\") pod \"infra-operator-controller-manager-7f868546f6-qkfms\" (UID: \"415e013e-ff2f-47b6-a17d-c0ba8f80071a\") " pod="openstack-operators/infra-operator-controller-manager-7f868546f6-qkfms" Jan 31 09:13:11 crc kubenswrapper[4732]: I0131 09:13:11.120809 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/415e013e-ff2f-47b6-a17d-c0ba8f80071a-apiservice-cert\") pod \"infra-operator-controller-manager-7f868546f6-qkfms\" (UID: 
\"415e013e-ff2f-47b6-a17d-c0ba8f80071a\") " pod="openstack-operators/infra-operator-controller-manager-7f868546f6-qkfms" Jan 31 09:13:11 crc kubenswrapper[4732]: I0131 09:13:11.120871 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vp7fg\" (UniqueName: \"kubernetes.io/projected/415e013e-ff2f-47b6-a17d-c0ba8f80071a-kube-api-access-vp7fg\") pod \"infra-operator-controller-manager-7f868546f6-qkfms\" (UID: \"415e013e-ff2f-47b6-a17d-c0ba8f80071a\") " pod="openstack-operators/infra-operator-controller-manager-7f868546f6-qkfms" Jan 31 09:13:11 crc kubenswrapper[4732]: I0131 09:13:11.126212 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/415e013e-ff2f-47b6-a17d-c0ba8f80071a-apiservice-cert\") pod \"infra-operator-controller-manager-7f868546f6-qkfms\" (UID: \"415e013e-ff2f-47b6-a17d-c0ba8f80071a\") " pod="openstack-operators/infra-operator-controller-manager-7f868546f6-qkfms" Jan 31 09:13:11 crc kubenswrapper[4732]: I0131 09:13:11.126324 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/415e013e-ff2f-47b6-a17d-c0ba8f80071a-webhook-cert\") pod \"infra-operator-controller-manager-7f868546f6-qkfms\" (UID: \"415e013e-ff2f-47b6-a17d-c0ba8f80071a\") " pod="openstack-operators/infra-operator-controller-manager-7f868546f6-qkfms" Jan 31 09:13:11 crc kubenswrapper[4732]: I0131 09:13:11.138005 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vp7fg\" (UniqueName: \"kubernetes.io/projected/415e013e-ff2f-47b6-a17d-c0ba8f80071a-kube-api-access-vp7fg\") pod \"infra-operator-controller-manager-7f868546f6-qkfms\" (UID: \"415e013e-ff2f-47b6-a17d-c0ba8f80071a\") " pod="openstack-operators/infra-operator-controller-manager-7f868546f6-qkfms" Jan 31 09:13:11 crc kubenswrapper[4732]: I0131 09:13:11.223068 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7f868546f6-qkfms" Jan 31 09:13:11 crc kubenswrapper[4732]: I0131 09:13:11.243800 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/openstack-galera-0"] Jan 31 09:13:11 crc kubenswrapper[4732]: I0131 09:13:11.266215 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/openstack-galera-1"] Jan 31 09:13:11 crc kubenswrapper[4732]: W0131 09:13:11.279610 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0682a582_79d6_4286_9a43_e4a258dde73f.slice/crio-e0837550f45b62d919804bb06cc6ab1d6a363eb752c3707a21645f710833b1bc WatchSource:0}: Error finding container e0837550f45b62d919804bb06cc6ab1d6a363eb752c3707a21645f710833b1bc: Status 404 returned error can't find the container with id e0837550f45b62d919804bb06cc6ab1d6a363eb752c3707a21645f710833b1bc Jan 31 09:13:11 crc kubenswrapper[4732]: I0131 09:13:11.308614 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/openstack-galera-2" event={"ID":"f7eb0179-b292-4a09-a07d-3d9bfe7978f3","Type":"ContainerStarted","Data":"a51f6b605ece49b9fb83731b6d6e01366d3ebfefee0aa6fe01b529f67148475c"} Jan 31 09:13:11 crc kubenswrapper[4732]: I0131 09:13:11.310122 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/openstack-galera-0" event={"ID":"616eedfe-830a-4ca8-9c42-a2cfd9352312","Type":"ContainerStarted","Data":"2da8aa3ed8596e5beb1462be0a364f515a0e7e35f648a1fd6f1e41dd41f084dd"} Jan 31 09:13:11 crc kubenswrapper[4732]: I0131 09:13:11.311260 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/openstack-galera-1" event={"ID":"0682a582-79d6-4286-9a43-e4a258dde73f","Type":"ContainerStarted","Data":"e0837550f45b62d919804bb06cc6ab1d6a363eb752c3707a21645f710833b1bc"} Jan 31 09:13:11 crc kubenswrapper[4732]: I0131 09:13:11.428014 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7f868546f6-qkfms"] Jan 31 09:13:12 crc kubenswrapper[4732]: I0131 09:13:12.319673 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7f868546f6-qkfms" event={"ID":"415e013e-ff2f-47b6-a17d-c0ba8f80071a","Type":"ContainerStarted","Data":"ba8a3b878f56e9627430ea250d8fbd2f4a60f48d4ce391e390485cb7a6931e6c"} Jan 31 09:13:17 crc kubenswrapper[4732]: I0131 09:13:17.497849 4732 patch_prober.go:28] interesting pod/machine-config-daemon-jnbt8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 09:13:17 crc kubenswrapper[4732]: I0131 09:13:17.498422 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" podUID="7d790207-d357-4b47-87bf-5b505e061820" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 09:13:19 crc kubenswrapper[4732]: I0131 09:13:19.367307 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7f868546f6-qkfms" event={"ID":"415e013e-ff2f-47b6-a17d-c0ba8f80071a","Type":"ContainerStarted","Data":"b7c1c08d282d712f9e0a65331b190ff2ebed938312c6f27e081d2971e6619b22"} Jan 31 
09:13:19 crc kubenswrapper[4732]: I0131 09:13:19.368878 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-7f868546f6-qkfms" Jan 31 09:13:19 crc kubenswrapper[4732]: I0131 09:13:19.397581 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-7f868546f6-qkfms" podStartSLOduration=1.721660443 podStartE2EDuration="9.397560247s" podCreationTimestamp="2026-01-31 09:13:10 +0000 UTC" firstStartedPulling="2026-01-31 09:13:11.441190366 +0000 UTC m=+729.747066570" lastFinishedPulling="2026-01-31 09:13:19.11709017 +0000 UTC m=+737.422966374" observedRunningTime="2026-01-31 09:13:19.386062478 +0000 UTC m=+737.691938682" watchObservedRunningTime="2026-01-31 09:13:19.397560247 +0000 UTC m=+737.703436451" Jan 31 09:13:20 crc kubenswrapper[4732]: I0131 09:13:20.373296 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/openstack-galera-1" event={"ID":"0682a582-79d6-4286-9a43-e4a258dde73f","Type":"ContainerStarted","Data":"ce97edc32bdb9fb197af077948d670e3a55f9f7d022545ce9a9c32c0c8222279"} Jan 31 09:13:20 crc kubenswrapper[4732]: I0131 09:13:20.374438 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/openstack-galera-2" event={"ID":"f7eb0179-b292-4a09-a07d-3d9bfe7978f3","Type":"ContainerStarted","Data":"e5b6220dc9c37c2ae0ded221270ed7b6d934a81efa02b526750d6847a367f930"} Jan 31 09:13:20 crc kubenswrapper[4732]: I0131 09:13:20.375411 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/openstack-galera-0" event={"ID":"616eedfe-830a-4ca8-9c42-a2cfd9352312","Type":"ContainerStarted","Data":"492d45e71fac2738ff1d26d6c415b6dfd2b83adc568c28c60ba8fc993992d5c4"} Jan 31 09:13:23 crc kubenswrapper[4732]: E0131 09:13:23.172889 4732 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf7eb0179_b292_4a09_a07d_3d9bfe7978f3.slice/crio-e5b6220dc9c37c2ae0ded221270ed7b6d934a81efa02b526750d6847a367f930.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf7eb0179_b292_4a09_a07d_3d9bfe7978f3.slice/crio-conmon-e5b6220dc9c37c2ae0ded221270ed7b6d934a81efa02b526750d6847a367f930.scope\": RecentStats: unable to find data in memory cache]" Jan 31 09:13:23 crc kubenswrapper[4732]: I0131 09:13:23.400818 4732 generic.go:334] "Generic (PLEG): container finished" podID="f7eb0179-b292-4a09-a07d-3d9bfe7978f3" containerID="e5b6220dc9c37c2ae0ded221270ed7b6d934a81efa02b526750d6847a367f930" exitCode=0 Jan 31 09:13:23 crc kubenswrapper[4732]: I0131 09:13:23.401334 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/openstack-galera-2" event={"ID":"f7eb0179-b292-4a09-a07d-3d9bfe7978f3","Type":"ContainerDied","Data":"e5b6220dc9c37c2ae0ded221270ed7b6d934a81efa02b526750d6847a367f930"} Jan 31 09:13:23 crc kubenswrapper[4732]: I0131 09:13:23.409270 4732 generic.go:334] "Generic (PLEG): container finished" podID="616eedfe-830a-4ca8-9c42-a2cfd9352312" containerID="492d45e71fac2738ff1d26d6c415b6dfd2b83adc568c28c60ba8fc993992d5c4" exitCode=0 Jan 31 09:13:23 crc kubenswrapper[4732]: I0131 09:13:23.409437 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/openstack-galera-0" 
event={"ID":"616eedfe-830a-4ca8-9c42-a2cfd9352312","Type":"ContainerDied","Data":"492d45e71fac2738ff1d26d6c415b6dfd2b83adc568c28c60ba8fc993992d5c4"} Jan 31 09:13:23 crc kubenswrapper[4732]: I0131 09:13:23.417120 4732 generic.go:334] "Generic (PLEG): container finished" podID="0682a582-79d6-4286-9a43-e4a258dde73f" containerID="ce97edc32bdb9fb197af077948d670e3a55f9f7d022545ce9a9c32c0c8222279" exitCode=0 Jan 31 09:13:23 crc kubenswrapper[4732]: I0131 09:13:23.417197 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/openstack-galera-1" event={"ID":"0682a582-79d6-4286-9a43-e4a258dde73f","Type":"ContainerDied","Data":"ce97edc32bdb9fb197af077948d670e3a55f9f7d022545ce9a9c32c0c8222279"} Jan 31 09:13:24 crc kubenswrapper[4732]: I0131 09:13:24.432556 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/openstack-galera-1" event={"ID":"0682a582-79d6-4286-9a43-e4a258dde73f","Type":"ContainerStarted","Data":"ee4de6ba1b0a29ba82881c9260150c9bf01cda625efde0b6a945316636e5517e"} Jan 31 09:13:24 crc kubenswrapper[4732]: I0131 09:13:24.436735 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/openstack-galera-2" event={"ID":"f7eb0179-b292-4a09-a07d-3d9bfe7978f3","Type":"ContainerStarted","Data":"bf4dd15bf1d89a707a1cc67b69c360511a62db3a9d76bcd538b729b470197a50"} Jan 31 09:13:24 crc kubenswrapper[4732]: I0131 09:13:24.443194 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/openstack-galera-0" event={"ID":"616eedfe-830a-4ca8-9c42-a2cfd9352312","Type":"ContainerStarted","Data":"8526f310ead3f5a33ed7787edf3905aaec728d5abbfb09eb99dc036ea2fa4511"} Jan 31 09:13:24 crc kubenswrapper[4732]: I0131 09:13:24.451527 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/openstack-galera-1" podStartSLOduration=7.446527402 podStartE2EDuration="15.451507717s" podCreationTimestamp="2026-01-31 09:13:09 +0000 UTC" firstStartedPulling="2026-01-31 09:13:11.282872424 +0000 UTC m=+729.588748618" lastFinishedPulling="2026-01-31 09:13:19.287852739 +0000 UTC m=+737.593728933" observedRunningTime="2026-01-31 09:13:24.44935788 +0000 UTC m=+742.755234084" watchObservedRunningTime="2026-01-31 09:13:24.451507717 +0000 UTC m=+742.757383921" Jan 31 09:13:24 crc kubenswrapper[4732]: I0131 09:13:24.477902 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/openstack-galera-2" podStartSLOduration=7.25002611 podStartE2EDuration="15.477885148s" podCreationTimestamp="2026-01-31 09:13:09 +0000 UTC" firstStartedPulling="2026-01-31 09:13:11.101560606 +0000 UTC m=+729.407436810" lastFinishedPulling="2026-01-31 09:13:19.329419644 +0000 UTC m=+737.635295848" observedRunningTime="2026-01-31 09:13:24.474608046 +0000 UTC m=+742.780484250" watchObservedRunningTime="2026-01-31 09:13:24.477885148 +0000 UTC m=+742.783761352" Jan 31 09:13:24 crc kubenswrapper[4732]: I0131 09:13:24.494925 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/openstack-galera-0" podStartSLOduration=7.47851855 podStartE2EDuration="15.494909539s" podCreationTimestamp="2026-01-31 09:13:09 +0000 UTC" firstStartedPulling="2026-01-31 09:13:11.261946043 +0000 UTC m=+729.567822247" lastFinishedPulling="2026-01-31 09:13:19.278337032 +0000 UTC m=+737.584213236" observedRunningTime="2026-01-31 09:13:24.49302585 +0000 UTC m=+742.798902054" watchObservedRunningTime="2026-01-31 09:13:24.494909539 +0000 UTC m=+742.800785753" Jan 31 09:13:24 crc kubenswrapper[4732]: 
E0131 09:13:24.577556 4732 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=6255804368796246868, SKID=, AKID=4E:48:BE:F2:C8:02:40:69:34:FD:3B:08:7C:67:11:C4:20:0F:82:02 failed: x509: certificate signed by unknown authority" Jan 31 09:13:24 crc kubenswrapper[4732]: E0131 09:13:24.589705 4732 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=6255804368796246868, SKID=, AKID=4E:48:BE:F2:C8:02:40:69:34:FD:3B:08:7C:67:11:C4:20:0F:82:02 failed: x509: certificate signed by unknown authority" Jan 31 09:13:24 crc kubenswrapper[4732]: E0131 09:13:24.602732 4732 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=6255804368796246868, SKID=, AKID=4E:48:BE:F2:C8:02:40:69:34:FD:3B:08:7C:67:11:C4:20:0F:82:02 failed: x509: certificate signed by unknown authority" Jan 31 09:13:27 crc kubenswrapper[4732]: E0131 09:13:27.680270 4732 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=6255804368796246868, SKID=, AKID=4E:48:BE:F2:C8:02:40:69:34:FD:3B:08:7C:67:11:C4:20:0F:82:02 failed: x509: certificate signed by unknown authority" Jan 31 09:13:27 crc kubenswrapper[4732]: E0131 09:13:27.703570 4732 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=6255804368796246868, SKID=, AKID=4E:48:BE:F2:C8:02:40:69:34:FD:3B:08:7C:67:11:C4:20:0F:82:02 failed: x509: certificate signed by unknown authority" Jan 31 09:13:27 crc kubenswrapper[4732]: E0131 09:13:27.723524 4732 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=6255804368796246868, SKID=, AKID=4E:48:BE:F2:C8:02:40:69:34:FD:3B:08:7C:67:11:C4:20:0F:82:02 failed: x509: certificate signed by unknown authority" Jan 31 09:13:30 crc kubenswrapper[4732]: I0131 09:13:30.660912 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="swift-kuttl-tests/openstack-galera-0" Jan 31 09:13:30 crc kubenswrapper[4732]: I0131 09:13:30.662485 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="swift-kuttl-tests/openstack-galera-0" Jan 31 09:13:30 crc kubenswrapper[4732]: I0131 09:13:30.665767 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="swift-kuttl-tests/openstack-galera-1" Jan 31 09:13:30 crc kubenswrapper[4732]: I0131 09:13:30.665967 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="swift-kuttl-tests/openstack-galera-1" Jan 31 09:13:30 crc kubenswrapper[4732]: I0131 09:13:30.686482 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="swift-kuttl-tests/openstack-galera-2" Jan 31 09:13:30 crc kubenswrapper[4732]: I0131 09:13:30.686851 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="swift-kuttl-tests/openstack-galera-2" Jan 31 09:13:30 crc kubenswrapper[4732]: E0131 09:13:30.767573 4732 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=6255804368796246868, SKID=, AKID=4E:48:BE:F2:C8:02:40:69:34:FD:3B:08:7C:67:11:C4:20:0F:82:02 failed: x509: certificate signed by unknown authority" Jan 31 09:13:30 crc kubenswrapper[4732]: E0131 09:13:30.784296 4732 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=6255804368796246868, SKID=, AKID=4E:48:BE:F2:C8:02:40:69:34:FD:3B:08:7C:67:11:C4:20:0F:82:02 failed: x509: certificate signed by unknown authority" Jan 31 
09:13:30 crc kubenswrapper[4732]: E0131 09:13:30.802327 4732 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=6255804368796246868, SKID=, AKID=4E:48:BE:F2:C8:02:40:69:34:FD:3B:08:7C:67:11:C4:20:0F:82:02 failed: x509: certificate signed by unknown authority" Jan 31 09:13:31 crc kubenswrapper[4732]: I0131 09:13:31.228112 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-7f868546f6-qkfms" Jan 31 09:13:32 crc kubenswrapper[4732]: I0131 09:13:32.084491 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/memcached-0"] Jan 31 09:13:32 crc kubenswrapper[4732]: I0131 09:13:32.085193 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/memcached-0" Jan 31 09:13:32 crc kubenswrapper[4732]: I0131 09:13:32.087217 4732 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"memcached-memcached-dockercfg-grf6g" Jan 31 09:13:32 crc kubenswrapper[4732]: I0131 09:13:32.087377 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"memcached-config-data" Jan 31 09:13:32 crc kubenswrapper[4732]: I0131 09:13:32.096245 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/memcached-0"] Jan 31 09:13:32 crc kubenswrapper[4732]: I0131 09:13:32.240253 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c0d4fa62-a33c-4ab2-a446-697994c1541e-kolla-config\") pod \"memcached-0\" (UID: \"c0d4fa62-a33c-4ab2-a446-697994c1541e\") " pod="swift-kuttl-tests/memcached-0" Jan 31 09:13:32 crc kubenswrapper[4732]: I0131 09:13:32.240381 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvmz6\" (UniqueName: \"kubernetes.io/projected/c0d4fa62-a33c-4ab2-a446-697994c1541e-kube-api-access-rvmz6\") pod \"memcached-0\" (UID: \"c0d4fa62-a33c-4ab2-a446-697994c1541e\") " pod="swift-kuttl-tests/memcached-0" Jan 31 09:13:32 crc kubenswrapper[4732]: I0131 09:13:32.240416 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c0d4fa62-a33c-4ab2-a446-697994c1541e-config-data\") pod \"memcached-0\" (UID: \"c0d4fa62-a33c-4ab2-a446-697994c1541e\") " pod="swift-kuttl-tests/memcached-0" Jan 31 09:13:32 crc kubenswrapper[4732]: I0131 09:13:32.341783 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvmz6\" (UniqueName: \"kubernetes.io/projected/c0d4fa62-a33c-4ab2-a446-697994c1541e-kube-api-access-rvmz6\") pod \"memcached-0\" (UID: \"c0d4fa62-a33c-4ab2-a446-697994c1541e\") " pod="swift-kuttl-tests/memcached-0" Jan 31 09:13:32 crc kubenswrapper[4732]: I0131 09:13:32.341951 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c0d4fa62-a33c-4ab2-a446-697994c1541e-config-data\") pod \"memcached-0\" (UID: \"c0d4fa62-a33c-4ab2-a446-697994c1541e\") " pod="swift-kuttl-tests/memcached-0" Jan 31 09:13:32 crc kubenswrapper[4732]: I0131 09:13:32.342073 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c0d4fa62-a33c-4ab2-a446-697994c1541e-kolla-config\") pod \"memcached-0\" (UID: 
\"c0d4fa62-a33c-4ab2-a446-697994c1541e\") " pod="swift-kuttl-tests/memcached-0" Jan 31 09:13:32 crc kubenswrapper[4732]: I0131 09:13:32.343049 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c0d4fa62-a33c-4ab2-a446-697994c1541e-kolla-config\") pod \"memcached-0\" (UID: \"c0d4fa62-a33c-4ab2-a446-697994c1541e\") " pod="swift-kuttl-tests/memcached-0" Jan 31 09:13:32 crc kubenswrapper[4732]: I0131 09:13:32.343181 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c0d4fa62-a33c-4ab2-a446-697994c1541e-config-data\") pod \"memcached-0\" (UID: \"c0d4fa62-a33c-4ab2-a446-697994c1541e\") " pod="swift-kuttl-tests/memcached-0" Jan 31 09:13:32 crc kubenswrapper[4732]: I0131 09:13:32.365524 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvmz6\" (UniqueName: \"kubernetes.io/projected/c0d4fa62-a33c-4ab2-a446-697994c1541e-kube-api-access-rvmz6\") pod \"memcached-0\" (UID: \"c0d4fa62-a33c-4ab2-a446-697994c1541e\") " pod="swift-kuttl-tests/memcached-0" Jan 31 09:13:32 crc kubenswrapper[4732]: I0131 09:13:32.401677 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/memcached-0" Jan 31 09:13:32 crc kubenswrapper[4732]: I0131 09:13:32.635197 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/memcached-0"] Jan 31 09:13:32 crc kubenswrapper[4732]: I0131 09:13:32.845986 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-lrnjt"] Jan 31 09:13:32 crc kubenswrapper[4732]: I0131 09:13:32.846739 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-lrnjt" Jan 31 09:13:32 crc kubenswrapper[4732]: I0131 09:13:32.849043 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-index-dockercfg-dzl44" Jan 31 09:13:32 crc kubenswrapper[4732]: I0131 09:13:32.860508 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-lrnjt"] Jan 31 09:13:32 crc kubenswrapper[4732]: I0131 09:13:32.951146 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5gcb\" (UniqueName: \"kubernetes.io/projected/908a1c63-d3ff-4714-a13a-788ad05f37f7-kube-api-access-h5gcb\") pod \"rabbitmq-cluster-operator-index-lrnjt\" (UID: \"908a1c63-d3ff-4714-a13a-788ad05f37f7\") " pod="openstack-operators/rabbitmq-cluster-operator-index-lrnjt" Jan 31 09:13:33 crc kubenswrapper[4732]: I0131 09:13:33.052831 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5gcb\" (UniqueName: \"kubernetes.io/projected/908a1c63-d3ff-4714-a13a-788ad05f37f7-kube-api-access-h5gcb\") pod \"rabbitmq-cluster-operator-index-lrnjt\" (UID: \"908a1c63-d3ff-4714-a13a-788ad05f37f7\") " pod="openstack-operators/rabbitmq-cluster-operator-index-lrnjt" Jan 31 09:13:33 crc kubenswrapper[4732]: I0131 09:13:33.069397 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5gcb\" (UniqueName: \"kubernetes.io/projected/908a1c63-d3ff-4714-a13a-788ad05f37f7-kube-api-access-h5gcb\") pod \"rabbitmq-cluster-operator-index-lrnjt\" (UID: \"908a1c63-d3ff-4714-a13a-788ad05f37f7\") " pod="openstack-operators/rabbitmq-cluster-operator-index-lrnjt" Jan 
31 09:13:33 crc kubenswrapper[4732]: I0131 09:13:33.164397 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-lrnjt" Jan 31 09:13:33 crc kubenswrapper[4732]: I0131 09:13:33.521655 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/memcached-0" event={"ID":"c0d4fa62-a33c-4ab2-a446-697994c1541e","Type":"ContainerStarted","Data":"d82037df40928405007453003da8d4a3924f1437dbd107f02f6b845354809fb0"} Jan 31 09:13:33 crc kubenswrapper[4732]: I0131 09:13:33.589200 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-lrnjt"] Jan 31 09:13:33 crc kubenswrapper[4732]: W0131 09:13:33.597534 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod908a1c63_d3ff_4714_a13a_788ad05f37f7.slice/crio-b0f9604746eebb998ff2881e738d249d0d1324f93e1f0daf2a43d6211251d2fd WatchSource:0}: Error finding container b0f9604746eebb998ff2881e738d249d0d1324f93e1f0daf2a43d6211251d2fd: Status 404 returned error can't find the container with id b0f9604746eebb998ff2881e738d249d0d1324f93e1f0daf2a43d6211251d2fd Jan 31 09:13:33 crc kubenswrapper[4732]: E0131 09:13:33.840264 4732 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=6255804368796246868, SKID=, AKID=4E:48:BE:F2:C8:02:40:69:34:FD:3B:08:7C:67:11:C4:20:0F:82:02 failed: x509: certificate signed by unknown authority" Jan 31 09:13:33 crc kubenswrapper[4732]: E0131 09:13:33.852550 4732 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=6255804368796246868, SKID=, AKID=4E:48:BE:F2:C8:02:40:69:34:FD:3B:08:7C:67:11:C4:20:0F:82:02 failed: x509: certificate signed by unknown authority" Jan 31 09:13:33 crc kubenswrapper[4732]: E0131 09:13:33.867532 4732 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=6255804368796246868, SKID=, AKID=4E:48:BE:F2:C8:02:40:69:34:FD:3B:08:7C:67:11:C4:20:0F:82:02 failed: x509: certificate signed by unknown authority" Jan 31 09:13:34 crc kubenswrapper[4732]: I0131 09:13:34.529012 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-lrnjt" event={"ID":"908a1c63-d3ff-4714-a13a-788ad05f37f7","Type":"ContainerStarted","Data":"b0f9604746eebb998ff2881e738d249d0d1324f93e1f0daf2a43d6211251d2fd"} Jan 31 09:13:35 crc kubenswrapper[4732]: I0131 09:13:35.537457 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/memcached-0" event={"ID":"c0d4fa62-a33c-4ab2-a446-697994c1541e","Type":"ContainerStarted","Data":"3bead2528c2ca5acd2f850812b4f5c0dc70486b1bf295a4a4a607b3a99c2b59e"} Jan 31 09:13:35 crc kubenswrapper[4732]: I0131 09:13:35.538924 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="swift-kuttl-tests/memcached-0" Jan 31 09:13:35 crc kubenswrapper[4732]: I0131 09:13:35.561216 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/memcached-0" podStartSLOduration=1.620183739 podStartE2EDuration="3.561090059s" podCreationTimestamp="2026-01-31 09:13:32 +0000 UTC" firstStartedPulling="2026-01-31 09:13:32.639237214 +0000 UTC m=+750.945113418" lastFinishedPulling="2026-01-31 09:13:34.580143524 +0000 UTC m=+752.886019738" observedRunningTime="2026-01-31 09:13:35.557423006 +0000 UTC m=+753.863299220" watchObservedRunningTime="2026-01-31 
09:13:35.561090059 +0000 UTC m=+753.866966273" Jan 31 09:13:36 crc kubenswrapper[4732]: E0131 09:13:36.937928 4732 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=6255804368796246868, SKID=, AKID=4E:48:BE:F2:C8:02:40:69:34:FD:3B:08:7C:67:11:C4:20:0F:82:02 failed: x509: certificate signed by unknown authority" Jan 31 09:13:36 crc kubenswrapper[4732]: E0131 09:13:36.963735 4732 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=6255804368796246868, SKID=, AKID=4E:48:BE:F2:C8:02:40:69:34:FD:3B:08:7C:67:11:C4:20:0F:82:02 failed: x509: certificate signed by unknown authority" Jan 31 09:13:36 crc kubenswrapper[4732]: E0131 09:13:36.988073 4732 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=6255804368796246868, SKID=, AKID=4E:48:BE:F2:C8:02:40:69:34:FD:3B:08:7C:67:11:C4:20:0F:82:02 failed: x509: certificate signed by unknown authority" Jan 31 09:13:37 crc kubenswrapper[4732]: I0131 09:13:37.034148 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-lrnjt"] Jan 31 09:13:37 crc kubenswrapper[4732]: I0131 09:13:37.647582 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-hbtxq"] Jan 31 09:13:37 crc kubenswrapper[4732]: I0131 09:13:37.648865 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-hbtxq" Jan 31 09:13:37 crc kubenswrapper[4732]: I0131 09:13:37.665890 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-hbtxq"] Jan 31 09:13:37 crc kubenswrapper[4732]: I0131 09:13:37.712951 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdnc7\" (UniqueName: \"kubernetes.io/projected/9ff1fd2b-a8cb-4050-9a54-3117be6964ce-kube-api-access-tdnc7\") pod \"rabbitmq-cluster-operator-index-hbtxq\" (UID: \"9ff1fd2b-a8cb-4050-9a54-3117be6964ce\") " pod="openstack-operators/rabbitmq-cluster-operator-index-hbtxq" Jan 31 09:13:37 crc kubenswrapper[4732]: I0131 09:13:37.814093 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdnc7\" (UniqueName: \"kubernetes.io/projected/9ff1fd2b-a8cb-4050-9a54-3117be6964ce-kube-api-access-tdnc7\") pod \"rabbitmq-cluster-operator-index-hbtxq\" (UID: \"9ff1fd2b-a8cb-4050-9a54-3117be6964ce\") " pod="openstack-operators/rabbitmq-cluster-operator-index-hbtxq" Jan 31 09:13:37 crc kubenswrapper[4732]: I0131 09:13:37.831246 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdnc7\" (UniqueName: \"kubernetes.io/projected/9ff1fd2b-a8cb-4050-9a54-3117be6964ce-kube-api-access-tdnc7\") pod \"rabbitmq-cluster-operator-index-hbtxq\" (UID: \"9ff1fd2b-a8cb-4050-9a54-3117be6964ce\") " pod="openstack-operators/rabbitmq-cluster-operator-index-hbtxq" Jan 31 09:13:37 crc kubenswrapper[4732]: I0131 09:13:37.966038 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-hbtxq" Jan 31 09:13:38 crc kubenswrapper[4732]: I0131 09:13:38.480270 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-hbtxq"] Jan 31 09:13:38 crc kubenswrapper[4732]: W0131 09:13:38.483392 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ff1fd2b_a8cb_4050_9a54_3117be6964ce.slice/crio-5f27d35f0d0db33259c32fd8459b7826ea94c3c12989e06661b903a56351ab64 WatchSource:0}: Error finding container 5f27d35f0d0db33259c32fd8459b7826ea94c3c12989e06661b903a56351ab64: Status 404 returned error can't find the container with id 5f27d35f0d0db33259c32fd8459b7826ea94c3c12989e06661b903a56351ab64 Jan 31 09:13:38 crc kubenswrapper[4732]: I0131 09:13:38.562535 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-lrnjt" event={"ID":"908a1c63-d3ff-4714-a13a-788ad05f37f7","Type":"ContainerStarted","Data":"b6942d86c2492a85d9f2a17fb0552e77a713b2c97f4c20d64ac38f0270e022da"} Jan 31 09:13:38 crc kubenswrapper[4732]: I0131 09:13:38.562615 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/rabbitmq-cluster-operator-index-lrnjt" podUID="908a1c63-d3ff-4714-a13a-788ad05f37f7" containerName="registry-server" containerID="cri-o://b6942d86c2492a85d9f2a17fb0552e77a713b2c97f4c20d64ac38f0270e022da" gracePeriod=2 Jan 31 09:13:38 crc kubenswrapper[4732]: I0131 09:13:38.563590 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-hbtxq" event={"ID":"9ff1fd2b-a8cb-4050-9a54-3117be6964ce","Type":"ContainerStarted","Data":"5f27d35f0d0db33259c32fd8459b7826ea94c3c12989e06661b903a56351ab64"} Jan 31 09:13:38 crc kubenswrapper[4732]: I0131 09:13:38.587993 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-index-lrnjt" podStartSLOduration=2.79704732 podStartE2EDuration="6.587970077s" podCreationTimestamp="2026-01-31 09:13:32 +0000 UTC" firstStartedPulling="2026-01-31 09:13:33.601380445 +0000 UTC m=+751.907256649" lastFinishedPulling="2026-01-31 09:13:37.392303202 +0000 UTC m=+755.698179406" observedRunningTime="2026-01-31 09:13:38.586163241 +0000 UTC m=+756.892039455" watchObservedRunningTime="2026-01-31 09:13:38.587970077 +0000 UTC m=+756.893846281" Jan 31 09:13:39 crc kubenswrapper[4732]: I0131 09:13:39.053499 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-lrnjt" Jan 31 09:13:39 crc kubenswrapper[4732]: I0131 09:13:39.136272 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h5gcb\" (UniqueName: \"kubernetes.io/projected/908a1c63-d3ff-4714-a13a-788ad05f37f7-kube-api-access-h5gcb\") pod \"908a1c63-d3ff-4714-a13a-788ad05f37f7\" (UID: \"908a1c63-d3ff-4714-a13a-788ad05f37f7\") " Jan 31 09:13:39 crc kubenswrapper[4732]: I0131 09:13:39.144991 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/908a1c63-d3ff-4714-a13a-788ad05f37f7-kube-api-access-h5gcb" (OuterVolumeSpecName: "kube-api-access-h5gcb") pod "908a1c63-d3ff-4714-a13a-788ad05f37f7" (UID: "908a1c63-d3ff-4714-a13a-788ad05f37f7"). InnerVolumeSpecName "kube-api-access-h5gcb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:13:39 crc kubenswrapper[4732]: I0131 09:13:39.238472 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h5gcb\" (UniqueName: \"kubernetes.io/projected/908a1c63-d3ff-4714-a13a-788ad05f37f7-kube-api-access-h5gcb\") on node \"crc\" DevicePath \"\"" Jan 31 09:13:39 crc kubenswrapper[4732]: I0131 09:13:39.570630 4732 generic.go:334] "Generic (PLEG): container finished" podID="908a1c63-d3ff-4714-a13a-788ad05f37f7" containerID="b6942d86c2492a85d9f2a17fb0552e77a713b2c97f4c20d64ac38f0270e022da" exitCode=0 Jan 31 09:13:39 crc kubenswrapper[4732]: I0131 09:13:39.570698 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-lrnjt" Jan 31 09:13:39 crc kubenswrapper[4732]: I0131 09:13:39.570698 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-lrnjt" event={"ID":"908a1c63-d3ff-4714-a13a-788ad05f37f7","Type":"ContainerDied","Data":"b6942d86c2492a85d9f2a17fb0552e77a713b2c97f4c20d64ac38f0270e022da"} Jan 31 09:13:39 crc kubenswrapper[4732]: I0131 09:13:39.570773 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-lrnjt" event={"ID":"908a1c63-d3ff-4714-a13a-788ad05f37f7","Type":"ContainerDied","Data":"b0f9604746eebb998ff2881e738d249d0d1324f93e1f0daf2a43d6211251d2fd"} Jan 31 09:13:39 crc kubenswrapper[4732]: I0131 09:13:39.570795 4732 scope.go:117] "RemoveContainer" containerID="b6942d86c2492a85d9f2a17fb0552e77a713b2c97f4c20d64ac38f0270e022da" Jan 31 09:13:39 crc kubenswrapper[4732]: I0131 09:13:39.573004 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-hbtxq" event={"ID":"9ff1fd2b-a8cb-4050-9a54-3117be6964ce","Type":"ContainerStarted","Data":"fe9b798d0d4e674d4b68e134ab59cf425b1a6014efda517ffd63f1ccf40b11f0"} Jan 31 09:13:39 crc kubenswrapper[4732]: I0131 09:13:39.599116 4732 scope.go:117] "RemoveContainer" containerID="b6942d86c2492a85d9f2a17fb0552e77a713b2c97f4c20d64ac38f0270e022da" Jan 31 09:13:39 crc kubenswrapper[4732]: I0131 09:13:39.599082 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-index-hbtxq" podStartSLOduration=2.186065468 podStartE2EDuration="2.599061663s" podCreationTimestamp="2026-01-31 09:13:37 +0000 UTC" firstStartedPulling="2026-01-31 09:13:38.487004892 +0000 UTC m=+756.792881096" lastFinishedPulling="2026-01-31 09:13:38.900001087 +0000 UTC m=+757.205877291" observedRunningTime="2026-01-31 09:13:39.598823685 +0000 UTC m=+757.904699889" watchObservedRunningTime="2026-01-31 09:13:39.599061663 +0000 UTC m=+757.904937867" Jan 31 09:13:39 crc kubenswrapper[4732]: E0131 09:13:39.601136 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6942d86c2492a85d9f2a17fb0552e77a713b2c97f4c20d64ac38f0270e022da\": container with ID starting with b6942d86c2492a85d9f2a17fb0552e77a713b2c97f4c20d64ac38f0270e022da not found: ID does not exist" containerID="b6942d86c2492a85d9f2a17fb0552e77a713b2c97f4c20d64ac38f0270e022da" Jan 31 09:13:39 crc kubenswrapper[4732]: I0131 09:13:39.601186 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6942d86c2492a85d9f2a17fb0552e77a713b2c97f4c20d64ac38f0270e022da"} err="failed to get container status 
\"b6942d86c2492a85d9f2a17fb0552e77a713b2c97f4c20d64ac38f0270e022da\": rpc error: code = NotFound desc = could not find container \"b6942d86c2492a85d9f2a17fb0552e77a713b2c97f4c20d64ac38f0270e022da\": container with ID starting with b6942d86c2492a85d9f2a17fb0552e77a713b2c97f4c20d64ac38f0270e022da not found: ID does not exist" Jan 31 09:13:39 crc kubenswrapper[4732]: I0131 09:13:39.613366 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-lrnjt"] Jan 31 09:13:39 crc kubenswrapper[4732]: I0131 09:13:39.622207 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-lrnjt"] Jan 31 09:13:40 crc kubenswrapper[4732]: E0131 09:13:40.025461 4732 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=6255804368796246868, SKID=, AKID=4E:48:BE:F2:C8:02:40:69:34:FD:3B:08:7C:67:11:C4:20:0F:82:02 failed: x509: certificate signed by unknown authority" Jan 31 09:13:40 crc kubenswrapper[4732]: E0131 09:13:40.037161 4732 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=6255804368796246868, SKID=, AKID=4E:48:BE:F2:C8:02:40:69:34:FD:3B:08:7C:67:11:C4:20:0F:82:02 failed: x509: certificate signed by unknown authority" Jan 31 09:13:40 crc kubenswrapper[4732]: E0131 09:13:40.050497 4732 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=6255804368796246868, SKID=, AKID=4E:48:BE:F2:C8:02:40:69:34:FD:3B:08:7C:67:11:C4:20:0F:82:02 failed: x509: certificate signed by unknown authority" Jan 31 09:13:40 crc kubenswrapper[4732]: I0131 09:13:40.550121 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="908a1c63-d3ff-4714-a13a-788ad05f37f7" path="/var/lib/kubelet/pods/908a1c63-d3ff-4714-a13a-788ad05f37f7/volumes" Jan 31 09:13:41 crc kubenswrapper[4732]: I0131 09:13:41.615949 4732 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Jan 31 09:13:42 crc kubenswrapper[4732]: I0131 09:13:42.403820 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="swift-kuttl-tests/memcached-0" Jan 31 09:13:47 crc kubenswrapper[4732]: I0131 09:13:47.497875 4732 patch_prober.go:28] interesting pod/machine-config-daemon-jnbt8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 09:13:47 crc kubenswrapper[4732]: I0131 09:13:47.499420 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" podUID="7d790207-d357-4b47-87bf-5b505e061820" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 09:13:47 crc kubenswrapper[4732]: I0131 09:13:47.967020 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/rabbitmq-cluster-operator-index-hbtxq" Jan 31 09:13:47 crc kubenswrapper[4732]: I0131 09:13:47.967374 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/rabbitmq-cluster-operator-index-hbtxq" Jan 31 09:13:48 crc kubenswrapper[4732]: I0131 09:13:48.009470 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack-operators/rabbitmq-cluster-operator-index-hbtxq" Jan 31 09:13:48 crc kubenswrapper[4732]: I0131 09:13:48.667683 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/rabbitmq-cluster-operator-index-hbtxq" Jan 31 09:13:48 crc kubenswrapper[4732]: I0131 09:13:48.877468 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="swift-kuttl-tests/openstack-galera-2" Jan 31 09:13:48 crc kubenswrapper[4732]: I0131 09:13:48.973502 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="swift-kuttl-tests/openstack-galera-2" Jan 31 09:13:49 crc kubenswrapper[4732]: E0131 09:13:49.132787 4732 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.129.56.231:56272->38.129.56.231:32957: read tcp 38.129.56.231:56272->38.129.56.231:32957: read: connection reset by peer Jan 31 09:13:49 crc kubenswrapper[4732]: I0131 09:13:49.414128 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/root-account-create-update-pjmfd"] Jan 31 09:13:49 crc kubenswrapper[4732]: E0131 09:13:49.414462 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="908a1c63-d3ff-4714-a13a-788ad05f37f7" containerName="registry-server" Jan 31 09:13:49 crc kubenswrapper[4732]: I0131 09:13:49.414482 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="908a1c63-d3ff-4714-a13a-788ad05f37f7" containerName="registry-server" Jan 31 09:13:49 crc kubenswrapper[4732]: I0131 09:13:49.414732 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="908a1c63-d3ff-4714-a13a-788ad05f37f7" containerName="registry-server" Jan 31 09:13:49 crc kubenswrapper[4732]: I0131 09:13:49.415359 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/root-account-create-update-pjmfd" Jan 31 09:13:49 crc kubenswrapper[4732]: I0131 09:13:49.419495 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/root-account-create-update-pjmfd"] Jan 31 09:13:49 crc kubenswrapper[4732]: I0131 09:13:49.420230 4732 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"openstack-mariadb-root-db-secret" Jan 31 09:13:49 crc kubenswrapper[4732]: I0131 09:13:49.503914 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51d8a630-8f89-44aa-9f24-2f1b279cccfd-operator-scripts\") pod \"root-account-create-update-pjmfd\" (UID: \"51d8a630-8f89-44aa-9f24-2f1b279cccfd\") " pod="swift-kuttl-tests/root-account-create-update-pjmfd" Jan 31 09:13:49 crc kubenswrapper[4732]: I0131 09:13:49.504079 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s867q\" (UniqueName: \"kubernetes.io/projected/51d8a630-8f89-44aa-9f24-2f1b279cccfd-kube-api-access-s867q\") pod \"root-account-create-update-pjmfd\" (UID: \"51d8a630-8f89-44aa-9f24-2f1b279cccfd\") " pod="swift-kuttl-tests/root-account-create-update-pjmfd" Jan 31 09:13:49 crc kubenswrapper[4732]: I0131 09:13:49.605580 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s867q\" (UniqueName: \"kubernetes.io/projected/51d8a630-8f89-44aa-9f24-2f1b279cccfd-kube-api-access-s867q\") pod \"root-account-create-update-pjmfd\" (UID: \"51d8a630-8f89-44aa-9f24-2f1b279cccfd\") " pod="swift-kuttl-tests/root-account-create-update-pjmfd" Jan 31 09:13:49 crc kubenswrapper[4732]: I0131 09:13:49.605641 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51d8a630-8f89-44aa-9f24-2f1b279cccfd-operator-scripts\") pod \"root-account-create-update-pjmfd\" (UID: \"51d8a630-8f89-44aa-9f24-2f1b279cccfd\") " pod="swift-kuttl-tests/root-account-create-update-pjmfd" Jan 31 09:13:49 crc kubenswrapper[4732]: I0131 09:13:49.606565 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51d8a630-8f89-44aa-9f24-2f1b279cccfd-operator-scripts\") pod \"root-account-create-update-pjmfd\" (UID: \"51d8a630-8f89-44aa-9f24-2f1b279cccfd\") " pod="swift-kuttl-tests/root-account-create-update-pjmfd" Jan 31 09:13:49 crc kubenswrapper[4732]: I0131 09:13:49.629284 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s867q\" (UniqueName: \"kubernetes.io/projected/51d8a630-8f89-44aa-9f24-2f1b279cccfd-kube-api-access-s867q\") pod \"root-account-create-update-pjmfd\" (UID: \"51d8a630-8f89-44aa-9f24-2f1b279cccfd\") " pod="swift-kuttl-tests/root-account-create-update-pjmfd" Jan 31 09:13:49 crc kubenswrapper[4732]: I0131 09:13:49.737075 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/root-account-create-update-pjmfd" Jan 31 09:13:50 crc kubenswrapper[4732]: I0131 09:13:50.161020 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/root-account-create-update-pjmfd"] Jan 31 09:13:50 crc kubenswrapper[4732]: W0131 09:13:50.170282 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod51d8a630_8f89_44aa_9f24_2f1b279cccfd.slice/crio-5501f4bdd15cc3ad3ae8e942228bb8dad6ce22eba0b56a8c59b70902b46cc1f9 WatchSource:0}: Error finding container 5501f4bdd15cc3ad3ae8e942228bb8dad6ce22eba0b56a8c59b70902b46cc1f9: Status 404 returned error can't find the container with id 5501f4bdd15cc3ad3ae8e942228bb8dad6ce22eba0b56a8c59b70902b46cc1f9 Jan 31 09:13:50 crc kubenswrapper[4732]: I0131 09:13:50.641047 4732 generic.go:334] "Generic (PLEG): container finished" podID="51d8a630-8f89-44aa-9f24-2f1b279cccfd" containerID="9e931ecd0e73f528e7ed41d0a730c73d763f5b49adf745bec9e51e72d6012d62" exitCode=0 Jan 31 09:13:50 crc kubenswrapper[4732]: I0131 09:13:50.641933 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/root-account-create-update-pjmfd" event={"ID":"51d8a630-8f89-44aa-9f24-2f1b279cccfd","Type":"ContainerDied","Data":"9e931ecd0e73f528e7ed41d0a730c73d763f5b49adf745bec9e51e72d6012d62"} Jan 31 09:13:50 crc kubenswrapper[4732]: I0131 09:13:50.641966 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/root-account-create-update-pjmfd" event={"ID":"51d8a630-8f89-44aa-9f24-2f1b279cccfd","Type":"ContainerStarted","Data":"5501f4bdd15cc3ad3ae8e942228bb8dad6ce22eba0b56a8c59b70902b46cc1f9"} Jan 31 09:13:51 crc kubenswrapper[4732]: I0131 09:13:51.928381 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/root-account-create-update-pjmfd" Jan 31 09:13:52 crc kubenswrapper[4732]: I0131 09:13:52.042855 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51d8a630-8f89-44aa-9f24-2f1b279cccfd-operator-scripts\") pod \"51d8a630-8f89-44aa-9f24-2f1b279cccfd\" (UID: \"51d8a630-8f89-44aa-9f24-2f1b279cccfd\") " Jan 31 09:13:52 crc kubenswrapper[4732]: I0131 09:13:52.042950 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s867q\" (UniqueName: \"kubernetes.io/projected/51d8a630-8f89-44aa-9f24-2f1b279cccfd-kube-api-access-s867q\") pod \"51d8a630-8f89-44aa-9f24-2f1b279cccfd\" (UID: \"51d8a630-8f89-44aa-9f24-2f1b279cccfd\") " Jan 31 09:13:52 crc kubenswrapper[4732]: I0131 09:13:52.043577 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51d8a630-8f89-44aa-9f24-2f1b279cccfd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "51d8a630-8f89-44aa-9f24-2f1b279cccfd" (UID: "51d8a630-8f89-44aa-9f24-2f1b279cccfd"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:13:52 crc kubenswrapper[4732]: I0131 09:13:52.049824 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51d8a630-8f89-44aa-9f24-2f1b279cccfd-kube-api-access-s867q" (OuterVolumeSpecName: "kube-api-access-s867q") pod "51d8a630-8f89-44aa-9f24-2f1b279cccfd" (UID: "51d8a630-8f89-44aa-9f24-2f1b279cccfd"). InnerVolumeSpecName "kube-api-access-s867q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:13:52 crc kubenswrapper[4732]: I0131 09:13:52.144665 4732 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51d8a630-8f89-44aa-9f24-2f1b279cccfd-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:13:52 crc kubenswrapper[4732]: I0131 09:13:52.144736 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s867q\" (UniqueName: \"kubernetes.io/projected/51d8a630-8f89-44aa-9f24-2f1b279cccfd-kube-api-access-s867q\") on node \"crc\" DevicePath \"\"" Jan 31 09:13:52 crc kubenswrapper[4732]: I0131 09:13:52.656143 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/root-account-create-update-pjmfd" event={"ID":"51d8a630-8f89-44aa-9f24-2f1b279cccfd","Type":"ContainerDied","Data":"5501f4bdd15cc3ad3ae8e942228bb8dad6ce22eba0b56a8c59b70902b46cc1f9"} Jan 31 09:13:52 crc kubenswrapper[4732]: I0131 09:13:52.656184 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5501f4bdd15cc3ad3ae8e942228bb8dad6ce22eba0b56a8c59b70902b46cc1f9" Jan 31 09:13:52 crc kubenswrapper[4732]: I0131 09:13:52.656548 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/root-account-create-update-pjmfd" Jan 31 09:13:56 crc kubenswrapper[4732]: I0131 09:13:56.305057 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590sncrl"] Jan 31 09:13:56 crc kubenswrapper[4732]: E0131 09:13:56.305922 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51d8a630-8f89-44aa-9f24-2f1b279cccfd" containerName="mariadb-account-create-update" Jan 31 09:13:56 crc kubenswrapper[4732]: I0131 09:13:56.305941 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="51d8a630-8f89-44aa-9f24-2f1b279cccfd" containerName="mariadb-account-create-update" Jan 31 09:13:56 crc kubenswrapper[4732]: I0131 09:13:56.306075 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="51d8a630-8f89-44aa-9f24-2f1b279cccfd" containerName="mariadb-account-create-update" Jan 31 09:13:56 crc kubenswrapper[4732]: I0131 09:13:56.307144 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590sncrl" Jan 31 09:13:56 crc kubenswrapper[4732]: I0131 09:13:56.309787 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-tnztr" Jan 31 09:13:56 crc kubenswrapper[4732]: I0131 09:13:56.315265 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590sncrl"] Jan 31 09:13:56 crc kubenswrapper[4732]: I0131 09:13:56.398344 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/86012593-15ec-4f3c-aaa4-c0522a918019-util\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590sncrl\" (UID: \"86012593-15ec-4f3c-aaa4-c0522a918019\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590sncrl" Jan 31 09:13:56 crc kubenswrapper[4732]: I0131 09:13:56.398416 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/86012593-15ec-4f3c-aaa4-c0522a918019-bundle\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590sncrl\" (UID: \"86012593-15ec-4f3c-aaa4-c0522a918019\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590sncrl" Jan 31 09:13:56 crc kubenswrapper[4732]: I0131 09:13:56.398450 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bq98m\" (UniqueName: \"kubernetes.io/projected/86012593-15ec-4f3c-aaa4-c0522a918019-kube-api-access-bq98m\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590sncrl\" (UID: \"86012593-15ec-4f3c-aaa4-c0522a918019\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590sncrl" Jan 31 09:13:56 crc kubenswrapper[4732]: I0131 09:13:56.500208 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bq98m\" (UniqueName: \"kubernetes.io/projected/86012593-15ec-4f3c-aaa4-c0522a918019-kube-api-access-bq98m\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590sncrl\" (UID: \"86012593-15ec-4f3c-aaa4-c0522a918019\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590sncrl" Jan 31 09:13:56 crc kubenswrapper[4732]: I0131 09:13:56.500347 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/86012593-15ec-4f3c-aaa4-c0522a918019-util\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590sncrl\" (UID: \"86012593-15ec-4f3c-aaa4-c0522a918019\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590sncrl" Jan 31 09:13:56 crc kubenswrapper[4732]: I0131 09:13:56.500885 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/86012593-15ec-4f3c-aaa4-c0522a918019-util\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590sncrl\" (UID: \"86012593-15ec-4f3c-aaa4-c0522a918019\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590sncrl" Jan 31 09:13:56 crc kubenswrapper[4732]: I0131 09:13:56.500969 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/86012593-15ec-4f3c-aaa4-c0522a918019-bundle\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590sncrl\" (UID: \"86012593-15ec-4f3c-aaa4-c0522a918019\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590sncrl" Jan 31 09:13:56 crc kubenswrapper[4732]: I0131 09:13:56.501256 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/86012593-15ec-4f3c-aaa4-c0522a918019-bundle\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590sncrl\" (UID: \"86012593-15ec-4f3c-aaa4-c0522a918019\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590sncrl" Jan 31 09:13:56 crc kubenswrapper[4732]: I0131 09:13:56.523773 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bq98m\" (UniqueName: \"kubernetes.io/projected/86012593-15ec-4f3c-aaa4-c0522a918019-kube-api-access-bq98m\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590sncrl\" (UID: \"86012593-15ec-4f3c-aaa4-c0522a918019\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590sncrl" Jan 31 09:13:56 crc kubenswrapper[4732]: I0131 09:13:56.633422 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590sncrl" Jan 31 09:13:57 crc kubenswrapper[4732]: I0131 09:13:57.121227 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590sncrl"] Jan 31 09:13:57 crc kubenswrapper[4732]: W0131 09:13:57.139161 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86012593_15ec_4f3c_aaa4_c0522a918019.slice/crio-30258f3e489223f658f207a8ef31d72f78818b3b43d7efd1cb289a60181d3e4c WatchSource:0}: Error finding container 30258f3e489223f658f207a8ef31d72f78818b3b43d7efd1cb289a60181d3e4c: Status 404 returned error can't find the container with id 30258f3e489223f658f207a8ef31d72f78818b3b43d7efd1cb289a60181d3e4c Jan 31 09:13:57 crc kubenswrapper[4732]: I0131 09:13:57.696591 4732 generic.go:334] "Generic (PLEG): container finished" podID="86012593-15ec-4f3c-aaa4-c0522a918019" containerID="5797783d92fce2d40396a5e635eb5cba9fcbde2d8555d22fc23d15fbbb6337bb" exitCode=0 Jan 31 09:13:57 crc kubenswrapper[4732]: I0131 09:13:57.696638 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590sncrl" event={"ID":"86012593-15ec-4f3c-aaa4-c0522a918019","Type":"ContainerDied","Data":"5797783d92fce2d40396a5e635eb5cba9fcbde2d8555d22fc23d15fbbb6337bb"} Jan 31 09:13:57 crc kubenswrapper[4732]: I0131 09:13:57.696698 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590sncrl" event={"ID":"86012593-15ec-4f3c-aaa4-c0522a918019","Type":"ContainerStarted","Data":"30258f3e489223f658f207a8ef31d72f78818b3b43d7efd1cb289a60181d3e4c"} Jan 31 09:13:58 crc kubenswrapper[4732]: I0131 09:13:58.708179 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590sncrl" event={"ID":"86012593-15ec-4f3c-aaa4-c0522a918019","Type":"ContainerStarted","Data":"dddea144384fd97086387399ccac7f5e8ab364f8aa2e9d3b1d93f34363e9cf4d"} Jan 31 09:13:59 crc 
kubenswrapper[4732]: I0131 09:13:59.128872 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="swift-kuttl-tests/openstack-galera-0" Jan 31 09:13:59 crc kubenswrapper[4732]: I0131 09:13:59.207540 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="swift-kuttl-tests/openstack-galera-0" Jan 31 09:13:59 crc kubenswrapper[4732]: I0131 09:13:59.720803 4732 generic.go:334] "Generic (PLEG): container finished" podID="86012593-15ec-4f3c-aaa4-c0522a918019" containerID="dddea144384fd97086387399ccac7f5e8ab364f8aa2e9d3b1d93f34363e9cf4d" exitCode=0 Jan 31 09:13:59 crc kubenswrapper[4732]: I0131 09:13:59.720853 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590sncrl" event={"ID":"86012593-15ec-4f3c-aaa4-c0522a918019","Type":"ContainerDied","Data":"dddea144384fd97086387399ccac7f5e8ab364f8aa2e9d3b1d93f34363e9cf4d"} Jan 31 09:14:00 crc kubenswrapper[4732]: I0131 09:14:00.728356 4732 generic.go:334] "Generic (PLEG): container finished" podID="86012593-15ec-4f3c-aaa4-c0522a918019" containerID="87f7881198e2bf8721939abbe73ccd62c3cbc8681018136ac024bf795609a719" exitCode=0 Jan 31 09:14:00 crc kubenswrapper[4732]: I0131 09:14:00.728586 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590sncrl" event={"ID":"86012593-15ec-4f3c-aaa4-c0522a918019","Type":"ContainerDied","Data":"87f7881198e2bf8721939abbe73ccd62c3cbc8681018136ac024bf795609a719"} Jan 31 09:14:02 crc kubenswrapper[4732]: I0131 09:14:02.097020 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590sncrl" Jan 31 09:14:02 crc kubenswrapper[4732]: I0131 09:14:02.181484 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bq98m\" (UniqueName: \"kubernetes.io/projected/86012593-15ec-4f3c-aaa4-c0522a918019-kube-api-access-bq98m\") pod \"86012593-15ec-4f3c-aaa4-c0522a918019\" (UID: \"86012593-15ec-4f3c-aaa4-c0522a918019\") " Jan 31 09:14:02 crc kubenswrapper[4732]: I0131 09:14:02.181545 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/86012593-15ec-4f3c-aaa4-c0522a918019-bundle\") pod \"86012593-15ec-4f3c-aaa4-c0522a918019\" (UID: \"86012593-15ec-4f3c-aaa4-c0522a918019\") " Jan 31 09:14:02 crc kubenswrapper[4732]: I0131 09:14:02.181576 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/86012593-15ec-4f3c-aaa4-c0522a918019-util\") pod \"86012593-15ec-4f3c-aaa4-c0522a918019\" (UID: \"86012593-15ec-4f3c-aaa4-c0522a918019\") " Jan 31 09:14:02 crc kubenswrapper[4732]: I0131 09:14:02.183157 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86012593-15ec-4f3c-aaa4-c0522a918019-bundle" (OuterVolumeSpecName: "bundle") pod "86012593-15ec-4f3c-aaa4-c0522a918019" (UID: "86012593-15ec-4f3c-aaa4-c0522a918019"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:14:02 crc kubenswrapper[4732]: I0131 09:14:02.189950 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86012593-15ec-4f3c-aaa4-c0522a918019-kube-api-access-bq98m" (OuterVolumeSpecName: "kube-api-access-bq98m") pod "86012593-15ec-4f3c-aaa4-c0522a918019" (UID: "86012593-15ec-4f3c-aaa4-c0522a918019"). InnerVolumeSpecName "kube-api-access-bq98m". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:14:02 crc kubenswrapper[4732]: I0131 09:14:02.194140 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86012593-15ec-4f3c-aaa4-c0522a918019-util" (OuterVolumeSpecName: "util") pod "86012593-15ec-4f3c-aaa4-c0522a918019" (UID: "86012593-15ec-4f3c-aaa4-c0522a918019"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:14:02 crc kubenswrapper[4732]: I0131 09:14:02.282976 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bq98m\" (UniqueName: \"kubernetes.io/projected/86012593-15ec-4f3c-aaa4-c0522a918019-kube-api-access-bq98m\") on node \"crc\" DevicePath \"\"" Jan 31 09:14:02 crc kubenswrapper[4732]: I0131 09:14:02.283016 4732 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/86012593-15ec-4f3c-aaa4-c0522a918019-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:14:02 crc kubenswrapper[4732]: I0131 09:14:02.283030 4732 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/86012593-15ec-4f3c-aaa4-c0522a918019-util\") on node \"crc\" DevicePath \"\"" Jan 31 09:14:02 crc kubenswrapper[4732]: I0131 09:14:02.746781 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590sncrl" event={"ID":"86012593-15ec-4f3c-aaa4-c0522a918019","Type":"ContainerDied","Data":"30258f3e489223f658f207a8ef31d72f78818b3b43d7efd1cb289a60181d3e4c"} Jan 31 09:14:02 crc kubenswrapper[4732]: I0131 09:14:02.747188 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="30258f3e489223f658f207a8ef31d72f78818b3b43d7efd1cb289a60181d3e4c" Jan 31 09:14:02 crc kubenswrapper[4732]: I0131 09:14:02.746947 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590sncrl" Jan 31 09:14:03 crc kubenswrapper[4732]: I0131 09:14:03.226562 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="swift-kuttl-tests/openstack-galera-1" Jan 31 09:14:03 crc kubenswrapper[4732]: I0131 09:14:03.286609 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="swift-kuttl-tests/openstack-galera-1" Jan 31 09:14:09 crc kubenswrapper[4732]: I0131 09:14:09.196830 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-2t954"] Jan 31 09:14:09 crc kubenswrapper[4732]: E0131 09:14:09.197440 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86012593-15ec-4f3c-aaa4-c0522a918019" containerName="util" Jan 31 09:14:09 crc kubenswrapper[4732]: I0131 09:14:09.197452 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="86012593-15ec-4f3c-aaa4-c0522a918019" containerName="util" Jan 31 09:14:09 crc kubenswrapper[4732]: E0131 09:14:09.197469 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86012593-15ec-4f3c-aaa4-c0522a918019" containerName="extract" Jan 31 09:14:09 crc kubenswrapper[4732]: I0131 09:14:09.197475 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="86012593-15ec-4f3c-aaa4-c0522a918019" containerName="extract" Jan 31 09:14:09 crc kubenswrapper[4732]: E0131 09:14:09.197488 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86012593-15ec-4f3c-aaa4-c0522a918019" containerName="pull" Jan 31 09:14:09 crc kubenswrapper[4732]: I0131 09:14:09.197495 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="86012593-15ec-4f3c-aaa4-c0522a918019" containerName="pull" Jan 31 09:14:09 crc kubenswrapper[4732]: I0131 09:14:09.197597 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="86012593-15ec-4f3c-aaa4-c0522a918019" containerName="extract" Jan 31 09:14:09 crc kubenswrapper[4732]: I0131 09:14:09.198028 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-2t954" Jan 31 09:14:09 crc kubenswrapper[4732]: I0131 09:14:09.200287 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-dockercfg-6ch9n" Jan 31 09:14:09 crc kubenswrapper[4732]: I0131 09:14:09.241537 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-2t954"] Jan 31 09:14:09 crc kubenswrapper[4732]: I0131 09:14:09.273861 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztcb8\" (UniqueName: \"kubernetes.io/projected/79621d02-e834-4725-8b80-d0444f3b6487-kube-api-access-ztcb8\") pod \"rabbitmq-cluster-operator-779fc9694b-2t954\" (UID: \"79621d02-e834-4725-8b80-d0444f3b6487\") " pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-2t954" Jan 31 09:14:09 crc kubenswrapper[4732]: I0131 09:14:09.375813 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztcb8\" (UniqueName: \"kubernetes.io/projected/79621d02-e834-4725-8b80-d0444f3b6487-kube-api-access-ztcb8\") pod \"rabbitmq-cluster-operator-779fc9694b-2t954\" (UID: \"79621d02-e834-4725-8b80-d0444f3b6487\") " pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-2t954" Jan 31 09:14:09 crc kubenswrapper[4732]: I0131 09:14:09.408269 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztcb8\" (UniqueName: \"kubernetes.io/projected/79621d02-e834-4725-8b80-d0444f3b6487-kube-api-access-ztcb8\") pod \"rabbitmq-cluster-operator-779fc9694b-2t954\" (UID: \"79621d02-e834-4725-8b80-d0444f3b6487\") " pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-2t954" Jan 31 09:14:09 crc kubenswrapper[4732]: I0131 09:14:09.518467 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-2t954" Jan 31 09:14:09 crc kubenswrapper[4732]: I0131 09:14:09.978694 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-2t954"] Jan 31 09:14:10 crc kubenswrapper[4732]: I0131 09:14:10.797397 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-2t954" event={"ID":"79621d02-e834-4725-8b80-d0444f3b6487","Type":"ContainerStarted","Data":"2cb30d5ff86683ad564f64402e0e4a2144f56764496f3cb2ead909cfbd0f5de4"} Jan 31 09:14:13 crc kubenswrapper[4732]: I0131 09:14:13.820881 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-2t954" event={"ID":"79621d02-e834-4725-8b80-d0444f3b6487","Type":"ContainerStarted","Data":"288dedb19f0678112de5555bcdb82232ae9f8593518e9e520e8756a6802abe9b"} Jan 31 09:14:13 crc kubenswrapper[4732]: I0131 09:14:13.844377 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-2t954" podStartSLOduration=2.072218535 podStartE2EDuration="4.844362088s" podCreationTimestamp="2026-01-31 09:14:09 +0000 UTC" firstStartedPulling="2026-01-31 09:14:09.99436349 +0000 UTC m=+788.300239704" lastFinishedPulling="2026-01-31 09:14:12.766507033 +0000 UTC m=+791.072383257" observedRunningTime="2026-01-31 09:14:13.843930655 +0000 UTC m=+792.149806859" watchObservedRunningTime="2026-01-31 09:14:13.844362088 +0000 UTC m=+792.150238292" Jan 31 09:14:17 crc kubenswrapper[4732]: I0131 09:14:17.498009 4732 patch_prober.go:28] interesting pod/machine-config-daemon-jnbt8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 09:14:17 crc kubenswrapper[4732]: I0131 09:14:17.498515 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" podUID="7d790207-d357-4b47-87bf-5b505e061820" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 09:14:17 crc kubenswrapper[4732]: I0131 09:14:17.498602 4732 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" Jan 31 09:14:17 crc kubenswrapper[4732]: I0131 09:14:17.499470 4732 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e8d3fd1eb561cfd678a2a0df1de54d984c12dc8e05f74e816b693d4b18b74a20"} pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 31 09:14:17 crc kubenswrapper[4732]: I0131 09:14:17.499592 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" podUID="7d790207-d357-4b47-87bf-5b505e061820" containerName="machine-config-daemon" containerID="cri-o://e8d3fd1eb561cfd678a2a0df1de54d984c12dc8e05f74e816b693d4b18b74a20" gracePeriod=600 Jan 31 09:14:17 crc kubenswrapper[4732]: I0131 09:14:17.847052 4732 generic.go:334] "Generic (PLEG): container finished" podID="7d790207-d357-4b47-87bf-5b505e061820" 
containerID="e8d3fd1eb561cfd678a2a0df1de54d984c12dc8e05f74e816b693d4b18b74a20" exitCode=0 Jan 31 09:14:17 crc kubenswrapper[4732]: I0131 09:14:17.847138 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" event={"ID":"7d790207-d357-4b47-87bf-5b505e061820","Type":"ContainerDied","Data":"e8d3fd1eb561cfd678a2a0df1de54d984c12dc8e05f74e816b693d4b18b74a20"} Jan 31 09:14:17 crc kubenswrapper[4732]: I0131 09:14:17.847383 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" event={"ID":"7d790207-d357-4b47-87bf-5b505e061820","Type":"ContainerStarted","Data":"7aa4af54c816ede2288939b5eacffccc23edb9afc7e2a36ef42fe01d52b4ae91"} Jan 31 09:14:17 crc kubenswrapper[4732]: I0131 09:14:17.847413 4732 scope.go:117] "RemoveContainer" containerID="942a11834ff55816d19ec94b72706370701e25dcee37029bb97b73b2e3078f9b" Jan 31 09:14:20 crc kubenswrapper[4732]: I0131 09:14:20.402129 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/rabbitmq-server-0"] Jan 31 09:14:20 crc kubenswrapper[4732]: I0131 09:14:20.404060 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/rabbitmq-server-0" Jan 31 09:14:20 crc kubenswrapper[4732]: I0131 09:14:20.407445 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"rabbitmq-plugins-conf" Jan 31 09:14:20 crc kubenswrapper[4732]: I0131 09:14:20.407491 4732 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"rabbitmq-server-dockercfg-x8pqd" Jan 31 09:14:20 crc kubenswrapper[4732]: I0131 09:14:20.407543 4732 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"rabbitmq-default-user" Jan 31 09:14:20 crc kubenswrapper[4732]: I0131 09:14:20.407645 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"rabbitmq-server-conf" Jan 31 09:14:20 crc kubenswrapper[4732]: I0131 09:14:20.407894 4732 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"rabbitmq-erlang-cookie" Jan 31 09:14:20 crc kubenswrapper[4732]: I0131 09:14:20.408829 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/rabbitmq-server-0"] Jan 31 09:14:20 crc kubenswrapper[4732]: I0131 09:14:20.518935 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-017be409-2d02-48b0-bb67-3403111bd6b9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-017be409-2d02-48b0-bb67-3403111bd6b9\") pod \"rabbitmq-server-0\" (UID: \"dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd\") " pod="swift-kuttl-tests/rabbitmq-server-0" Jan 31 09:14:20 crc kubenswrapper[4732]: I0131 09:14:20.518986 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd\") " pod="swift-kuttl-tests/rabbitmq-server-0" Jan 31 09:14:20 crc kubenswrapper[4732]: I0131 09:14:20.519004 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd-pod-info\") pod \"rabbitmq-server-0\" (UID: \"dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd\") " pod="swift-kuttl-tests/rabbitmq-server-0" Jan 31 09:14:20 
crc kubenswrapper[4732]: I0131 09:14:20.519031 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxmtn\" (UniqueName: \"kubernetes.io/projected/dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd-kube-api-access-xxmtn\") pod \"rabbitmq-server-0\" (UID: \"dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd\") " pod="swift-kuttl-tests/rabbitmq-server-0" Jan 31 09:14:20 crc kubenswrapper[4732]: I0131 09:14:20.519905 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd\") " pod="swift-kuttl-tests/rabbitmq-server-0" Jan 31 09:14:20 crc kubenswrapper[4732]: I0131 09:14:20.520419 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd\") " pod="swift-kuttl-tests/rabbitmq-server-0" Jan 31 09:14:20 crc kubenswrapper[4732]: I0131 09:14:20.520636 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd\") " pod="swift-kuttl-tests/rabbitmq-server-0" Jan 31 09:14:20 crc kubenswrapper[4732]: I0131 09:14:20.520928 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd\") " pod="swift-kuttl-tests/rabbitmq-server-0" Jan 31 09:14:20 crc kubenswrapper[4732]: I0131 09:14:20.621936 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd\") " pod="swift-kuttl-tests/rabbitmq-server-0" Jan 31 09:14:20 crc kubenswrapper[4732]: I0131 09:14:20.621994 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd\") " pod="swift-kuttl-tests/rabbitmq-server-0" Jan 31 09:14:20 crc kubenswrapper[4732]: I0131 09:14:20.622042 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd\") " pod="swift-kuttl-tests/rabbitmq-server-0" Jan 31 09:14:20 crc kubenswrapper[4732]: I0131 09:14:20.622121 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-017be409-2d02-48b0-bb67-3403111bd6b9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-017be409-2d02-48b0-bb67-3403111bd6b9\") pod \"rabbitmq-server-0\" (UID: \"dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd\") " 
pod="swift-kuttl-tests/rabbitmq-server-0" Jan 31 09:14:20 crc kubenswrapper[4732]: I0131 09:14:20.622158 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd\") " pod="swift-kuttl-tests/rabbitmq-server-0" Jan 31 09:14:20 crc kubenswrapper[4732]: I0131 09:14:20.622195 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd-pod-info\") pod \"rabbitmq-server-0\" (UID: \"dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd\") " pod="swift-kuttl-tests/rabbitmq-server-0" Jan 31 09:14:20 crc kubenswrapper[4732]: I0131 09:14:20.622236 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxmtn\" (UniqueName: \"kubernetes.io/projected/dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd-kube-api-access-xxmtn\") pod \"rabbitmq-server-0\" (UID: \"dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd\") " pod="swift-kuttl-tests/rabbitmq-server-0" Jan 31 09:14:20 crc kubenswrapper[4732]: I0131 09:14:20.622259 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd\") " pod="swift-kuttl-tests/rabbitmq-server-0" Jan 31 09:14:20 crc kubenswrapper[4732]: I0131 09:14:20.623187 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd\") " pod="swift-kuttl-tests/rabbitmq-server-0" Jan 31 09:14:20 crc kubenswrapper[4732]: I0131 09:14:20.623337 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd\") " pod="swift-kuttl-tests/rabbitmq-server-0" Jan 31 09:14:20 crc kubenswrapper[4732]: I0131 09:14:20.624321 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd\") " pod="swift-kuttl-tests/rabbitmq-server-0" Jan 31 09:14:20 crc kubenswrapper[4732]: I0131 09:14:20.628459 4732 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 31 09:14:20 crc kubenswrapper[4732]: I0131 09:14:20.628509 4732 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-017be409-2d02-48b0-bb67-3403111bd6b9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-017be409-2d02-48b0-bb67-3403111bd6b9\") pod \"rabbitmq-server-0\" (UID: \"dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/b8d3f8071f0106d74010c769abe69563bad433a0e8012e7bb99dbaf8960d3d0a/globalmount\"" pod="swift-kuttl-tests/rabbitmq-server-0" Jan 31 09:14:20 crc kubenswrapper[4732]: I0131 09:14:20.629525 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd-pod-info\") pod \"rabbitmq-server-0\" (UID: \"dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd\") " pod="swift-kuttl-tests/rabbitmq-server-0" Jan 31 09:14:20 crc kubenswrapper[4732]: I0131 09:14:20.629590 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd\") " pod="swift-kuttl-tests/rabbitmq-server-0" Jan 31 09:14:20 crc kubenswrapper[4732]: I0131 09:14:20.635599 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd\") " pod="swift-kuttl-tests/rabbitmq-server-0" Jan 31 09:14:20 crc kubenswrapper[4732]: I0131 09:14:20.645827 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxmtn\" (UniqueName: \"kubernetes.io/projected/dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd-kube-api-access-xxmtn\") pod \"rabbitmq-server-0\" (UID: \"dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd\") " pod="swift-kuttl-tests/rabbitmq-server-0" Jan 31 09:14:20 crc kubenswrapper[4732]: I0131 09:14:20.654398 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-017be409-2d02-48b0-bb67-3403111bd6b9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-017be409-2d02-48b0-bb67-3403111bd6b9\") pod \"rabbitmq-server-0\" (UID: \"dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd\") " pod="swift-kuttl-tests/rabbitmq-server-0" Jan 31 09:14:20 crc kubenswrapper[4732]: I0131 09:14:20.721153 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/rabbitmq-server-0" Jan 31 09:14:20 crc kubenswrapper[4732]: I0131 09:14:20.925390 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/rabbitmq-server-0"] Jan 31 09:14:20 crc kubenswrapper[4732]: W0131 09:14:20.948041 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc71a61f_ccf2_43bb_aedb_71f2ec9f03bd.slice/crio-c0069b5700dea7f3177d96246d005bd742c680b2404c8813992998fda3388237 WatchSource:0}: Error finding container c0069b5700dea7f3177d96246d005bd742c680b2404c8813992998fda3388237: Status 404 returned error can't find the container with id c0069b5700dea7f3177d96246d005bd742c680b2404c8813992998fda3388237 Jan 31 09:14:21 crc kubenswrapper[4732]: I0131 09:14:21.444273 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-index-hbpcb"] Jan 31 09:14:21 crc kubenswrapper[4732]: I0131 09:14:21.444997 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-index-hbpcb" Jan 31 09:14:21 crc kubenswrapper[4732]: I0131 09:14:21.448395 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-index-dockercfg-lsl78" Jan 31 09:14:21 crc kubenswrapper[4732]: I0131 09:14:21.462705 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-index-hbpcb"] Jan 31 09:14:21 crc kubenswrapper[4732]: I0131 09:14:21.533594 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnrxr\" (UniqueName: \"kubernetes.io/projected/e617e130-c338-40d1-9a5c-83e925a4e6ed-kube-api-access-cnrxr\") pod \"keystone-operator-index-hbpcb\" (UID: \"e617e130-c338-40d1-9a5c-83e925a4e6ed\") " pod="openstack-operators/keystone-operator-index-hbpcb" Jan 31 09:14:21 crc kubenswrapper[4732]: I0131 09:14:21.634403 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cnrxr\" (UniqueName: \"kubernetes.io/projected/e617e130-c338-40d1-9a5c-83e925a4e6ed-kube-api-access-cnrxr\") pod \"keystone-operator-index-hbpcb\" (UID: \"e617e130-c338-40d1-9a5c-83e925a4e6ed\") " pod="openstack-operators/keystone-operator-index-hbpcb" Jan 31 09:14:21 crc kubenswrapper[4732]: I0131 09:14:21.658714 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnrxr\" (UniqueName: \"kubernetes.io/projected/e617e130-c338-40d1-9a5c-83e925a4e6ed-kube-api-access-cnrxr\") pod \"keystone-operator-index-hbpcb\" (UID: \"e617e130-c338-40d1-9a5c-83e925a4e6ed\") " pod="openstack-operators/keystone-operator-index-hbpcb" Jan 31 09:14:21 crc kubenswrapper[4732]: I0131 09:14:21.769748 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-index-hbpcb" Jan 31 09:14:21 crc kubenswrapper[4732]: I0131 09:14:21.879937 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/rabbitmq-server-0" event={"ID":"dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd","Type":"ContainerStarted","Data":"c0069b5700dea7f3177d96246d005bd742c680b2404c8813992998fda3388237"} Jan 31 09:14:21 crc kubenswrapper[4732]: I0131 09:14:21.971814 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-index-hbpcb"] Jan 31 09:14:22 crc kubenswrapper[4732]: I0131 09:14:22.901061 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-hbpcb" event={"ID":"e617e130-c338-40d1-9a5c-83e925a4e6ed","Type":"ContainerStarted","Data":"6b9559657e742c2dca3e80a6f30ed1e3ae9bae7e31be4ed1e6ca772141139c64"} Jan 31 09:14:26 crc kubenswrapper[4732]: I0131 09:14:26.930212 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-hbpcb" event={"ID":"e617e130-c338-40d1-9a5c-83e925a4e6ed","Type":"ContainerStarted","Data":"c4ac04f58cff4deda6adf78a5af0c30957b4f3ca033ed38c187c5a81f58e6b7d"} Jan 31 09:14:26 crc kubenswrapper[4732]: I0131 09:14:26.952503 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-index-hbpcb" podStartSLOduration=2.27136689 podStartE2EDuration="5.952483926s" podCreationTimestamp="2026-01-31 09:14:21 +0000 UTC" firstStartedPulling="2026-01-31 09:14:21.986829917 +0000 UTC m=+800.292706121" lastFinishedPulling="2026-01-31 09:14:25.667946953 +0000 UTC m=+803.973823157" observedRunningTime="2026-01-31 09:14:26.950775924 +0000 UTC m=+805.256652128" watchObservedRunningTime="2026-01-31 09:14:26.952483926 +0000 UTC m=+805.258360140" Jan 31 09:14:27 crc kubenswrapper[4732]: I0131 09:14:27.941058 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/rabbitmq-server-0" event={"ID":"dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd","Type":"ContainerStarted","Data":"cb8d487c03d78ac3a07f4803e072d42f9685a260011084fe0b63e98b41a409fb"} Jan 31 09:14:31 crc kubenswrapper[4732]: I0131 09:14:31.769840 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-index-hbpcb" Jan 31 09:14:31 crc kubenswrapper[4732]: I0131 09:14:31.770587 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/keystone-operator-index-hbpcb" Jan 31 09:14:31 crc kubenswrapper[4732]: I0131 09:14:31.806802 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/keystone-operator-index-hbpcb" Jan 31 09:14:32 crc kubenswrapper[4732]: I0131 09:14:32.013661 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-index-hbpcb" Jan 31 09:14:39 crc kubenswrapper[4732]: I0131 09:14:39.287596 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4efvpfwj"] Jan 31 09:14:39 crc kubenswrapper[4732]: I0131 09:14:39.289284 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4efvpfwj" Jan 31 09:14:39 crc kubenswrapper[4732]: I0131 09:14:39.292958 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-tnztr" Jan 31 09:14:39 crc kubenswrapper[4732]: I0131 09:14:39.303309 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4efvpfwj"] Jan 31 09:14:39 crc kubenswrapper[4732]: I0131 09:14:39.397126 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/00bc065b-6932-4b27-bd33-5d8618f8a4f1-util\") pod \"b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4efvpfwj\" (UID: \"00bc065b-6932-4b27-bd33-5d8618f8a4f1\") " pod="openstack-operators/b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4efvpfwj" Jan 31 09:14:39 crc kubenswrapper[4732]: I0131 09:14:39.397458 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/00bc065b-6932-4b27-bd33-5d8618f8a4f1-bundle\") pod \"b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4efvpfwj\" (UID: \"00bc065b-6932-4b27-bd33-5d8618f8a4f1\") " pod="openstack-operators/b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4efvpfwj" Jan 31 09:14:39 crc kubenswrapper[4732]: I0131 09:14:39.397601 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lg47l\" (UniqueName: \"kubernetes.io/projected/00bc065b-6932-4b27-bd33-5d8618f8a4f1-kube-api-access-lg47l\") pod \"b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4efvpfwj\" (UID: \"00bc065b-6932-4b27-bd33-5d8618f8a4f1\") " pod="openstack-operators/b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4efvpfwj" Jan 31 09:14:39 crc kubenswrapper[4732]: I0131 09:14:39.499278 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/00bc065b-6932-4b27-bd33-5d8618f8a4f1-util\") pod \"b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4efvpfwj\" (UID: \"00bc065b-6932-4b27-bd33-5d8618f8a4f1\") " pod="openstack-operators/b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4efvpfwj" Jan 31 09:14:39 crc kubenswrapper[4732]: I0131 09:14:39.499384 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/00bc065b-6932-4b27-bd33-5d8618f8a4f1-bundle\") pod \"b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4efvpfwj\" (UID: \"00bc065b-6932-4b27-bd33-5d8618f8a4f1\") " pod="openstack-operators/b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4efvpfwj" Jan 31 09:14:39 crc kubenswrapper[4732]: I0131 09:14:39.499478 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lg47l\" (UniqueName: \"kubernetes.io/projected/00bc065b-6932-4b27-bd33-5d8618f8a4f1-kube-api-access-lg47l\") pod \"b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4efvpfwj\" (UID: \"00bc065b-6932-4b27-bd33-5d8618f8a4f1\") " pod="openstack-operators/b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4efvpfwj" Jan 31 09:14:39 crc kubenswrapper[4732]: I0131 09:14:39.500329 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/00bc065b-6932-4b27-bd33-5d8618f8a4f1-util\") pod \"b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4efvpfwj\" (UID: \"00bc065b-6932-4b27-bd33-5d8618f8a4f1\") " pod="openstack-operators/b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4efvpfwj" Jan 31 09:14:39 crc kubenswrapper[4732]: I0131 09:14:39.500559 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/00bc065b-6932-4b27-bd33-5d8618f8a4f1-bundle\") pod \"b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4efvpfwj\" (UID: \"00bc065b-6932-4b27-bd33-5d8618f8a4f1\") " pod="openstack-operators/b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4efvpfwj" Jan 31 09:14:39 crc kubenswrapper[4732]: I0131 09:14:39.538182 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lg47l\" (UniqueName: \"kubernetes.io/projected/00bc065b-6932-4b27-bd33-5d8618f8a4f1-kube-api-access-lg47l\") pod \"b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4efvpfwj\" (UID: \"00bc065b-6932-4b27-bd33-5d8618f8a4f1\") " pod="openstack-operators/b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4efvpfwj" Jan 31 09:14:39 crc kubenswrapper[4732]: I0131 09:14:39.607062 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4efvpfwj" Jan 31 09:14:40 crc kubenswrapper[4732]: I0131 09:14:40.107633 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4efvpfwj"] Jan 31 09:14:40 crc kubenswrapper[4732]: W0131 09:14:40.110300 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00bc065b_6932_4b27_bd33_5d8618f8a4f1.slice/crio-39e56caeeaf6133372e6dc2f21b7538a05d6ef95b092a2c87140dea950c8850a WatchSource:0}: Error finding container 39e56caeeaf6133372e6dc2f21b7538a05d6ef95b092a2c87140dea950c8850a: Status 404 returned error can't find the container with id 39e56caeeaf6133372e6dc2f21b7538a05d6ef95b092a2c87140dea950c8850a Jan 31 09:14:41 crc kubenswrapper[4732]: I0131 09:14:41.046747 4732 generic.go:334] "Generic (PLEG): container finished" podID="00bc065b-6932-4b27-bd33-5d8618f8a4f1" containerID="75bdd5fe1a4f1b74d5a4dbcdde8f474e6bc05519374a21ed6a1e8f88735183f3" exitCode=0 Jan 31 09:14:41 crc kubenswrapper[4732]: I0131 09:14:41.046802 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4efvpfwj" event={"ID":"00bc065b-6932-4b27-bd33-5d8618f8a4f1","Type":"ContainerDied","Data":"75bdd5fe1a4f1b74d5a4dbcdde8f474e6bc05519374a21ed6a1e8f88735183f3"} Jan 31 09:14:41 crc kubenswrapper[4732]: I0131 09:14:41.047058 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4efvpfwj" event={"ID":"00bc065b-6932-4b27-bd33-5d8618f8a4f1","Type":"ContainerStarted","Data":"39e56caeeaf6133372e6dc2f21b7538a05d6ef95b092a2c87140dea950c8850a"} Jan 31 09:14:42 crc kubenswrapper[4732]: I0131 09:14:42.059260 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4efvpfwj" event={"ID":"00bc065b-6932-4b27-bd33-5d8618f8a4f1","Type":"ContainerStarted","Data":"a37225baa762efba0da855108c4e3b4942f286a7d0c711d1300d72e14f4f3a67"} Jan 31 09:14:43 crc kubenswrapper[4732]: 
I0131 09:14:43.069322 4732 generic.go:334] "Generic (PLEG): container finished" podID="00bc065b-6932-4b27-bd33-5d8618f8a4f1" containerID="a37225baa762efba0da855108c4e3b4942f286a7d0c711d1300d72e14f4f3a67" exitCode=0 Jan 31 09:14:43 crc kubenswrapper[4732]: I0131 09:14:43.069396 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4efvpfwj" event={"ID":"00bc065b-6932-4b27-bd33-5d8618f8a4f1","Type":"ContainerDied","Data":"a37225baa762efba0da855108c4e3b4942f286a7d0c711d1300d72e14f4f3a67"} Jan 31 09:14:44 crc kubenswrapper[4732]: I0131 09:14:44.080520 4732 generic.go:334] "Generic (PLEG): container finished" podID="00bc065b-6932-4b27-bd33-5d8618f8a4f1" containerID="e8c730305c3c39eb9242a11659fb49c02690f7b2e9c98fd70ae0da288de53e06" exitCode=0 Jan 31 09:14:44 crc kubenswrapper[4732]: I0131 09:14:44.080582 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4efvpfwj" event={"ID":"00bc065b-6932-4b27-bd33-5d8618f8a4f1","Type":"ContainerDied","Data":"e8c730305c3c39eb9242a11659fb49c02690f7b2e9c98fd70ae0da288de53e06"} Jan 31 09:14:45 crc kubenswrapper[4732]: I0131 09:14:45.358163 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4efvpfwj" Jan 31 09:14:45 crc kubenswrapper[4732]: I0131 09:14:45.485105 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lg47l\" (UniqueName: \"kubernetes.io/projected/00bc065b-6932-4b27-bd33-5d8618f8a4f1-kube-api-access-lg47l\") pod \"00bc065b-6932-4b27-bd33-5d8618f8a4f1\" (UID: \"00bc065b-6932-4b27-bd33-5d8618f8a4f1\") " Jan 31 09:14:45 crc kubenswrapper[4732]: I0131 09:14:45.485171 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/00bc065b-6932-4b27-bd33-5d8618f8a4f1-bundle\") pod \"00bc065b-6932-4b27-bd33-5d8618f8a4f1\" (UID: \"00bc065b-6932-4b27-bd33-5d8618f8a4f1\") " Jan 31 09:14:45 crc kubenswrapper[4732]: I0131 09:14:45.485286 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/00bc065b-6932-4b27-bd33-5d8618f8a4f1-util\") pod \"00bc065b-6932-4b27-bd33-5d8618f8a4f1\" (UID: \"00bc065b-6932-4b27-bd33-5d8618f8a4f1\") " Jan 31 09:14:45 crc kubenswrapper[4732]: I0131 09:14:45.486226 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00bc065b-6932-4b27-bd33-5d8618f8a4f1-bundle" (OuterVolumeSpecName: "bundle") pod "00bc065b-6932-4b27-bd33-5d8618f8a4f1" (UID: "00bc065b-6932-4b27-bd33-5d8618f8a4f1"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:14:45 crc kubenswrapper[4732]: I0131 09:14:45.492823 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00bc065b-6932-4b27-bd33-5d8618f8a4f1-kube-api-access-lg47l" (OuterVolumeSpecName: "kube-api-access-lg47l") pod "00bc065b-6932-4b27-bd33-5d8618f8a4f1" (UID: "00bc065b-6932-4b27-bd33-5d8618f8a4f1"). InnerVolumeSpecName "kube-api-access-lg47l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:14:45 crc kubenswrapper[4732]: I0131 09:14:45.499822 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00bc065b-6932-4b27-bd33-5d8618f8a4f1-util" (OuterVolumeSpecName: "util") pod "00bc065b-6932-4b27-bd33-5d8618f8a4f1" (UID: "00bc065b-6932-4b27-bd33-5d8618f8a4f1"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:14:45 crc kubenswrapper[4732]: I0131 09:14:45.586988 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lg47l\" (UniqueName: \"kubernetes.io/projected/00bc065b-6932-4b27-bd33-5d8618f8a4f1-kube-api-access-lg47l\") on node \"crc\" DevicePath \"\"" Jan 31 09:14:45 crc kubenswrapper[4732]: I0131 09:14:45.587183 4732 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/00bc065b-6932-4b27-bd33-5d8618f8a4f1-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:14:45 crc kubenswrapper[4732]: I0131 09:14:45.587273 4732 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/00bc065b-6932-4b27-bd33-5d8618f8a4f1-util\") on node \"crc\" DevicePath \"\"" Jan 31 09:14:46 crc kubenswrapper[4732]: I0131 09:14:46.097019 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4efvpfwj" event={"ID":"00bc065b-6932-4b27-bd33-5d8618f8a4f1","Type":"ContainerDied","Data":"39e56caeeaf6133372e6dc2f21b7538a05d6ef95b092a2c87140dea950c8850a"} Jan 31 09:14:46 crc kubenswrapper[4732]: I0131 09:14:46.097060 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="39e56caeeaf6133372e6dc2f21b7538a05d6ef95b092a2c87140dea950c8850a" Jan 31 09:14:46 crc kubenswrapper[4732]: I0131 09:14:46.097096 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4efvpfwj" Jan 31 09:15:00 crc kubenswrapper[4732]: I0131 09:15:00.147988 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29497515-gkkgr"] Jan 31 09:15:00 crc kubenswrapper[4732]: E0131 09:15:00.148516 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00bc065b-6932-4b27-bd33-5d8618f8a4f1" containerName="util" Jan 31 09:15:00 crc kubenswrapper[4732]: I0131 09:15:00.148526 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="00bc065b-6932-4b27-bd33-5d8618f8a4f1" containerName="util" Jan 31 09:15:00 crc kubenswrapper[4732]: E0131 09:15:00.148541 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00bc065b-6932-4b27-bd33-5d8618f8a4f1" containerName="pull" Jan 31 09:15:00 crc kubenswrapper[4732]: I0131 09:15:00.148546 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="00bc065b-6932-4b27-bd33-5d8618f8a4f1" containerName="pull" Jan 31 09:15:00 crc kubenswrapper[4732]: E0131 09:15:00.148554 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00bc065b-6932-4b27-bd33-5d8618f8a4f1" containerName="extract" Jan 31 09:15:00 crc kubenswrapper[4732]: I0131 09:15:00.148559 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="00bc065b-6932-4b27-bd33-5d8618f8a4f1" containerName="extract" Jan 31 09:15:00 crc kubenswrapper[4732]: I0131 09:15:00.148675 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="00bc065b-6932-4b27-bd33-5d8618f8a4f1" containerName="extract" Jan 31 09:15:00 crc kubenswrapper[4732]: I0131 09:15:00.149051 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29497515-gkkgr" Jan 31 09:15:00 crc kubenswrapper[4732]: I0131 09:15:00.152326 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 31 09:15:00 crc kubenswrapper[4732]: I0131 09:15:00.152751 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 31 09:15:00 crc kubenswrapper[4732]: I0131 09:15:00.165414 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29497515-gkkgr"] Jan 31 09:15:00 crc kubenswrapper[4732]: I0131 09:15:00.215724 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a93c21ce-b808-4d18-a859-28ff7552d95f-secret-volume\") pod \"collect-profiles-29497515-gkkgr\" (UID: \"a93c21ce-b808-4d18-a859-28ff7552d95f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497515-gkkgr" Jan 31 09:15:00 crc kubenswrapper[4732]: I0131 09:15:00.215799 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vg94k\" (UniqueName: \"kubernetes.io/projected/a93c21ce-b808-4d18-a859-28ff7552d95f-kube-api-access-vg94k\") pod \"collect-profiles-29497515-gkkgr\" (UID: \"a93c21ce-b808-4d18-a859-28ff7552d95f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497515-gkkgr" Jan 31 09:15:00 crc kubenswrapper[4732]: I0131 09:15:00.215847 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/a93c21ce-b808-4d18-a859-28ff7552d95f-config-volume\") pod \"collect-profiles-29497515-gkkgr\" (UID: \"a93c21ce-b808-4d18-a859-28ff7552d95f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497515-gkkgr" Jan 31 09:15:00 crc kubenswrapper[4732]: I0131 09:15:00.311748 4732 generic.go:334] "Generic (PLEG): container finished" podID="dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd" containerID="cb8d487c03d78ac3a07f4803e072d42f9685a260011084fe0b63e98b41a409fb" exitCode=0 Jan 31 09:15:00 crc kubenswrapper[4732]: I0131 09:15:00.311796 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/rabbitmq-server-0" event={"ID":"dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd","Type":"ContainerDied","Data":"cb8d487c03d78ac3a07f4803e072d42f9685a260011084fe0b63e98b41a409fb"} Jan 31 09:15:00 crc kubenswrapper[4732]: I0131 09:15:00.316528 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a93c21ce-b808-4d18-a859-28ff7552d95f-config-volume\") pod \"collect-profiles-29497515-gkkgr\" (UID: \"a93c21ce-b808-4d18-a859-28ff7552d95f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497515-gkkgr" Jan 31 09:15:00 crc kubenswrapper[4732]: I0131 09:15:00.316614 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a93c21ce-b808-4d18-a859-28ff7552d95f-secret-volume\") pod \"collect-profiles-29497515-gkkgr\" (UID: \"a93c21ce-b808-4d18-a859-28ff7552d95f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497515-gkkgr" Jan 31 09:15:00 crc kubenswrapper[4732]: I0131 09:15:00.316633 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vg94k\" (UniqueName: \"kubernetes.io/projected/a93c21ce-b808-4d18-a859-28ff7552d95f-kube-api-access-vg94k\") pod \"collect-profiles-29497515-gkkgr\" (UID: \"a93c21ce-b808-4d18-a859-28ff7552d95f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497515-gkkgr" Jan 31 09:15:00 crc kubenswrapper[4732]: I0131 09:15:00.317457 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a93c21ce-b808-4d18-a859-28ff7552d95f-config-volume\") pod \"collect-profiles-29497515-gkkgr\" (UID: \"a93c21ce-b808-4d18-a859-28ff7552d95f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497515-gkkgr" Jan 31 09:15:00 crc kubenswrapper[4732]: I0131 09:15:00.333403 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a93c21ce-b808-4d18-a859-28ff7552d95f-secret-volume\") pod \"collect-profiles-29497515-gkkgr\" (UID: \"a93c21ce-b808-4d18-a859-28ff7552d95f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497515-gkkgr" Jan 31 09:15:00 crc kubenswrapper[4732]: I0131 09:15:00.338013 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vg94k\" (UniqueName: \"kubernetes.io/projected/a93c21ce-b808-4d18-a859-28ff7552d95f-kube-api-access-vg94k\") pod \"collect-profiles-29497515-gkkgr\" (UID: \"a93c21ce-b808-4d18-a859-28ff7552d95f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497515-gkkgr" Jan 31 09:15:00 crc kubenswrapper[4732]: I0131 09:15:00.483996 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29497515-gkkgr" Jan 31 09:15:00 crc kubenswrapper[4732]: I0131 09:15:00.912920 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29497515-gkkgr"] Jan 31 09:15:00 crc kubenswrapper[4732]: W0131 09:15:00.915197 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda93c21ce_b808_4d18_a859_28ff7552d95f.slice/crio-43927a110664fe95c7a0ead398c85a94f46078c8752304bb49ce1c7ab5c6a77d WatchSource:0}: Error finding container 43927a110664fe95c7a0ead398c85a94f46078c8752304bb49ce1c7ab5c6a77d: Status 404 returned error can't find the container with id 43927a110664fe95c7a0ead398c85a94f46078c8752304bb49ce1c7ab5c6a77d Jan 31 09:15:01 crc kubenswrapper[4732]: I0131 09:15:01.320297 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/rabbitmq-server-0" event={"ID":"dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd","Type":"ContainerStarted","Data":"5671dbb66e09093ec1e8398957e4f56f878e7982d9afc91bf35fc2570b73092d"} Jan 31 09:15:01 crc kubenswrapper[4732]: I0131 09:15:01.320569 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="swift-kuttl-tests/rabbitmq-server-0" Jan 31 09:15:01 crc kubenswrapper[4732]: I0131 09:15:01.324233 4732 generic.go:334] "Generic (PLEG): container finished" podID="a93c21ce-b808-4d18-a859-28ff7552d95f" containerID="ff356dbcb2d64501fe1fc7779c8190b93c33e26705936fd0f0080aa9e9d8110a" exitCode=0 Jan 31 09:15:01 crc kubenswrapper[4732]: I0131 09:15:01.324281 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29497515-gkkgr" event={"ID":"a93c21ce-b808-4d18-a859-28ff7552d95f","Type":"ContainerDied","Data":"ff356dbcb2d64501fe1fc7779c8190b93c33e26705936fd0f0080aa9e9d8110a"} Jan 31 09:15:01 crc kubenswrapper[4732]: I0131 09:15:01.324314 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29497515-gkkgr" event={"ID":"a93c21ce-b808-4d18-a859-28ff7552d95f","Type":"ContainerStarted","Data":"43927a110664fe95c7a0ead398c85a94f46078c8752304bb49ce1c7ab5c6a77d"} Jan 31 09:15:01 crc kubenswrapper[4732]: I0131 09:15:01.343303 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/rabbitmq-server-0" podStartSLOduration=37.081265452 podStartE2EDuration="42.343285824s" podCreationTimestamp="2026-01-31 09:14:19 +0000 UTC" firstStartedPulling="2026-01-31 09:14:20.951751724 +0000 UTC m=+799.257627938" lastFinishedPulling="2026-01-31 09:14:26.213772076 +0000 UTC m=+804.519648310" observedRunningTime="2026-01-31 09:15:01.341980024 +0000 UTC m=+839.647856238" watchObservedRunningTime="2026-01-31 09:15:01.343285824 +0000 UTC m=+839.649162038" Jan 31 09:15:02 crc kubenswrapper[4732]: I0131 09:15:02.386098 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-57cdd4758c-lh9cq"] Jan 31 09:15:02 crc kubenswrapper[4732]: I0131 09:15:02.387046 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-57cdd4758c-lh9cq" Jan 31 09:15:02 crc kubenswrapper[4732]: I0131 09:15:02.388810 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-xn5gt" Jan 31 09:15:02 crc kubenswrapper[4732]: I0131 09:15:02.389287 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-service-cert" Jan 31 09:15:02 crc kubenswrapper[4732]: I0131 09:15:02.400240 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-57cdd4758c-lh9cq"] Jan 31 09:15:02 crc kubenswrapper[4732]: I0131 09:15:02.547615 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/241eae26-3908-40e0-af9c-59b54a6ab1a0-apiservice-cert\") pod \"keystone-operator-controller-manager-57cdd4758c-lh9cq\" (UID: \"241eae26-3908-40e0-af9c-59b54a6ab1a0\") " pod="openstack-operators/keystone-operator-controller-manager-57cdd4758c-lh9cq" Jan 31 09:15:02 crc kubenswrapper[4732]: I0131 09:15:02.547691 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/241eae26-3908-40e0-af9c-59b54a6ab1a0-webhook-cert\") pod \"keystone-operator-controller-manager-57cdd4758c-lh9cq\" (UID: \"241eae26-3908-40e0-af9c-59b54a6ab1a0\") " pod="openstack-operators/keystone-operator-controller-manager-57cdd4758c-lh9cq" Jan 31 09:15:02 crc kubenswrapper[4732]: I0131 09:15:02.547723 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7m6t\" (UniqueName: \"kubernetes.io/projected/241eae26-3908-40e0-af9c-59b54a6ab1a0-kube-api-access-r7m6t\") pod \"keystone-operator-controller-manager-57cdd4758c-lh9cq\" (UID: \"241eae26-3908-40e0-af9c-59b54a6ab1a0\") " pod="openstack-operators/keystone-operator-controller-manager-57cdd4758c-lh9cq" Jan 31 09:15:02 crc kubenswrapper[4732]: I0131 09:15:02.648993 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/241eae26-3908-40e0-af9c-59b54a6ab1a0-apiservice-cert\") pod \"keystone-operator-controller-manager-57cdd4758c-lh9cq\" (UID: \"241eae26-3908-40e0-af9c-59b54a6ab1a0\") " pod="openstack-operators/keystone-operator-controller-manager-57cdd4758c-lh9cq" Jan 31 09:15:02 crc kubenswrapper[4732]: I0131 09:15:02.649050 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/241eae26-3908-40e0-af9c-59b54a6ab1a0-webhook-cert\") pod \"keystone-operator-controller-manager-57cdd4758c-lh9cq\" (UID: \"241eae26-3908-40e0-af9c-59b54a6ab1a0\") " pod="openstack-operators/keystone-operator-controller-manager-57cdd4758c-lh9cq" Jan 31 09:15:02 crc kubenswrapper[4732]: I0131 09:15:02.649076 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7m6t\" (UniqueName: \"kubernetes.io/projected/241eae26-3908-40e0-af9c-59b54a6ab1a0-kube-api-access-r7m6t\") pod \"keystone-operator-controller-manager-57cdd4758c-lh9cq\" (UID: \"241eae26-3908-40e0-af9c-59b54a6ab1a0\") " pod="openstack-operators/keystone-operator-controller-manager-57cdd4758c-lh9cq" Jan 31 09:15:02 crc kubenswrapper[4732]: I0131 09:15:02.649849 4732 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29497515-gkkgr" Jan 31 09:15:02 crc kubenswrapper[4732]: I0131 09:15:02.651141 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-service-cert" Jan 31 09:15:02 crc kubenswrapper[4732]: I0131 09:15:02.667962 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/241eae26-3908-40e0-af9c-59b54a6ab1a0-webhook-cert\") pod \"keystone-operator-controller-manager-57cdd4758c-lh9cq\" (UID: \"241eae26-3908-40e0-af9c-59b54a6ab1a0\") " pod="openstack-operators/keystone-operator-controller-manager-57cdd4758c-lh9cq" Jan 31 09:15:02 crc kubenswrapper[4732]: I0131 09:15:02.672044 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7m6t\" (UniqueName: \"kubernetes.io/projected/241eae26-3908-40e0-af9c-59b54a6ab1a0-kube-api-access-r7m6t\") pod \"keystone-operator-controller-manager-57cdd4758c-lh9cq\" (UID: \"241eae26-3908-40e0-af9c-59b54a6ab1a0\") " pod="openstack-operators/keystone-operator-controller-manager-57cdd4758c-lh9cq" Jan 31 09:15:02 crc kubenswrapper[4732]: I0131 09:15:02.672736 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/241eae26-3908-40e0-af9c-59b54a6ab1a0-apiservice-cert\") pod \"keystone-operator-controller-manager-57cdd4758c-lh9cq\" (UID: \"241eae26-3908-40e0-af9c-59b54a6ab1a0\") " pod="openstack-operators/keystone-operator-controller-manager-57cdd4758c-lh9cq" Jan 31 09:15:02 crc kubenswrapper[4732]: I0131 09:15:02.712342 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-xn5gt" Jan 31 09:15:02 crc kubenswrapper[4732]: I0131 09:15:02.721041 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-57cdd4758c-lh9cq" Jan 31 09:15:02 crc kubenswrapper[4732]: I0131 09:15:02.750412 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vg94k\" (UniqueName: \"kubernetes.io/projected/a93c21ce-b808-4d18-a859-28ff7552d95f-kube-api-access-vg94k\") pod \"a93c21ce-b808-4d18-a859-28ff7552d95f\" (UID: \"a93c21ce-b808-4d18-a859-28ff7552d95f\") " Jan 31 09:15:02 crc kubenswrapper[4732]: I0131 09:15:02.750491 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a93c21ce-b808-4d18-a859-28ff7552d95f-secret-volume\") pod \"a93c21ce-b808-4d18-a859-28ff7552d95f\" (UID: \"a93c21ce-b808-4d18-a859-28ff7552d95f\") " Jan 31 09:15:02 crc kubenswrapper[4732]: I0131 09:15:02.750604 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a93c21ce-b808-4d18-a859-28ff7552d95f-config-volume\") pod \"a93c21ce-b808-4d18-a859-28ff7552d95f\" (UID: \"a93c21ce-b808-4d18-a859-28ff7552d95f\") " Jan 31 09:15:02 crc kubenswrapper[4732]: I0131 09:15:02.751404 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a93c21ce-b808-4d18-a859-28ff7552d95f-config-volume" (OuterVolumeSpecName: "config-volume") pod "a93c21ce-b808-4d18-a859-28ff7552d95f" (UID: "a93c21ce-b808-4d18-a859-28ff7552d95f"). 
InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:15:02 crc kubenswrapper[4732]: I0131 09:15:02.754909 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a93c21ce-b808-4d18-a859-28ff7552d95f-kube-api-access-vg94k" (OuterVolumeSpecName: "kube-api-access-vg94k") pod "a93c21ce-b808-4d18-a859-28ff7552d95f" (UID: "a93c21ce-b808-4d18-a859-28ff7552d95f"). InnerVolumeSpecName "kube-api-access-vg94k". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:15:02 crc kubenswrapper[4732]: I0131 09:15:02.755529 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a93c21ce-b808-4d18-a859-28ff7552d95f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a93c21ce-b808-4d18-a859-28ff7552d95f" (UID: "a93c21ce-b808-4d18-a859-28ff7552d95f"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:15:02 crc kubenswrapper[4732]: I0131 09:15:02.857792 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vg94k\" (UniqueName: \"kubernetes.io/projected/a93c21ce-b808-4d18-a859-28ff7552d95f-kube-api-access-vg94k\") on node \"crc\" DevicePath \"\"" Jan 31 09:15:02 crc kubenswrapper[4732]: I0131 09:15:02.857835 4732 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a93c21ce-b808-4d18-a859-28ff7552d95f-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 31 09:15:02 crc kubenswrapper[4732]: I0131 09:15:02.857854 4732 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a93c21ce-b808-4d18-a859-28ff7552d95f-config-volume\") on node \"crc\" DevicePath \"\"" Jan 31 09:15:02 crc kubenswrapper[4732]: I0131 09:15:02.916314 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-57cdd4758c-lh9cq"] Jan 31 09:15:02 crc kubenswrapper[4732]: W0131 09:15:02.925893 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod241eae26_3908_40e0_af9c_59b54a6ab1a0.slice/crio-0967cf246a03d050c7636d21dd8ba1334433e9343155d38efb7a34b23ed4beb6 WatchSource:0}: Error finding container 0967cf246a03d050c7636d21dd8ba1334433e9343155d38efb7a34b23ed4beb6: Status 404 returned error can't find the container with id 0967cf246a03d050c7636d21dd8ba1334433e9343155d38efb7a34b23ed4beb6 Jan 31 09:15:03 crc kubenswrapper[4732]: I0131 09:15:03.336184 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-57cdd4758c-lh9cq" event={"ID":"241eae26-3908-40e0-af9c-59b54a6ab1a0","Type":"ContainerStarted","Data":"0967cf246a03d050c7636d21dd8ba1334433e9343155d38efb7a34b23ed4beb6"} Jan 31 09:15:03 crc kubenswrapper[4732]: I0131 09:15:03.337600 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29497515-gkkgr" event={"ID":"a93c21ce-b808-4d18-a859-28ff7552d95f","Type":"ContainerDied","Data":"43927a110664fe95c7a0ead398c85a94f46078c8752304bb49ce1c7ab5c6a77d"} Jan 31 09:15:03 crc kubenswrapper[4732]: I0131 09:15:03.337630 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="43927a110664fe95c7a0ead398c85a94f46078c8752304bb49ce1c7ab5c6a77d" Jan 31 09:15:03 crc kubenswrapper[4732]: I0131 09:15:03.337709 4732 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29497515-gkkgr" Jan 31 09:15:07 crc kubenswrapper[4732]: I0131 09:15:07.366049 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-57cdd4758c-lh9cq" event={"ID":"241eae26-3908-40e0-af9c-59b54a6ab1a0","Type":"ContainerStarted","Data":"33284c815fdcf867d64c94279b754de6fc3557337b45dc375a7c56112e2acde1"} Jan 31 09:15:07 crc kubenswrapper[4732]: I0131 09:15:07.367502 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-57cdd4758c-lh9cq" Jan 31 09:15:07 crc kubenswrapper[4732]: I0131 09:15:07.388396 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-57cdd4758c-lh9cq" podStartSLOduration=1.6594749819999999 podStartE2EDuration="5.388375623s" podCreationTimestamp="2026-01-31 09:15:02 +0000 UTC" firstStartedPulling="2026-01-31 09:15:02.929236057 +0000 UTC m=+841.235112261" lastFinishedPulling="2026-01-31 09:15:06.658136698 +0000 UTC m=+844.964012902" observedRunningTime="2026-01-31 09:15:07.38133627 +0000 UTC m=+845.687212474" watchObservedRunningTime="2026-01-31 09:15:07.388375623 +0000 UTC m=+845.694251827" Jan 31 09:15:10 crc kubenswrapper[4732]: I0131 09:15:10.725135 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="swift-kuttl-tests/rabbitmq-server-0" Jan 31 09:15:12 crc kubenswrapper[4732]: I0131 09:15:12.725823 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-57cdd4758c-lh9cq" Jan 31 09:15:16 crc kubenswrapper[4732]: I0131 09:15:16.244828 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-index-52lrw"] Jan 31 09:15:16 crc kubenswrapper[4732]: E0131 09:15:16.245328 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a93c21ce-b808-4d18-a859-28ff7552d95f" containerName="collect-profiles" Jan 31 09:15:16 crc kubenswrapper[4732]: I0131 09:15:16.245342 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="a93c21ce-b808-4d18-a859-28ff7552d95f" containerName="collect-profiles" Jan 31 09:15:16 crc kubenswrapper[4732]: I0131 09:15:16.245450 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="a93c21ce-b808-4d18-a859-28ff7552d95f" containerName="collect-profiles" Jan 31 09:15:16 crc kubenswrapper[4732]: I0131 09:15:16.245863 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-index-52lrw" Jan 31 09:15:16 crc kubenswrapper[4732]: I0131 09:15:16.247816 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-index-dockercfg-gk956" Jan 31 09:15:16 crc kubenswrapper[4732]: I0131 09:15:16.254717 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-index-52lrw"] Jan 31 09:15:16 crc kubenswrapper[4732]: I0131 09:15:16.361315 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbdx6\" (UniqueName: \"kubernetes.io/projected/9fa1ac28-f4fc-49a1-8d64-5fa6b4858596-kube-api-access-sbdx6\") pod \"barbican-operator-index-52lrw\" (UID: \"9fa1ac28-f4fc-49a1-8d64-5fa6b4858596\") " pod="openstack-operators/barbican-operator-index-52lrw" Jan 31 09:15:16 crc kubenswrapper[4732]: I0131 09:15:16.463198 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbdx6\" (UniqueName: \"kubernetes.io/projected/9fa1ac28-f4fc-49a1-8d64-5fa6b4858596-kube-api-access-sbdx6\") pod \"barbican-operator-index-52lrw\" (UID: \"9fa1ac28-f4fc-49a1-8d64-5fa6b4858596\") " pod="openstack-operators/barbican-operator-index-52lrw" Jan 31 09:15:16 crc kubenswrapper[4732]: I0131 09:15:16.488531 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbdx6\" (UniqueName: \"kubernetes.io/projected/9fa1ac28-f4fc-49a1-8d64-5fa6b4858596-kube-api-access-sbdx6\") pod \"barbican-operator-index-52lrw\" (UID: \"9fa1ac28-f4fc-49a1-8d64-5fa6b4858596\") " pod="openstack-operators/barbican-operator-index-52lrw" Jan 31 09:15:16 crc kubenswrapper[4732]: I0131 09:15:16.563270 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-index-52lrw" Jan 31 09:15:16 crc kubenswrapper[4732]: I0131 09:15:16.903337 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-index-52lrw"] Jan 31 09:15:17 crc kubenswrapper[4732]: I0131 09:15:17.432934 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-index-52lrw" event={"ID":"9fa1ac28-f4fc-49a1-8d64-5fa6b4858596","Type":"ContainerStarted","Data":"980150e42c0e8a8fe248eee466f92a2ae0628871b51319a868cf69417f0d3e9a"} Jan 31 09:15:18 crc kubenswrapper[4732]: I0131 09:15:18.441529 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-index-52lrw" event={"ID":"9fa1ac28-f4fc-49a1-8d64-5fa6b4858596","Type":"ContainerStarted","Data":"356156d32df0119af1ede0bf03f74050fd94e05ed5f478849118068dd3204a84"} Jan 31 09:15:18 crc kubenswrapper[4732]: I0131 09:15:18.458228 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-index-52lrw" podStartSLOduration=1.443618227 podStartE2EDuration="2.458206159s" podCreationTimestamp="2026-01-31 09:15:16 +0000 UTC" firstStartedPulling="2026-01-31 09:15:16.907269686 +0000 UTC m=+855.213145890" lastFinishedPulling="2026-01-31 09:15:17.921857628 +0000 UTC m=+856.227733822" observedRunningTime="2026-01-31 09:15:18.454909574 +0000 UTC m=+856.760785798" watchObservedRunningTime="2026-01-31 09:15:18.458206159 +0000 UTC m=+856.764082363" Jan 31 09:15:21 crc kubenswrapper[4732]: I0131 09:15:21.236147 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/barbican-operator-index-52lrw"] Jan 31 09:15:21 crc kubenswrapper[4732]: I0131 09:15:21.236700 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/barbican-operator-index-52lrw" podUID="9fa1ac28-f4fc-49a1-8d64-5fa6b4858596" containerName="registry-server" containerID="cri-o://356156d32df0119af1ede0bf03f74050fd94e05ed5f478849118068dd3204a84" gracePeriod=2 Jan 31 09:15:21 crc kubenswrapper[4732]: I0131 09:15:21.464394 4732 generic.go:334] "Generic (PLEG): container finished" podID="9fa1ac28-f4fc-49a1-8d64-5fa6b4858596" containerID="356156d32df0119af1ede0bf03f74050fd94e05ed5f478849118068dd3204a84" exitCode=0 Jan 31 09:15:21 crc kubenswrapper[4732]: I0131 09:15:21.464519 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-index-52lrw" event={"ID":"9fa1ac28-f4fc-49a1-8d64-5fa6b4858596","Type":"ContainerDied","Data":"356156d32df0119af1ede0bf03f74050fd94e05ed5f478849118068dd3204a84"} Jan 31 09:15:21 crc kubenswrapper[4732]: I0131 09:15:21.947179 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-index-52lrw" Jan 31 09:15:22 crc kubenswrapper[4732]: I0131 09:15:22.046258 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbdx6\" (UniqueName: \"kubernetes.io/projected/9fa1ac28-f4fc-49a1-8d64-5fa6b4858596-kube-api-access-sbdx6\") pod \"9fa1ac28-f4fc-49a1-8d64-5fa6b4858596\" (UID: \"9fa1ac28-f4fc-49a1-8d64-5fa6b4858596\") " Jan 31 09:15:22 crc kubenswrapper[4732]: I0131 09:15:22.056982 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-index-ps2mw"] Jan 31 09:15:22 crc kubenswrapper[4732]: E0131 09:15:22.057371 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fa1ac28-f4fc-49a1-8d64-5fa6b4858596" containerName="registry-server" Jan 31 09:15:22 crc kubenswrapper[4732]: I0131 09:15:22.057397 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fa1ac28-f4fc-49a1-8d64-5fa6b4858596" containerName="registry-server" Jan 31 09:15:22 crc kubenswrapper[4732]: I0131 09:15:22.057594 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fa1ac28-f4fc-49a1-8d64-5fa6b4858596" containerName="registry-server" Jan 31 09:15:22 crc kubenswrapper[4732]: I0131 09:15:22.058201 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-index-ps2mw" Jan 31 09:15:22 crc kubenswrapper[4732]: I0131 09:15:22.061605 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-index-ps2mw"] Jan 31 09:15:22 crc kubenswrapper[4732]: I0131 09:15:22.063060 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fa1ac28-f4fc-49a1-8d64-5fa6b4858596-kube-api-access-sbdx6" (OuterVolumeSpecName: "kube-api-access-sbdx6") pod "9fa1ac28-f4fc-49a1-8d64-5fa6b4858596" (UID: "9fa1ac28-f4fc-49a1-8d64-5fa6b4858596"). InnerVolumeSpecName "kube-api-access-sbdx6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:15:22 crc kubenswrapper[4732]: I0131 09:15:22.149033 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cj79r\" (UniqueName: \"kubernetes.io/projected/a6b3a350-8b41-44a0-a6b4-b957947e1df6-kube-api-access-cj79r\") pod \"barbican-operator-index-ps2mw\" (UID: \"a6b3a350-8b41-44a0-a6b4-b957947e1df6\") " pod="openstack-operators/barbican-operator-index-ps2mw" Jan 31 09:15:22 crc kubenswrapper[4732]: I0131 09:15:22.149136 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sbdx6\" (UniqueName: \"kubernetes.io/projected/9fa1ac28-f4fc-49a1-8d64-5fa6b4858596-kube-api-access-sbdx6\") on node \"crc\" DevicePath \"\"" Jan 31 09:15:22 crc kubenswrapper[4732]: I0131 09:15:22.250549 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cj79r\" (UniqueName: \"kubernetes.io/projected/a6b3a350-8b41-44a0-a6b4-b957947e1df6-kube-api-access-cj79r\") pod \"barbican-operator-index-ps2mw\" (UID: \"a6b3a350-8b41-44a0-a6b4-b957947e1df6\") " pod="openstack-operators/barbican-operator-index-ps2mw" Jan 31 09:15:22 crc kubenswrapper[4732]: I0131 09:15:22.267061 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cj79r\" (UniqueName: \"kubernetes.io/projected/a6b3a350-8b41-44a0-a6b4-b957947e1df6-kube-api-access-cj79r\") pod \"barbican-operator-index-ps2mw\" (UID: \"a6b3a350-8b41-44a0-a6b4-b957947e1df6\") " pod="openstack-operators/barbican-operator-index-ps2mw" Jan 31 09:15:22 crc kubenswrapper[4732]: I0131 09:15:22.382440 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-index-ps2mw" Jan 31 09:15:22 crc kubenswrapper[4732]: I0131 09:15:22.471729 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-index-52lrw" event={"ID":"9fa1ac28-f4fc-49a1-8d64-5fa6b4858596","Type":"ContainerDied","Data":"980150e42c0e8a8fe248eee466f92a2ae0628871b51319a868cf69417f0d3e9a"} Jan 31 09:15:22 crc kubenswrapper[4732]: I0131 09:15:22.471784 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-index-52lrw" Jan 31 09:15:22 crc kubenswrapper[4732]: I0131 09:15:22.471811 4732 scope.go:117] "RemoveContainer" containerID="356156d32df0119af1ede0bf03f74050fd94e05ed5f478849118068dd3204a84" Jan 31 09:15:22 crc kubenswrapper[4732]: I0131 09:15:22.516833 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/barbican-operator-index-52lrw"] Jan 31 09:15:22 crc kubenswrapper[4732]: I0131 09:15:22.519530 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/barbican-operator-index-52lrw"] Jan 31 09:15:22 crc kubenswrapper[4732]: I0131 09:15:22.550375 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9fa1ac28-f4fc-49a1-8d64-5fa6b4858596" path="/var/lib/kubelet/pods/9fa1ac28-f4fc-49a1-8d64-5fa6b4858596/volumes" Jan 31 09:15:22 crc kubenswrapper[4732]: I0131 09:15:22.817189 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-index-ps2mw"] Jan 31 09:15:23 crc kubenswrapper[4732]: I0131 09:15:23.250705 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-nl4xf"] Jan 31 09:15:23 crc kubenswrapper[4732]: I0131 09:15:23.252499 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nl4xf" Jan 31 09:15:23 crc kubenswrapper[4732]: I0131 09:15:23.264007 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2febcab9-3048-4ea6-bd0d-ce40bf6bcda8-utilities\") pod \"redhat-marketplace-nl4xf\" (UID: \"2febcab9-3048-4ea6-bd0d-ce40bf6bcda8\") " pod="openshift-marketplace/redhat-marketplace-nl4xf" Jan 31 09:15:23 crc kubenswrapper[4732]: I0131 09:15:23.264053 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7wrv\" (UniqueName: \"kubernetes.io/projected/2febcab9-3048-4ea6-bd0d-ce40bf6bcda8-kube-api-access-r7wrv\") pod \"redhat-marketplace-nl4xf\" (UID: \"2febcab9-3048-4ea6-bd0d-ce40bf6bcda8\") " pod="openshift-marketplace/redhat-marketplace-nl4xf" Jan 31 09:15:23 crc kubenswrapper[4732]: I0131 09:15:23.264103 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2febcab9-3048-4ea6-bd0d-ce40bf6bcda8-catalog-content\") pod \"redhat-marketplace-nl4xf\" (UID: \"2febcab9-3048-4ea6-bd0d-ce40bf6bcda8\") " pod="openshift-marketplace/redhat-marketplace-nl4xf" Jan 31 09:15:23 crc kubenswrapper[4732]: I0131 09:15:23.264797 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nl4xf"] Jan 31 09:15:23 crc kubenswrapper[4732]: I0131 09:15:23.365323 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2febcab9-3048-4ea6-bd0d-ce40bf6bcda8-utilities\") pod \"redhat-marketplace-nl4xf\" (UID: \"2febcab9-3048-4ea6-bd0d-ce40bf6bcda8\") " pod="openshift-marketplace/redhat-marketplace-nl4xf" Jan 31 09:15:23 crc kubenswrapper[4732]: I0131 09:15:23.365842 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7wrv\" (UniqueName: \"kubernetes.io/projected/2febcab9-3048-4ea6-bd0d-ce40bf6bcda8-kube-api-access-r7wrv\") pod \"redhat-marketplace-nl4xf\" (UID: \"2febcab9-3048-4ea6-bd0d-ce40bf6bcda8\") " pod="openshift-marketplace/redhat-marketplace-nl4xf" Jan 31 09:15:23 crc kubenswrapper[4732]: I0131 09:15:23.365994 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2febcab9-3048-4ea6-bd0d-ce40bf6bcda8-catalog-content\") pod \"redhat-marketplace-nl4xf\" (UID: \"2febcab9-3048-4ea6-bd0d-ce40bf6bcda8\") " pod="openshift-marketplace/redhat-marketplace-nl4xf" Jan 31 09:15:23 crc kubenswrapper[4732]: I0131 09:15:23.366646 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2febcab9-3048-4ea6-bd0d-ce40bf6bcda8-catalog-content\") pod \"redhat-marketplace-nl4xf\" (UID: \"2febcab9-3048-4ea6-bd0d-ce40bf6bcda8\") " pod="openshift-marketplace/redhat-marketplace-nl4xf" Jan 31 09:15:23 crc kubenswrapper[4732]: I0131 09:15:23.366735 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2febcab9-3048-4ea6-bd0d-ce40bf6bcda8-utilities\") pod \"redhat-marketplace-nl4xf\" (UID: \"2febcab9-3048-4ea6-bd0d-ce40bf6bcda8\") " pod="openshift-marketplace/redhat-marketplace-nl4xf" Jan 31 09:15:23 crc kubenswrapper[4732]: I0131 09:15:23.421728 4732 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-r7wrv\" (UniqueName: \"kubernetes.io/projected/2febcab9-3048-4ea6-bd0d-ce40bf6bcda8-kube-api-access-r7wrv\") pod \"redhat-marketplace-nl4xf\" (UID: \"2febcab9-3048-4ea6-bd0d-ce40bf6bcda8\") " pod="openshift-marketplace/redhat-marketplace-nl4xf" Jan 31 09:15:23 crc kubenswrapper[4732]: I0131 09:15:23.479444 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-index-ps2mw" event={"ID":"a6b3a350-8b41-44a0-a6b4-b957947e1df6","Type":"ContainerStarted","Data":"6cf31f543f940bfb1886623400213cf6174f8dfa77f1c5bc43119a36a85bd192"} Jan 31 09:15:23 crc kubenswrapper[4732]: I0131 09:15:23.479490 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-index-ps2mw" event={"ID":"a6b3a350-8b41-44a0-a6b4-b957947e1df6","Type":"ContainerStarted","Data":"a2b2ec2462bff27c657ef1956671a25bb4c7e593d5773eb486359c1043f19888"} Jan 31 09:15:23 crc kubenswrapper[4732]: I0131 09:15:23.497394 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-index-ps2mw" podStartSLOduration=1.045962928 podStartE2EDuration="1.49737193s" podCreationTimestamp="2026-01-31 09:15:22 +0000 UTC" firstStartedPulling="2026-01-31 09:15:22.828530078 +0000 UTC m=+861.134406282" lastFinishedPulling="2026-01-31 09:15:23.27993908 +0000 UTC m=+861.585815284" observedRunningTime="2026-01-31 09:15:23.494922493 +0000 UTC m=+861.800798697" watchObservedRunningTime="2026-01-31 09:15:23.49737193 +0000 UTC m=+861.803248124" Jan 31 09:15:23 crc kubenswrapper[4732]: I0131 09:15:23.577129 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nl4xf" Jan 31 09:15:23 crc kubenswrapper[4732]: I0131 09:15:23.771275 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/keystone-db-create-qtddl"] Jan 31 09:15:23 crc kubenswrapper[4732]: I0131 09:15:23.772304 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/keystone-db-create-qtddl" Jan 31 09:15:23 crc kubenswrapper[4732]: I0131 09:15:23.788599 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/keystone-5b86-account-create-update-vh2kk"] Jan 31 09:15:23 crc kubenswrapper[4732]: I0131 09:15:23.789436 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/keystone-5b86-account-create-update-vh2kk" Jan 31 09:15:23 crc kubenswrapper[4732]: I0131 09:15:23.793282 4732 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"keystone-db-secret" Jan 31 09:15:23 crc kubenswrapper[4732]: I0131 09:15:23.802362 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/keystone-db-create-qtddl"] Jan 31 09:15:23 crc kubenswrapper[4732]: I0131 09:15:23.814805 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/keystone-5b86-account-create-update-vh2kk"] Jan 31 09:15:23 crc kubenswrapper[4732]: I0131 09:15:23.872598 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99jr8\" (UniqueName: \"kubernetes.io/projected/c86b412a-c376-48cd-b724-77e5fb6c9347-kube-api-access-99jr8\") pod \"keystone-db-create-qtddl\" (UID: \"c86b412a-c376-48cd-b724-77e5fb6c9347\") " pod="swift-kuttl-tests/keystone-db-create-qtddl" Jan 31 09:15:23 crc kubenswrapper[4732]: I0131 09:15:23.872702 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c86b412a-c376-48cd-b724-77e5fb6c9347-operator-scripts\") pod \"keystone-db-create-qtddl\" (UID: \"c86b412a-c376-48cd-b724-77e5fb6c9347\") " pod="swift-kuttl-tests/keystone-db-create-qtddl" Jan 31 09:15:23 crc kubenswrapper[4732]: I0131 09:15:23.872797 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hdrj\" (UniqueName: \"kubernetes.io/projected/92cd61e3-285c-42b8-b382-b8dde5e934b8-kube-api-access-5hdrj\") pod \"keystone-5b86-account-create-update-vh2kk\" (UID: \"92cd61e3-285c-42b8-b382-b8dde5e934b8\") " pod="swift-kuttl-tests/keystone-5b86-account-create-update-vh2kk" Jan 31 09:15:23 crc kubenswrapper[4732]: I0131 09:15:23.872865 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92cd61e3-285c-42b8-b382-b8dde5e934b8-operator-scripts\") pod \"keystone-5b86-account-create-update-vh2kk\" (UID: \"92cd61e3-285c-42b8-b382-b8dde5e934b8\") " pod="swift-kuttl-tests/keystone-5b86-account-create-update-vh2kk" Jan 31 09:15:23 crc kubenswrapper[4732]: I0131 09:15:23.974197 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99jr8\" (UniqueName: \"kubernetes.io/projected/c86b412a-c376-48cd-b724-77e5fb6c9347-kube-api-access-99jr8\") pod \"keystone-db-create-qtddl\" (UID: \"c86b412a-c376-48cd-b724-77e5fb6c9347\") " pod="swift-kuttl-tests/keystone-db-create-qtddl" Jan 31 09:15:23 crc kubenswrapper[4732]: I0131 09:15:23.974274 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c86b412a-c376-48cd-b724-77e5fb6c9347-operator-scripts\") pod \"keystone-db-create-qtddl\" (UID: \"c86b412a-c376-48cd-b724-77e5fb6c9347\") " pod="swift-kuttl-tests/keystone-db-create-qtddl" Jan 31 09:15:23 crc kubenswrapper[4732]: I0131 09:15:23.974334 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hdrj\" (UniqueName: \"kubernetes.io/projected/92cd61e3-285c-42b8-b382-b8dde5e934b8-kube-api-access-5hdrj\") pod \"keystone-5b86-account-create-update-vh2kk\" (UID: \"92cd61e3-285c-42b8-b382-b8dde5e934b8\") " 
pod="swift-kuttl-tests/keystone-5b86-account-create-update-vh2kk" Jan 31 09:15:23 crc kubenswrapper[4732]: I0131 09:15:23.974461 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92cd61e3-285c-42b8-b382-b8dde5e934b8-operator-scripts\") pod \"keystone-5b86-account-create-update-vh2kk\" (UID: \"92cd61e3-285c-42b8-b382-b8dde5e934b8\") " pod="swift-kuttl-tests/keystone-5b86-account-create-update-vh2kk" Jan 31 09:15:23 crc kubenswrapper[4732]: I0131 09:15:23.975227 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c86b412a-c376-48cd-b724-77e5fb6c9347-operator-scripts\") pod \"keystone-db-create-qtddl\" (UID: \"c86b412a-c376-48cd-b724-77e5fb6c9347\") " pod="swift-kuttl-tests/keystone-db-create-qtddl" Jan 31 09:15:23 crc kubenswrapper[4732]: I0131 09:15:23.975265 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92cd61e3-285c-42b8-b382-b8dde5e934b8-operator-scripts\") pod \"keystone-5b86-account-create-update-vh2kk\" (UID: \"92cd61e3-285c-42b8-b382-b8dde5e934b8\") " pod="swift-kuttl-tests/keystone-5b86-account-create-update-vh2kk" Jan 31 09:15:23 crc kubenswrapper[4732]: I0131 09:15:23.996551 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99jr8\" (UniqueName: \"kubernetes.io/projected/c86b412a-c376-48cd-b724-77e5fb6c9347-kube-api-access-99jr8\") pod \"keystone-db-create-qtddl\" (UID: \"c86b412a-c376-48cd-b724-77e5fb6c9347\") " pod="swift-kuttl-tests/keystone-db-create-qtddl" Jan 31 09:15:24 crc kubenswrapper[4732]: I0131 09:15:24.003233 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hdrj\" (UniqueName: \"kubernetes.io/projected/92cd61e3-285c-42b8-b382-b8dde5e934b8-kube-api-access-5hdrj\") pod \"keystone-5b86-account-create-update-vh2kk\" (UID: \"92cd61e3-285c-42b8-b382-b8dde5e934b8\") " pod="swift-kuttl-tests/keystone-5b86-account-create-update-vh2kk" Jan 31 09:15:24 crc kubenswrapper[4732]: I0131 09:15:24.014072 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nl4xf"] Jan 31 09:15:24 crc kubenswrapper[4732]: W0131 09:15:24.023438 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2febcab9_3048_4ea6_bd0d_ce40bf6bcda8.slice/crio-8580e16c6c2470255aa863b68b936832583b54119f231ae9e83c3edb8dea81b0 WatchSource:0}: Error finding container 8580e16c6c2470255aa863b68b936832583b54119f231ae9e83c3edb8dea81b0: Status 404 returned error can't find the container with id 8580e16c6c2470255aa863b68b936832583b54119f231ae9e83c3edb8dea81b0 Jan 31 09:15:24 crc kubenswrapper[4732]: I0131 09:15:24.089573 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/keystone-db-create-qtddl" Jan 31 09:15:24 crc kubenswrapper[4732]: I0131 09:15:24.104053 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/keystone-5b86-account-create-update-vh2kk" Jan 31 09:15:24 crc kubenswrapper[4732]: I0131 09:15:24.487982 4732 generic.go:334] "Generic (PLEG): container finished" podID="2febcab9-3048-4ea6-bd0d-ce40bf6bcda8" containerID="8e30489bcc721c1fa556eeaf5a6475bc2d890769c3c2cf8dbcffc5cc74766ca9" exitCode=0 Jan 31 09:15:24 crc kubenswrapper[4732]: I0131 09:15:24.488235 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nl4xf" event={"ID":"2febcab9-3048-4ea6-bd0d-ce40bf6bcda8","Type":"ContainerDied","Data":"8e30489bcc721c1fa556eeaf5a6475bc2d890769c3c2cf8dbcffc5cc74766ca9"} Jan 31 09:15:24 crc kubenswrapper[4732]: I0131 09:15:24.488636 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nl4xf" event={"ID":"2febcab9-3048-4ea6-bd0d-ce40bf6bcda8","Type":"ContainerStarted","Data":"8580e16c6c2470255aa863b68b936832583b54119f231ae9e83c3edb8dea81b0"} Jan 31 09:15:24 crc kubenswrapper[4732]: I0131 09:15:24.558901 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/keystone-5b86-account-create-update-vh2kk"] Jan 31 09:15:24 crc kubenswrapper[4732]: W0131 09:15:24.569846 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod92cd61e3_285c_42b8_b382_b8dde5e934b8.slice/crio-6c2b41fe902287f420696f23117cb8b4be4f6600cd6092ed479c58bf94eca507 WatchSource:0}: Error finding container 6c2b41fe902287f420696f23117cb8b4be4f6600cd6092ed479c58bf94eca507: Status 404 returned error can't find the container with id 6c2b41fe902287f420696f23117cb8b4be4f6600cd6092ed479c58bf94eca507 Jan 31 09:15:24 crc kubenswrapper[4732]: I0131 09:15:24.664546 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/keystone-db-create-qtddl"] Jan 31 09:15:24 crc kubenswrapper[4732]: W0131 09:15:24.669295 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc86b412a_c376_48cd_b724_77e5fb6c9347.slice/crio-09bdbbc06ea85c16156cd5e9709429078749a80e7d7e1398b8779ce739af35fd WatchSource:0}: Error finding container 09bdbbc06ea85c16156cd5e9709429078749a80e7d7e1398b8779ce739af35fd: Status 404 returned error can't find the container with id 09bdbbc06ea85c16156cd5e9709429078749a80e7d7e1398b8779ce739af35fd Jan 31 09:15:25 crc kubenswrapper[4732]: E0131 09:15:25.002540 4732 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod92cd61e3_285c_42b8_b382_b8dde5e934b8.slice/crio-c0a354876c3578c412d4a67e39a312dc83add567ea01562dcdd52716d48fc242.scope\": RecentStats: unable to find data in memory cache]" Jan 31 09:15:25 crc kubenswrapper[4732]: I0131 09:15:25.496001 4732 generic.go:334] "Generic (PLEG): container finished" podID="c86b412a-c376-48cd-b724-77e5fb6c9347" containerID="47dfd350e39daf4b8f033d97a0ce4f52541999043f126251274978479fb72c51" exitCode=0 Jan 31 09:15:25 crc kubenswrapper[4732]: I0131 09:15:25.496083 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/keystone-db-create-qtddl" event={"ID":"c86b412a-c376-48cd-b724-77e5fb6c9347","Type":"ContainerDied","Data":"47dfd350e39daf4b8f033d97a0ce4f52541999043f126251274978479fb72c51"} Jan 31 09:15:25 crc kubenswrapper[4732]: I0131 09:15:25.496116 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="swift-kuttl-tests/keystone-db-create-qtddl" event={"ID":"c86b412a-c376-48cd-b724-77e5fb6c9347","Type":"ContainerStarted","Data":"09bdbbc06ea85c16156cd5e9709429078749a80e7d7e1398b8779ce739af35fd"} Jan 31 09:15:25 crc kubenswrapper[4732]: I0131 09:15:25.498572 4732 generic.go:334] "Generic (PLEG): container finished" podID="92cd61e3-285c-42b8-b382-b8dde5e934b8" containerID="c0a354876c3578c412d4a67e39a312dc83add567ea01562dcdd52716d48fc242" exitCode=0 Jan 31 09:15:25 crc kubenswrapper[4732]: I0131 09:15:25.498825 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/keystone-5b86-account-create-update-vh2kk" event={"ID":"92cd61e3-285c-42b8-b382-b8dde5e934b8","Type":"ContainerDied","Data":"c0a354876c3578c412d4a67e39a312dc83add567ea01562dcdd52716d48fc242"} Jan 31 09:15:25 crc kubenswrapper[4732]: I0131 09:15:25.498932 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/keystone-5b86-account-create-update-vh2kk" event={"ID":"92cd61e3-285c-42b8-b382-b8dde5e934b8","Type":"ContainerStarted","Data":"6c2b41fe902287f420696f23117cb8b4be4f6600cd6092ed479c58bf94eca507"} Jan 31 09:15:25 crc kubenswrapper[4732]: I0131 09:15:25.501856 4732 generic.go:334] "Generic (PLEG): container finished" podID="2febcab9-3048-4ea6-bd0d-ce40bf6bcda8" containerID="ba6f3408991a2d77cdc964ae18ce8f88510d43c155703e2ee23d40d6b92c931c" exitCode=0 Jan 31 09:15:25 crc kubenswrapper[4732]: I0131 09:15:25.501888 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nl4xf" event={"ID":"2febcab9-3048-4ea6-bd0d-ce40bf6bcda8","Type":"ContainerDied","Data":"ba6f3408991a2d77cdc964ae18ce8f88510d43c155703e2ee23d40d6b92c931c"} Jan 31 09:15:26 crc kubenswrapper[4732]: I0131 09:15:26.511834 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nl4xf" event={"ID":"2febcab9-3048-4ea6-bd0d-ce40bf6bcda8","Type":"ContainerStarted","Data":"4ce0200273326beff1a4d10a7e6f6fa7515641b6637093b2d0bc9e98a2455805"} Jan 31 09:15:26 crc kubenswrapper[4732]: I0131 09:15:26.531838 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-nl4xf" podStartSLOduration=2.122934314 podStartE2EDuration="3.531814931s" podCreationTimestamp="2026-01-31 09:15:23 +0000 UTC" firstStartedPulling="2026-01-31 09:15:24.490271036 +0000 UTC m=+862.796147240" lastFinishedPulling="2026-01-31 09:15:25.899151653 +0000 UTC m=+864.205027857" observedRunningTime="2026-01-31 09:15:26.53116481 +0000 UTC m=+864.837041024" watchObservedRunningTime="2026-01-31 09:15:26.531814931 +0000 UTC m=+864.837691145" Jan 31 09:15:26 crc kubenswrapper[4732]: I0131 09:15:26.863393 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/keystone-db-create-qtddl" Jan 31 09:15:26 crc kubenswrapper[4732]: I0131 09:15:26.870420 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/keystone-5b86-account-create-update-vh2kk" Jan 31 09:15:27 crc kubenswrapper[4732]: I0131 09:15:27.012102 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5hdrj\" (UniqueName: \"kubernetes.io/projected/92cd61e3-285c-42b8-b382-b8dde5e934b8-kube-api-access-5hdrj\") pod \"92cd61e3-285c-42b8-b382-b8dde5e934b8\" (UID: \"92cd61e3-285c-42b8-b382-b8dde5e934b8\") " Jan 31 09:15:27 crc kubenswrapper[4732]: I0131 09:15:27.016588 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99jr8\" (UniqueName: \"kubernetes.io/projected/c86b412a-c376-48cd-b724-77e5fb6c9347-kube-api-access-99jr8\") pod \"c86b412a-c376-48cd-b724-77e5fb6c9347\" (UID: \"c86b412a-c376-48cd-b724-77e5fb6c9347\") " Jan 31 09:15:27 crc kubenswrapper[4732]: I0131 09:15:27.016682 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c86b412a-c376-48cd-b724-77e5fb6c9347-operator-scripts\") pod \"c86b412a-c376-48cd-b724-77e5fb6c9347\" (UID: \"c86b412a-c376-48cd-b724-77e5fb6c9347\") " Jan 31 09:15:27 crc kubenswrapper[4732]: I0131 09:15:27.016755 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92cd61e3-285c-42b8-b382-b8dde5e934b8-operator-scripts\") pod \"92cd61e3-285c-42b8-b382-b8dde5e934b8\" (UID: \"92cd61e3-285c-42b8-b382-b8dde5e934b8\") " Jan 31 09:15:27 crc kubenswrapper[4732]: I0131 09:15:27.017493 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c86b412a-c376-48cd-b724-77e5fb6c9347-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c86b412a-c376-48cd-b724-77e5fb6c9347" (UID: "c86b412a-c376-48cd-b724-77e5fb6c9347"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:15:27 crc kubenswrapper[4732]: I0131 09:15:27.017524 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92cd61e3-285c-42b8-b382-b8dde5e934b8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "92cd61e3-285c-42b8-b382-b8dde5e934b8" (UID: "92cd61e3-285c-42b8-b382-b8dde5e934b8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:15:27 crc kubenswrapper[4732]: I0131 09:15:27.020023 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92cd61e3-285c-42b8-b382-b8dde5e934b8-kube-api-access-5hdrj" (OuterVolumeSpecName: "kube-api-access-5hdrj") pod "92cd61e3-285c-42b8-b382-b8dde5e934b8" (UID: "92cd61e3-285c-42b8-b382-b8dde5e934b8"). InnerVolumeSpecName "kube-api-access-5hdrj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:15:27 crc kubenswrapper[4732]: I0131 09:15:27.020387 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c86b412a-c376-48cd-b724-77e5fb6c9347-kube-api-access-99jr8" (OuterVolumeSpecName: "kube-api-access-99jr8") pod "c86b412a-c376-48cd-b724-77e5fb6c9347" (UID: "c86b412a-c376-48cd-b724-77e5fb6c9347"). InnerVolumeSpecName "kube-api-access-99jr8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:15:27 crc kubenswrapper[4732]: I0131 09:15:27.118353 4732 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c86b412a-c376-48cd-b724-77e5fb6c9347-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:15:27 crc kubenswrapper[4732]: I0131 09:15:27.118385 4732 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92cd61e3-285c-42b8-b382-b8dde5e934b8-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:15:27 crc kubenswrapper[4732]: I0131 09:15:27.118395 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5hdrj\" (UniqueName: \"kubernetes.io/projected/92cd61e3-285c-42b8-b382-b8dde5e934b8-kube-api-access-5hdrj\") on node \"crc\" DevicePath \"\"" Jan 31 09:15:27 crc kubenswrapper[4732]: I0131 09:15:27.118404 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99jr8\" (UniqueName: \"kubernetes.io/projected/c86b412a-c376-48cd-b724-77e5fb6c9347-kube-api-access-99jr8\") on node \"crc\" DevicePath \"\"" Jan 31 09:15:27 crc kubenswrapper[4732]: I0131 09:15:27.520708 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/keystone-db-create-qtddl" event={"ID":"c86b412a-c376-48cd-b724-77e5fb6c9347","Type":"ContainerDied","Data":"09bdbbc06ea85c16156cd5e9709429078749a80e7d7e1398b8779ce739af35fd"} Jan 31 09:15:27 crc kubenswrapper[4732]: I0131 09:15:27.520770 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="09bdbbc06ea85c16156cd5e9709429078749a80e7d7e1398b8779ce739af35fd" Jan 31 09:15:27 crc kubenswrapper[4732]: I0131 09:15:27.520735 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/keystone-db-create-qtddl" Jan 31 09:15:27 crc kubenswrapper[4732]: I0131 09:15:27.524097 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/keystone-5b86-account-create-update-vh2kk" event={"ID":"92cd61e3-285c-42b8-b382-b8dde5e934b8","Type":"ContainerDied","Data":"6c2b41fe902287f420696f23117cb8b4be4f6600cd6092ed479c58bf94eca507"} Jan 31 09:15:27 crc kubenswrapper[4732]: I0131 09:15:27.524162 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6c2b41fe902287f420696f23117cb8b4be4f6600cd6092ed479c58bf94eca507" Jan 31 09:15:27 crc kubenswrapper[4732]: I0131 09:15:27.524271 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/keystone-5b86-account-create-update-vh2kk" Jan 31 09:15:28 crc kubenswrapper[4732]: I0131 09:15:28.253213 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-kqk59"] Jan 31 09:15:28 crc kubenswrapper[4732]: E0131 09:15:28.254099 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92cd61e3-285c-42b8-b382-b8dde5e934b8" containerName="mariadb-account-create-update" Jan 31 09:15:28 crc kubenswrapper[4732]: I0131 09:15:28.254136 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="92cd61e3-285c-42b8-b382-b8dde5e934b8" containerName="mariadb-account-create-update" Jan 31 09:15:28 crc kubenswrapper[4732]: E0131 09:15:28.254162 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c86b412a-c376-48cd-b724-77e5fb6c9347" containerName="mariadb-database-create" Jan 31 09:15:28 crc kubenswrapper[4732]: I0131 09:15:28.254179 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="c86b412a-c376-48cd-b724-77e5fb6c9347" containerName="mariadb-database-create" Jan 31 09:15:28 crc kubenswrapper[4732]: I0131 09:15:28.254448 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="c86b412a-c376-48cd-b724-77e5fb6c9347" containerName="mariadb-database-create" Jan 31 09:15:28 crc kubenswrapper[4732]: I0131 09:15:28.254479 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="92cd61e3-285c-42b8-b382-b8dde5e934b8" containerName="mariadb-account-create-update" Jan 31 09:15:28 crc kubenswrapper[4732]: I0131 09:15:28.256205 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kqk59" Jan 31 09:15:28 crc kubenswrapper[4732]: I0131 09:15:28.259545 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kqk59"] Jan 31 09:15:28 crc kubenswrapper[4732]: I0131 09:15:28.337436 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rx5q9\" (UniqueName: \"kubernetes.io/projected/33f21797-eef0-4dce-9f1f-d6b77a951924-kube-api-access-rx5q9\") pod \"redhat-operators-kqk59\" (UID: \"33f21797-eef0-4dce-9f1f-d6b77a951924\") " pod="openshift-marketplace/redhat-operators-kqk59" Jan 31 09:15:28 crc kubenswrapper[4732]: I0131 09:15:28.337559 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33f21797-eef0-4dce-9f1f-d6b77a951924-utilities\") pod \"redhat-operators-kqk59\" (UID: \"33f21797-eef0-4dce-9f1f-d6b77a951924\") " pod="openshift-marketplace/redhat-operators-kqk59" Jan 31 09:15:28 crc kubenswrapper[4732]: I0131 09:15:28.337597 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33f21797-eef0-4dce-9f1f-d6b77a951924-catalog-content\") pod \"redhat-operators-kqk59\" (UID: \"33f21797-eef0-4dce-9f1f-d6b77a951924\") " pod="openshift-marketplace/redhat-operators-kqk59" Jan 31 09:15:28 crc kubenswrapper[4732]: I0131 09:15:28.438814 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33f21797-eef0-4dce-9f1f-d6b77a951924-catalog-content\") pod \"redhat-operators-kqk59\" (UID: \"33f21797-eef0-4dce-9f1f-d6b77a951924\") " pod="openshift-marketplace/redhat-operators-kqk59" Jan 31 09:15:28 crc kubenswrapper[4732]: I0131 
09:15:28.439212 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rx5q9\" (UniqueName: \"kubernetes.io/projected/33f21797-eef0-4dce-9f1f-d6b77a951924-kube-api-access-rx5q9\") pod \"redhat-operators-kqk59\" (UID: \"33f21797-eef0-4dce-9f1f-d6b77a951924\") " pod="openshift-marketplace/redhat-operators-kqk59" Jan 31 09:15:28 crc kubenswrapper[4732]: I0131 09:15:28.439311 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33f21797-eef0-4dce-9f1f-d6b77a951924-utilities\") pod \"redhat-operators-kqk59\" (UID: \"33f21797-eef0-4dce-9f1f-d6b77a951924\") " pod="openshift-marketplace/redhat-operators-kqk59" Jan 31 09:15:28 crc kubenswrapper[4732]: I0131 09:15:28.439752 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33f21797-eef0-4dce-9f1f-d6b77a951924-utilities\") pod \"redhat-operators-kqk59\" (UID: \"33f21797-eef0-4dce-9f1f-d6b77a951924\") " pod="openshift-marketplace/redhat-operators-kqk59" Jan 31 09:15:28 crc kubenswrapper[4732]: I0131 09:15:28.439762 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33f21797-eef0-4dce-9f1f-d6b77a951924-catalog-content\") pod \"redhat-operators-kqk59\" (UID: \"33f21797-eef0-4dce-9f1f-d6b77a951924\") " pod="openshift-marketplace/redhat-operators-kqk59" Jan 31 09:15:28 crc kubenswrapper[4732]: I0131 09:15:28.461223 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rx5q9\" (UniqueName: \"kubernetes.io/projected/33f21797-eef0-4dce-9f1f-d6b77a951924-kube-api-access-rx5q9\") pod \"redhat-operators-kqk59\" (UID: \"33f21797-eef0-4dce-9f1f-d6b77a951924\") " pod="openshift-marketplace/redhat-operators-kqk59" Jan 31 09:15:28 crc kubenswrapper[4732]: I0131 09:15:28.581218 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kqk59" Jan 31 09:15:29 crc kubenswrapper[4732]: I0131 09:15:29.037749 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kqk59"] Jan 31 09:15:29 crc kubenswrapper[4732]: I0131 09:15:29.429574 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/keystone-db-sync-vj5z4"] Jan 31 09:15:29 crc kubenswrapper[4732]: I0131 09:15:29.430563 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/keystone-db-sync-vj5z4" Jan 31 09:15:29 crc kubenswrapper[4732]: I0131 09:15:29.432960 4732 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"keystone-config-data" Jan 31 09:15:29 crc kubenswrapper[4732]: I0131 09:15:29.433204 4732 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"keystone" Jan 31 09:15:29 crc kubenswrapper[4732]: I0131 09:15:29.433300 4732 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"keystone-scripts" Jan 31 09:15:29 crc kubenswrapper[4732]: I0131 09:15:29.433434 4732 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"keystone-keystone-dockercfg-2sjhb" Jan 31 09:15:29 crc kubenswrapper[4732]: I0131 09:15:29.443632 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/keystone-db-sync-vj5z4"] Jan 31 09:15:29 crc kubenswrapper[4732]: I0131 09:15:29.454305 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54rwm\" (UniqueName: \"kubernetes.io/projected/1568a5da-d308-4b7e-94b6-99c846371cb8-kube-api-access-54rwm\") pod \"keystone-db-sync-vj5z4\" (UID: \"1568a5da-d308-4b7e-94b6-99c846371cb8\") " pod="swift-kuttl-tests/keystone-db-sync-vj5z4" Jan 31 09:15:29 crc kubenswrapper[4732]: I0131 09:15:29.454390 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1568a5da-d308-4b7e-94b6-99c846371cb8-config-data\") pod \"keystone-db-sync-vj5z4\" (UID: \"1568a5da-d308-4b7e-94b6-99c846371cb8\") " pod="swift-kuttl-tests/keystone-db-sync-vj5z4" Jan 31 09:15:29 crc kubenswrapper[4732]: I0131 09:15:29.541751 4732 generic.go:334] "Generic (PLEG): container finished" podID="33f21797-eef0-4dce-9f1f-d6b77a951924" containerID="cc766b5a0a82cd3ceb39ba7e9061b34b32c68752de623072e40d2f503469764b" exitCode=0 Jan 31 09:15:29 crc kubenswrapper[4732]: I0131 09:15:29.541801 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kqk59" event={"ID":"33f21797-eef0-4dce-9f1f-d6b77a951924","Type":"ContainerDied","Data":"cc766b5a0a82cd3ceb39ba7e9061b34b32c68752de623072e40d2f503469764b"} Jan 31 09:15:29 crc kubenswrapper[4732]: I0131 09:15:29.541859 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kqk59" event={"ID":"33f21797-eef0-4dce-9f1f-d6b77a951924","Type":"ContainerStarted","Data":"360eee913f089b940a4621b7c35dd80948b4548ebcc255ba1cf6e31c1da2b616"} Jan 31 09:15:29 crc kubenswrapper[4732]: I0131 09:15:29.555994 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54rwm\" (UniqueName: \"kubernetes.io/projected/1568a5da-d308-4b7e-94b6-99c846371cb8-kube-api-access-54rwm\") pod \"keystone-db-sync-vj5z4\" (UID: \"1568a5da-d308-4b7e-94b6-99c846371cb8\") " pod="swift-kuttl-tests/keystone-db-sync-vj5z4" Jan 31 09:15:29 crc kubenswrapper[4732]: I0131 09:15:29.556271 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1568a5da-d308-4b7e-94b6-99c846371cb8-config-data\") pod \"keystone-db-sync-vj5z4\" (UID: \"1568a5da-d308-4b7e-94b6-99c846371cb8\") " pod="swift-kuttl-tests/keystone-db-sync-vj5z4" Jan 31 09:15:29 crc kubenswrapper[4732]: I0131 09:15:29.573444 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-54rwm\" (UniqueName: \"kubernetes.io/projected/1568a5da-d308-4b7e-94b6-99c846371cb8-kube-api-access-54rwm\") pod \"keystone-db-sync-vj5z4\" (UID: \"1568a5da-d308-4b7e-94b6-99c846371cb8\") " pod="swift-kuttl-tests/keystone-db-sync-vj5z4" Jan 31 09:15:29 crc kubenswrapper[4732]: I0131 09:15:29.573460 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1568a5da-d308-4b7e-94b6-99c846371cb8-config-data\") pod \"keystone-db-sync-vj5z4\" (UID: \"1568a5da-d308-4b7e-94b6-99c846371cb8\") " pod="swift-kuttl-tests/keystone-db-sync-vj5z4" Jan 31 09:15:29 crc kubenswrapper[4732]: I0131 09:15:29.748565 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/keystone-db-sync-vj5z4" Jan 31 09:15:30 crc kubenswrapper[4732]: I0131 09:15:30.195961 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/keystone-db-sync-vj5z4"] Jan 31 09:15:30 crc kubenswrapper[4732]: W0131 09:15:30.202056 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1568a5da_d308_4b7e_94b6_99c846371cb8.slice/crio-c31583a1029ffae07fced86c20dea255707864cf3d8bba913ecc7e7f43951a8d WatchSource:0}: Error finding container c31583a1029ffae07fced86c20dea255707864cf3d8bba913ecc7e7f43951a8d: Status 404 returned error can't find the container with id c31583a1029ffae07fced86c20dea255707864cf3d8bba913ecc7e7f43951a8d Jan 31 09:15:30 crc kubenswrapper[4732]: I0131 09:15:30.573462 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/keystone-db-sync-vj5z4" event={"ID":"1568a5da-d308-4b7e-94b6-99c846371cb8","Type":"ContainerStarted","Data":"c31583a1029ffae07fced86c20dea255707864cf3d8bba913ecc7e7f43951a8d"} Jan 31 09:15:30 crc kubenswrapper[4732]: I0131 09:15:30.575713 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kqk59" event={"ID":"33f21797-eef0-4dce-9f1f-d6b77a951924","Type":"ContainerStarted","Data":"44f6676563cf161634b19b4f4bde8ef7a679461edd8877ff6405d99671b94d1c"} Jan 31 09:15:31 crc kubenswrapper[4732]: I0131 09:15:31.585621 4732 generic.go:334] "Generic (PLEG): container finished" podID="33f21797-eef0-4dce-9f1f-d6b77a951924" containerID="44f6676563cf161634b19b4f4bde8ef7a679461edd8877ff6405d99671b94d1c" exitCode=0 Jan 31 09:15:31 crc kubenswrapper[4732]: I0131 09:15:31.585739 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kqk59" event={"ID":"33f21797-eef0-4dce-9f1f-d6b77a951924","Type":"ContainerDied","Data":"44f6676563cf161634b19b4f4bde8ef7a679461edd8877ff6405d99671b94d1c"} Jan 31 09:15:32 crc kubenswrapper[4732]: I0131 09:15:32.382749 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-index-ps2mw" Jan 31 09:15:32 crc kubenswrapper[4732]: I0131 09:15:32.383063 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/barbican-operator-index-ps2mw" Jan 31 09:15:32 crc kubenswrapper[4732]: I0131 09:15:32.415201 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/barbican-operator-index-ps2mw" Jan 31 09:15:32 crc kubenswrapper[4732]: I0131 09:15:32.601599 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kqk59" 
event={"ID":"33f21797-eef0-4dce-9f1f-d6b77a951924","Type":"ContainerStarted","Data":"3dae847f1a2ac1d08ed7c6d9e699dd9911b0eb9ca79ac122de91849bbe5f5bc5"} Jan 31 09:15:32 crc kubenswrapper[4732]: I0131 09:15:32.620993 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-kqk59" podStartSLOduration=2.204577098 podStartE2EDuration="4.620978514s" podCreationTimestamp="2026-01-31 09:15:28 +0000 UTC" firstStartedPulling="2026-01-31 09:15:29.543417559 +0000 UTC m=+867.849293763" lastFinishedPulling="2026-01-31 09:15:31.959818975 +0000 UTC m=+870.265695179" observedRunningTime="2026-01-31 09:15:32.616773511 +0000 UTC m=+870.922649705" watchObservedRunningTime="2026-01-31 09:15:32.620978514 +0000 UTC m=+870.926854718" Jan 31 09:15:32 crc kubenswrapper[4732]: I0131 09:15:32.632979 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-index-ps2mw" Jan 31 09:15:33 crc kubenswrapper[4732]: I0131 09:15:33.578200 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-nl4xf" Jan 31 09:15:33 crc kubenswrapper[4732]: I0131 09:15:33.578255 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-nl4xf" Jan 31 09:15:33 crc kubenswrapper[4732]: I0131 09:15:33.633386 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-nl4xf" Jan 31 09:15:33 crc kubenswrapper[4732]: I0131 09:15:33.684602 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-nl4xf" Jan 31 09:15:36 crc kubenswrapper[4732]: I0131 09:15:36.506706 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/55c7a49163ba348c10e2be21119f4ca8799dffa34873699cfe8f8b6d7bcwsln"] Jan 31 09:15:36 crc kubenswrapper[4732]: I0131 09:15:36.508229 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/55c7a49163ba348c10e2be21119f4ca8799dffa34873699cfe8f8b6d7bcwsln" Jan 31 09:15:36 crc kubenswrapper[4732]: I0131 09:15:36.509852 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-tnztr" Jan 31 09:15:36 crc kubenswrapper[4732]: I0131 09:15:36.514182 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/55c7a49163ba348c10e2be21119f4ca8799dffa34873699cfe8f8b6d7bcwsln"] Jan 31 09:15:36 crc kubenswrapper[4732]: I0131 09:15:36.561087 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5a9d87a5-c953-483d-8183-0a0b8d4abac9-util\") pod \"55c7a49163ba348c10e2be21119f4ca8799dffa34873699cfe8f8b6d7bcwsln\" (UID: \"5a9d87a5-c953-483d-8183-0a0b8d4abac9\") " pod="openstack-operators/55c7a49163ba348c10e2be21119f4ca8799dffa34873699cfe8f8b6d7bcwsln" Jan 31 09:15:36 crc kubenswrapper[4732]: I0131 09:15:36.561135 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4c7mb\" (UniqueName: \"kubernetes.io/projected/5a9d87a5-c953-483d-8183-0a0b8d4abac9-kube-api-access-4c7mb\") pod \"55c7a49163ba348c10e2be21119f4ca8799dffa34873699cfe8f8b6d7bcwsln\" (UID: \"5a9d87a5-c953-483d-8183-0a0b8d4abac9\") " pod="openstack-operators/55c7a49163ba348c10e2be21119f4ca8799dffa34873699cfe8f8b6d7bcwsln" Jan 31 09:15:36 crc kubenswrapper[4732]: I0131 09:15:36.561160 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5a9d87a5-c953-483d-8183-0a0b8d4abac9-bundle\") pod \"55c7a49163ba348c10e2be21119f4ca8799dffa34873699cfe8f8b6d7bcwsln\" (UID: \"5a9d87a5-c953-483d-8183-0a0b8d4abac9\") " pod="openstack-operators/55c7a49163ba348c10e2be21119f4ca8799dffa34873699cfe8f8b6d7bcwsln" Jan 31 09:15:36 crc kubenswrapper[4732]: I0131 09:15:36.662200 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5a9d87a5-c953-483d-8183-0a0b8d4abac9-util\") pod \"55c7a49163ba348c10e2be21119f4ca8799dffa34873699cfe8f8b6d7bcwsln\" (UID: \"5a9d87a5-c953-483d-8183-0a0b8d4abac9\") " pod="openstack-operators/55c7a49163ba348c10e2be21119f4ca8799dffa34873699cfe8f8b6d7bcwsln" Jan 31 09:15:36 crc kubenswrapper[4732]: I0131 09:15:36.662244 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4c7mb\" (UniqueName: \"kubernetes.io/projected/5a9d87a5-c953-483d-8183-0a0b8d4abac9-kube-api-access-4c7mb\") pod \"55c7a49163ba348c10e2be21119f4ca8799dffa34873699cfe8f8b6d7bcwsln\" (UID: \"5a9d87a5-c953-483d-8183-0a0b8d4abac9\") " pod="openstack-operators/55c7a49163ba348c10e2be21119f4ca8799dffa34873699cfe8f8b6d7bcwsln" Jan 31 09:15:36 crc kubenswrapper[4732]: I0131 09:15:36.662274 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5a9d87a5-c953-483d-8183-0a0b8d4abac9-bundle\") pod \"55c7a49163ba348c10e2be21119f4ca8799dffa34873699cfe8f8b6d7bcwsln\" (UID: \"5a9d87a5-c953-483d-8183-0a0b8d4abac9\") " pod="openstack-operators/55c7a49163ba348c10e2be21119f4ca8799dffa34873699cfe8f8b6d7bcwsln" Jan 31 09:15:36 crc kubenswrapper[4732]: I0131 09:15:36.662897 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/5a9d87a5-c953-483d-8183-0a0b8d4abac9-bundle\") pod \"55c7a49163ba348c10e2be21119f4ca8799dffa34873699cfe8f8b6d7bcwsln\" (UID: \"5a9d87a5-c953-483d-8183-0a0b8d4abac9\") " pod="openstack-operators/55c7a49163ba348c10e2be21119f4ca8799dffa34873699cfe8f8b6d7bcwsln" Jan 31 09:15:36 crc kubenswrapper[4732]: I0131 09:15:36.663361 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5a9d87a5-c953-483d-8183-0a0b8d4abac9-util\") pod \"55c7a49163ba348c10e2be21119f4ca8799dffa34873699cfe8f8b6d7bcwsln\" (UID: \"5a9d87a5-c953-483d-8183-0a0b8d4abac9\") " pod="openstack-operators/55c7a49163ba348c10e2be21119f4ca8799dffa34873699cfe8f8b6d7bcwsln" Jan 31 09:15:36 crc kubenswrapper[4732]: I0131 09:15:36.680534 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4c7mb\" (UniqueName: \"kubernetes.io/projected/5a9d87a5-c953-483d-8183-0a0b8d4abac9-kube-api-access-4c7mb\") pod \"55c7a49163ba348c10e2be21119f4ca8799dffa34873699cfe8f8b6d7bcwsln\" (UID: \"5a9d87a5-c953-483d-8183-0a0b8d4abac9\") " pod="openstack-operators/55c7a49163ba348c10e2be21119f4ca8799dffa34873699cfe8f8b6d7bcwsln" Jan 31 09:15:36 crc kubenswrapper[4732]: I0131 09:15:36.832477 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/55c7a49163ba348c10e2be21119f4ca8799dffa34873699cfe8f8b6d7bcwsln" Jan 31 09:15:38 crc kubenswrapper[4732]: I0131 09:15:38.435870 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nl4xf"] Jan 31 09:15:38 crc kubenswrapper[4732]: I0131 09:15:38.436375 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-nl4xf" podUID="2febcab9-3048-4ea6-bd0d-ce40bf6bcda8" containerName="registry-server" containerID="cri-o://4ce0200273326beff1a4d10a7e6f6fa7515641b6637093b2d0bc9e98a2455805" gracePeriod=2 Jan 31 09:15:38 crc kubenswrapper[4732]: I0131 09:15:38.581785 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-kqk59" Jan 31 09:15:38 crc kubenswrapper[4732]: I0131 09:15:38.581833 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-kqk59" Jan 31 09:15:38 crc kubenswrapper[4732]: I0131 09:15:38.632729 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-kqk59" Jan 31 09:15:38 crc kubenswrapper[4732]: I0131 09:15:38.709139 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-kqk59" Jan 31 09:15:40 crc kubenswrapper[4732]: I0131 09:15:40.671828 4732 generic.go:334] "Generic (PLEG): container finished" podID="2febcab9-3048-4ea6-bd0d-ce40bf6bcda8" containerID="4ce0200273326beff1a4d10a7e6f6fa7515641b6637093b2d0bc9e98a2455805" exitCode=0 Jan 31 09:15:40 crc kubenswrapper[4732]: I0131 09:15:40.671884 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nl4xf" event={"ID":"2febcab9-3048-4ea6-bd0d-ce40bf6bcda8","Type":"ContainerDied","Data":"4ce0200273326beff1a4d10a7e6f6fa7515641b6637093b2d0bc9e98a2455805"} Jan 31 09:15:43 crc kubenswrapper[4732]: E0131 09:15:43.578322 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
4ce0200273326beff1a4d10a7e6f6fa7515641b6637093b2d0bc9e98a2455805 is running failed: container process not found" containerID="4ce0200273326beff1a4d10a7e6f6fa7515641b6637093b2d0bc9e98a2455805" cmd=["grpc_health_probe","-addr=:50051"] Jan 31 09:15:43 crc kubenswrapper[4732]: E0131 09:15:43.579368 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4ce0200273326beff1a4d10a7e6f6fa7515641b6637093b2d0bc9e98a2455805 is running failed: container process not found" containerID="4ce0200273326beff1a4d10a7e6f6fa7515641b6637093b2d0bc9e98a2455805" cmd=["grpc_health_probe","-addr=:50051"] Jan 31 09:15:43 crc kubenswrapper[4732]: E0131 09:15:43.580354 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4ce0200273326beff1a4d10a7e6f6fa7515641b6637093b2d0bc9e98a2455805 is running failed: container process not found" containerID="4ce0200273326beff1a4d10a7e6f6fa7515641b6637093b2d0bc9e98a2455805" cmd=["grpc_health_probe","-addr=:50051"] Jan 31 09:15:43 crc kubenswrapper[4732]: E0131 09:15:43.580385 4732 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4ce0200273326beff1a4d10a7e6f6fa7515641b6637093b2d0bc9e98a2455805 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-nl4xf" podUID="2febcab9-3048-4ea6-bd0d-ce40bf6bcda8" containerName="registry-server" Jan 31 09:15:43 crc kubenswrapper[4732]: I0131 09:15:43.635845 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kqk59"] Jan 31 09:15:43 crc kubenswrapper[4732]: I0131 09:15:43.636071 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-kqk59" podUID="33f21797-eef0-4dce-9f1f-d6b77a951924" containerName="registry-server" containerID="cri-o://3dae847f1a2ac1d08ed7c6d9e699dd9911b0eb9ca79ac122de91849bbe5f5bc5" gracePeriod=2 Jan 31 09:15:43 crc kubenswrapper[4732]: E0131 09:15:43.803745 4732 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-keystone:current-podified" Jan 31 09:15:43 crc kubenswrapper[4732]: E0131 09:15:43.803928 4732 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:keystone-db-sync,Image:quay.io/podified-antelope-centos9/openstack-keystone:current-podified,Command:[/bin/bash],Args:[-c keystone-manage 
db_sync],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/keystone/keystone.conf,SubPath:keystone.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-54rwm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42425,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42425,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-db-sync-vj5z4_swift-kuttl-tests(1568a5da-d308-4b7e-94b6-99c846371cb8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 31 09:15:43 crc kubenswrapper[4732]: E0131 09:15:43.805885 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"keystone-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="swift-kuttl-tests/keystone-db-sync-vj5z4" podUID="1568a5da-d308-4b7e-94b6-99c846371cb8" Jan 31 09:15:43 crc kubenswrapper[4732]: I0131 09:15:43.876700 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nl4xf" Jan 31 09:15:43 crc kubenswrapper[4732]: I0131 09:15:43.962767 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2febcab9-3048-4ea6-bd0d-ce40bf6bcda8-catalog-content\") pod \"2febcab9-3048-4ea6-bd0d-ce40bf6bcda8\" (UID: \"2febcab9-3048-4ea6-bd0d-ce40bf6bcda8\") " Jan 31 09:15:43 crc kubenswrapper[4732]: I0131 09:15:43.962862 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2febcab9-3048-4ea6-bd0d-ce40bf6bcda8-utilities\") pod \"2febcab9-3048-4ea6-bd0d-ce40bf6bcda8\" (UID: \"2febcab9-3048-4ea6-bd0d-ce40bf6bcda8\") " Jan 31 09:15:43 crc kubenswrapper[4732]: I0131 09:15:43.962905 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7wrv\" (UniqueName: \"kubernetes.io/projected/2febcab9-3048-4ea6-bd0d-ce40bf6bcda8-kube-api-access-r7wrv\") pod \"2febcab9-3048-4ea6-bd0d-ce40bf6bcda8\" (UID: \"2febcab9-3048-4ea6-bd0d-ce40bf6bcda8\") " Jan 31 09:15:43 crc kubenswrapper[4732]: I0131 09:15:43.963896 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2febcab9-3048-4ea6-bd0d-ce40bf6bcda8-utilities" (OuterVolumeSpecName: "utilities") pod "2febcab9-3048-4ea6-bd0d-ce40bf6bcda8" (UID: "2febcab9-3048-4ea6-bd0d-ce40bf6bcda8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:15:43 crc kubenswrapper[4732]: I0131 09:15:43.978687 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2febcab9-3048-4ea6-bd0d-ce40bf6bcda8-kube-api-access-r7wrv" (OuterVolumeSpecName: "kube-api-access-r7wrv") pod "2febcab9-3048-4ea6-bd0d-ce40bf6bcda8" (UID: "2febcab9-3048-4ea6-bd0d-ce40bf6bcda8"). InnerVolumeSpecName "kube-api-access-r7wrv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:15:43 crc kubenswrapper[4732]: I0131 09:15:43.990550 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2febcab9-3048-4ea6-bd0d-ce40bf6bcda8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2febcab9-3048-4ea6-bd0d-ce40bf6bcda8" (UID: "2febcab9-3048-4ea6-bd0d-ce40bf6bcda8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:15:44 crc kubenswrapper[4732]: I0131 09:15:44.051574 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kqk59" Jan 31 09:15:44 crc kubenswrapper[4732]: I0131 09:15:44.064690 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2febcab9-3048-4ea6-bd0d-ce40bf6bcda8-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 09:15:44 crc kubenswrapper[4732]: I0131 09:15:44.064731 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2febcab9-3048-4ea6-bd0d-ce40bf6bcda8-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 09:15:44 crc kubenswrapper[4732]: I0131 09:15:44.064746 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7wrv\" (UniqueName: \"kubernetes.io/projected/2febcab9-3048-4ea6-bd0d-ce40bf6bcda8-kube-api-access-r7wrv\") on node \"crc\" DevicePath \"\"" Jan 31 09:15:44 crc kubenswrapper[4732]: I0131 09:15:44.165913 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33f21797-eef0-4dce-9f1f-d6b77a951924-catalog-content\") pod \"33f21797-eef0-4dce-9f1f-d6b77a951924\" (UID: \"33f21797-eef0-4dce-9f1f-d6b77a951924\") " Jan 31 09:15:44 crc kubenswrapper[4732]: I0131 09:15:44.166002 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rx5q9\" (UniqueName: \"kubernetes.io/projected/33f21797-eef0-4dce-9f1f-d6b77a951924-kube-api-access-rx5q9\") pod \"33f21797-eef0-4dce-9f1f-d6b77a951924\" (UID: \"33f21797-eef0-4dce-9f1f-d6b77a951924\") " Jan 31 09:15:44 crc kubenswrapper[4732]: I0131 09:15:44.166151 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33f21797-eef0-4dce-9f1f-d6b77a951924-utilities\") pod \"33f21797-eef0-4dce-9f1f-d6b77a951924\" (UID: \"33f21797-eef0-4dce-9f1f-d6b77a951924\") " Jan 31 09:15:44 crc kubenswrapper[4732]: I0131 09:15:44.167587 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33f21797-eef0-4dce-9f1f-d6b77a951924-utilities" (OuterVolumeSpecName: "utilities") pod "33f21797-eef0-4dce-9f1f-d6b77a951924" (UID: "33f21797-eef0-4dce-9f1f-d6b77a951924"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:15:44 crc kubenswrapper[4732]: I0131 09:15:44.169814 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33f21797-eef0-4dce-9f1f-d6b77a951924-kube-api-access-rx5q9" (OuterVolumeSpecName: "kube-api-access-rx5q9") pod "33f21797-eef0-4dce-9f1f-d6b77a951924" (UID: "33f21797-eef0-4dce-9f1f-d6b77a951924"). InnerVolumeSpecName "kube-api-access-rx5q9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:15:44 crc kubenswrapper[4732]: I0131 09:15:44.211987 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/55c7a49163ba348c10e2be21119f4ca8799dffa34873699cfe8f8b6d7bcwsln"] Jan 31 09:15:44 crc kubenswrapper[4732]: I0131 09:15:44.268063 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33f21797-eef0-4dce-9f1f-d6b77a951924-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 09:15:44 crc kubenswrapper[4732]: I0131 09:15:44.268332 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rx5q9\" (UniqueName: \"kubernetes.io/projected/33f21797-eef0-4dce-9f1f-d6b77a951924-kube-api-access-rx5q9\") on node \"crc\" DevicePath \"\"" Jan 31 09:15:44 crc kubenswrapper[4732]: I0131 09:15:44.296137 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33f21797-eef0-4dce-9f1f-d6b77a951924-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "33f21797-eef0-4dce-9f1f-d6b77a951924" (UID: "33f21797-eef0-4dce-9f1f-d6b77a951924"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:15:44 crc kubenswrapper[4732]: I0131 09:15:44.369912 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33f21797-eef0-4dce-9f1f-d6b77a951924-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 09:15:44 crc kubenswrapper[4732]: I0131 09:15:44.705040 4732 generic.go:334] "Generic (PLEG): container finished" podID="33f21797-eef0-4dce-9f1f-d6b77a951924" containerID="3dae847f1a2ac1d08ed7c6d9e699dd9911b0eb9ca79ac122de91849bbe5f5bc5" exitCode=0 Jan 31 09:15:44 crc kubenswrapper[4732]: I0131 09:15:44.705073 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kqk59" Jan 31 09:15:44 crc kubenswrapper[4732]: I0131 09:15:44.705122 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kqk59" event={"ID":"33f21797-eef0-4dce-9f1f-d6b77a951924","Type":"ContainerDied","Data":"3dae847f1a2ac1d08ed7c6d9e699dd9911b0eb9ca79ac122de91849bbe5f5bc5"} Jan 31 09:15:44 crc kubenswrapper[4732]: I0131 09:15:44.705174 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kqk59" event={"ID":"33f21797-eef0-4dce-9f1f-d6b77a951924","Type":"ContainerDied","Data":"360eee913f089b940a4621b7c35dd80948b4548ebcc255ba1cf6e31c1da2b616"} Jan 31 09:15:44 crc kubenswrapper[4732]: I0131 09:15:44.705204 4732 scope.go:117] "RemoveContainer" containerID="3dae847f1a2ac1d08ed7c6d9e699dd9911b0eb9ca79ac122de91849bbe5f5bc5" Jan 31 09:15:44 crc kubenswrapper[4732]: I0131 09:15:44.708948 4732 generic.go:334] "Generic (PLEG): container finished" podID="5a9d87a5-c953-483d-8183-0a0b8d4abac9" containerID="443c8d984e9ef1e7b308f390e27b580b627477703c6492f50cd0b13f5a986a0a" exitCode=0 Jan 31 09:15:44 crc kubenswrapper[4732]: I0131 09:15:44.709030 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/55c7a49163ba348c10e2be21119f4ca8799dffa34873699cfe8f8b6d7bcwsln" event={"ID":"5a9d87a5-c953-483d-8183-0a0b8d4abac9","Type":"ContainerDied","Data":"443c8d984e9ef1e7b308f390e27b580b627477703c6492f50cd0b13f5a986a0a"} Jan 31 09:15:44 crc kubenswrapper[4732]: I0131 09:15:44.709061 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/55c7a49163ba348c10e2be21119f4ca8799dffa34873699cfe8f8b6d7bcwsln" event={"ID":"5a9d87a5-c953-483d-8183-0a0b8d4abac9","Type":"ContainerStarted","Data":"71758a9cca71bb6fba20a483968dd69a7a009aafd37b86b3a13357deef9131c3"} Jan 31 09:15:44 crc kubenswrapper[4732]: I0131 09:15:44.711094 4732 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 31 09:15:44 crc kubenswrapper[4732]: I0131 09:15:44.711925 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nl4xf" event={"ID":"2febcab9-3048-4ea6-bd0d-ce40bf6bcda8","Type":"ContainerDied","Data":"8580e16c6c2470255aa863b68b936832583b54119f231ae9e83c3edb8dea81b0"} Jan 31 09:15:44 crc kubenswrapper[4732]: I0131 09:15:44.711970 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nl4xf" Jan 31 09:15:44 crc kubenswrapper[4732]: E0131 09:15:44.712806 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"keystone-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-keystone:current-podified\\\"\"" pod="swift-kuttl-tests/keystone-db-sync-vj5z4" podUID="1568a5da-d308-4b7e-94b6-99c846371cb8" Jan 31 09:15:44 crc kubenswrapper[4732]: I0131 09:15:44.737207 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kqk59"] Jan 31 09:15:44 crc kubenswrapper[4732]: I0131 09:15:44.742298 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-kqk59"] Jan 31 09:15:44 crc kubenswrapper[4732]: I0131 09:15:44.747515 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nl4xf"] Jan 31 09:15:44 crc kubenswrapper[4732]: I0131 09:15:44.751224 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-nl4xf"] Jan 31 09:15:44 crc kubenswrapper[4732]: I0131 09:15:44.753948 4732 scope.go:117] "RemoveContainer" containerID="44f6676563cf161634b19b4f4bde8ef7a679461edd8877ff6405d99671b94d1c" Jan 31 09:15:44 crc kubenswrapper[4732]: I0131 09:15:44.802638 4732 scope.go:117] "RemoveContainer" containerID="cc766b5a0a82cd3ceb39ba7e9061b34b32c68752de623072e40d2f503469764b" Jan 31 09:15:44 crc kubenswrapper[4732]: I0131 09:15:44.828556 4732 scope.go:117] "RemoveContainer" containerID="3dae847f1a2ac1d08ed7c6d9e699dd9911b0eb9ca79ac122de91849bbe5f5bc5" Jan 31 09:15:44 crc kubenswrapper[4732]: E0131 09:15:44.829001 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3dae847f1a2ac1d08ed7c6d9e699dd9911b0eb9ca79ac122de91849bbe5f5bc5\": container with ID starting with 3dae847f1a2ac1d08ed7c6d9e699dd9911b0eb9ca79ac122de91849bbe5f5bc5 not found: ID does not exist" containerID="3dae847f1a2ac1d08ed7c6d9e699dd9911b0eb9ca79ac122de91849bbe5f5bc5" Jan 31 09:15:44 crc kubenswrapper[4732]: I0131 09:15:44.829038 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3dae847f1a2ac1d08ed7c6d9e699dd9911b0eb9ca79ac122de91849bbe5f5bc5"} err="failed to get container status \"3dae847f1a2ac1d08ed7c6d9e699dd9911b0eb9ca79ac122de91849bbe5f5bc5\": rpc error: code = NotFound desc = could not find container \"3dae847f1a2ac1d08ed7c6d9e699dd9911b0eb9ca79ac122de91849bbe5f5bc5\": container with ID starting with 3dae847f1a2ac1d08ed7c6d9e699dd9911b0eb9ca79ac122de91849bbe5f5bc5 not found: ID does not exist" Jan 31 09:15:44 crc kubenswrapper[4732]: I0131 09:15:44.829092 4732 scope.go:117] "RemoveContainer" containerID="44f6676563cf161634b19b4f4bde8ef7a679461edd8877ff6405d99671b94d1c" Jan 31 09:15:44 crc kubenswrapper[4732]: E0131 09:15:44.829370 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44f6676563cf161634b19b4f4bde8ef7a679461edd8877ff6405d99671b94d1c\": container with ID starting with 44f6676563cf161634b19b4f4bde8ef7a679461edd8877ff6405d99671b94d1c not found: ID does not exist" containerID="44f6676563cf161634b19b4f4bde8ef7a679461edd8877ff6405d99671b94d1c" Jan 31 09:15:44 crc kubenswrapper[4732]: I0131 09:15:44.829435 4732 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"44f6676563cf161634b19b4f4bde8ef7a679461edd8877ff6405d99671b94d1c"} err="failed to get container status \"44f6676563cf161634b19b4f4bde8ef7a679461edd8877ff6405d99671b94d1c\": rpc error: code = NotFound desc = could not find container \"44f6676563cf161634b19b4f4bde8ef7a679461edd8877ff6405d99671b94d1c\": container with ID starting with 44f6676563cf161634b19b4f4bde8ef7a679461edd8877ff6405d99671b94d1c not found: ID does not exist" Jan 31 09:15:44 crc kubenswrapper[4732]: I0131 09:15:44.829454 4732 scope.go:117] "RemoveContainer" containerID="cc766b5a0a82cd3ceb39ba7e9061b34b32c68752de623072e40d2f503469764b" Jan 31 09:15:44 crc kubenswrapper[4732]: E0131 09:15:44.829875 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc766b5a0a82cd3ceb39ba7e9061b34b32c68752de623072e40d2f503469764b\": container with ID starting with cc766b5a0a82cd3ceb39ba7e9061b34b32c68752de623072e40d2f503469764b not found: ID does not exist" containerID="cc766b5a0a82cd3ceb39ba7e9061b34b32c68752de623072e40d2f503469764b" Jan 31 09:15:44 crc kubenswrapper[4732]: I0131 09:15:44.829923 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc766b5a0a82cd3ceb39ba7e9061b34b32c68752de623072e40d2f503469764b"} err="failed to get container status \"cc766b5a0a82cd3ceb39ba7e9061b34b32c68752de623072e40d2f503469764b\": rpc error: code = NotFound desc = could not find container \"cc766b5a0a82cd3ceb39ba7e9061b34b32c68752de623072e40d2f503469764b\": container with ID starting with cc766b5a0a82cd3ceb39ba7e9061b34b32c68752de623072e40d2f503469764b not found: ID does not exist" Jan 31 09:15:44 crc kubenswrapper[4732]: I0131 09:15:44.829941 4732 scope.go:117] "RemoveContainer" containerID="4ce0200273326beff1a4d10a7e6f6fa7515641b6637093b2d0bc9e98a2455805" Jan 31 09:15:44 crc kubenswrapper[4732]: I0131 09:15:44.859751 4732 scope.go:117] "RemoveContainer" containerID="ba6f3408991a2d77cdc964ae18ce8f88510d43c155703e2ee23d40d6b92c931c" Jan 31 09:15:44 crc kubenswrapper[4732]: I0131 09:15:44.877168 4732 scope.go:117] "RemoveContainer" containerID="8e30489bcc721c1fa556eeaf5a6475bc2d890769c3c2cf8dbcffc5cc74766ca9" Jan 31 09:15:45 crc kubenswrapper[4732]: I0131 09:15:45.721966 4732 generic.go:334] "Generic (PLEG): container finished" podID="5a9d87a5-c953-483d-8183-0a0b8d4abac9" containerID="f344ba5284e12746a74007d5adc8aa42e8ee48a5c3588cb47db5117f3c83e9f0" exitCode=0 Jan 31 09:15:45 crc kubenswrapper[4732]: I0131 09:15:45.722011 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/55c7a49163ba348c10e2be21119f4ca8799dffa34873699cfe8f8b6d7bcwsln" event={"ID":"5a9d87a5-c953-483d-8183-0a0b8d4abac9","Type":"ContainerDied","Data":"f344ba5284e12746a74007d5adc8aa42e8ee48a5c3588cb47db5117f3c83e9f0"} Jan 31 09:15:46 crc kubenswrapper[4732]: I0131 09:15:46.558327 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2febcab9-3048-4ea6-bd0d-ce40bf6bcda8" path="/var/lib/kubelet/pods/2febcab9-3048-4ea6-bd0d-ce40bf6bcda8/volumes" Jan 31 09:15:46 crc kubenswrapper[4732]: I0131 09:15:46.560287 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33f21797-eef0-4dce-9f1f-d6b77a951924" path="/var/lib/kubelet/pods/33f21797-eef0-4dce-9f1f-d6b77a951924/volumes" Jan 31 09:15:46 crc kubenswrapper[4732]: I0131 09:15:46.739197 4732 generic.go:334] "Generic (PLEG): container finished" podID="5a9d87a5-c953-483d-8183-0a0b8d4abac9" 
containerID="a5bd2e1a25f92dbcf8d8999d3428639a204c1c32490db7803f3573205b07d825" exitCode=0 Jan 31 09:15:46 crc kubenswrapper[4732]: I0131 09:15:46.739237 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/55c7a49163ba348c10e2be21119f4ca8799dffa34873699cfe8f8b6d7bcwsln" event={"ID":"5a9d87a5-c953-483d-8183-0a0b8d4abac9","Type":"ContainerDied","Data":"a5bd2e1a25f92dbcf8d8999d3428639a204c1c32490db7803f3573205b07d825"} Jan 31 09:15:48 crc kubenswrapper[4732]: I0131 09:15:48.076045 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/55c7a49163ba348c10e2be21119f4ca8799dffa34873699cfe8f8b6d7bcwsln" Jan 31 09:15:48 crc kubenswrapper[4732]: I0131 09:15:48.124193 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4c7mb\" (UniqueName: \"kubernetes.io/projected/5a9d87a5-c953-483d-8183-0a0b8d4abac9-kube-api-access-4c7mb\") pod \"5a9d87a5-c953-483d-8183-0a0b8d4abac9\" (UID: \"5a9d87a5-c953-483d-8183-0a0b8d4abac9\") " Jan 31 09:15:48 crc kubenswrapper[4732]: I0131 09:15:48.124282 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5a9d87a5-c953-483d-8183-0a0b8d4abac9-util\") pod \"5a9d87a5-c953-483d-8183-0a0b8d4abac9\" (UID: \"5a9d87a5-c953-483d-8183-0a0b8d4abac9\") " Jan 31 09:15:48 crc kubenswrapper[4732]: I0131 09:15:48.124344 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5a9d87a5-c953-483d-8183-0a0b8d4abac9-bundle\") pod \"5a9d87a5-c953-483d-8183-0a0b8d4abac9\" (UID: \"5a9d87a5-c953-483d-8183-0a0b8d4abac9\") " Jan 31 09:15:48 crc kubenswrapper[4732]: I0131 09:15:48.125526 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a9d87a5-c953-483d-8183-0a0b8d4abac9-bundle" (OuterVolumeSpecName: "bundle") pod "5a9d87a5-c953-483d-8183-0a0b8d4abac9" (UID: "5a9d87a5-c953-483d-8183-0a0b8d4abac9"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:15:48 crc kubenswrapper[4732]: I0131 09:15:48.134057 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a9d87a5-c953-483d-8183-0a0b8d4abac9-kube-api-access-4c7mb" (OuterVolumeSpecName: "kube-api-access-4c7mb") pod "5a9d87a5-c953-483d-8183-0a0b8d4abac9" (UID: "5a9d87a5-c953-483d-8183-0a0b8d4abac9"). InnerVolumeSpecName "kube-api-access-4c7mb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:15:48 crc kubenswrapper[4732]: I0131 09:15:48.159863 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a9d87a5-c953-483d-8183-0a0b8d4abac9-util" (OuterVolumeSpecName: "util") pod "5a9d87a5-c953-483d-8183-0a0b8d4abac9" (UID: "5a9d87a5-c953-483d-8183-0a0b8d4abac9"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:15:48 crc kubenswrapper[4732]: I0131 09:15:48.225951 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4c7mb\" (UniqueName: \"kubernetes.io/projected/5a9d87a5-c953-483d-8183-0a0b8d4abac9-kube-api-access-4c7mb\") on node \"crc\" DevicePath \"\"" Jan 31 09:15:48 crc kubenswrapper[4732]: I0131 09:15:48.226023 4732 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5a9d87a5-c953-483d-8183-0a0b8d4abac9-util\") on node \"crc\" DevicePath \"\"" Jan 31 09:15:48 crc kubenswrapper[4732]: I0131 09:15:48.226036 4732 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5a9d87a5-c953-483d-8183-0a0b8d4abac9-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:15:48 crc kubenswrapper[4732]: I0131 09:15:48.758850 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/55c7a49163ba348c10e2be21119f4ca8799dffa34873699cfe8f8b6d7bcwsln" event={"ID":"5a9d87a5-c953-483d-8183-0a0b8d4abac9","Type":"ContainerDied","Data":"71758a9cca71bb6fba20a483968dd69a7a009aafd37b86b3a13357deef9131c3"} Jan 31 09:15:48 crc kubenswrapper[4732]: I0131 09:15:48.759262 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="71758a9cca71bb6fba20a483968dd69a7a009aafd37b86b3a13357deef9131c3" Jan 31 09:15:48 crc kubenswrapper[4732]: I0131 09:15:48.758956 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/55c7a49163ba348c10e2be21119f4ca8799dffa34873699cfe8f8b6d7bcwsln" Jan 31 09:15:59 crc kubenswrapper[4732]: I0131 09:15:59.604268 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-665b569d9f-rjs9c"] Jan 31 09:15:59 crc kubenswrapper[4732]: E0131 09:15:59.604962 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33f21797-eef0-4dce-9f1f-d6b77a951924" containerName="registry-server" Jan 31 09:15:59 crc kubenswrapper[4732]: I0131 09:15:59.604974 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="33f21797-eef0-4dce-9f1f-d6b77a951924" containerName="registry-server" Jan 31 09:15:59 crc kubenswrapper[4732]: E0131 09:15:59.604984 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2febcab9-3048-4ea6-bd0d-ce40bf6bcda8" containerName="registry-server" Jan 31 09:15:59 crc kubenswrapper[4732]: I0131 09:15:59.604990 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="2febcab9-3048-4ea6-bd0d-ce40bf6bcda8" containerName="registry-server" Jan 31 09:15:59 crc kubenswrapper[4732]: E0131 09:15:59.605000 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2febcab9-3048-4ea6-bd0d-ce40bf6bcda8" containerName="extract-content" Jan 31 09:15:59 crc kubenswrapper[4732]: I0131 09:15:59.605006 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="2febcab9-3048-4ea6-bd0d-ce40bf6bcda8" containerName="extract-content" Jan 31 09:15:59 crc kubenswrapper[4732]: E0131 09:15:59.605014 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a9d87a5-c953-483d-8183-0a0b8d4abac9" containerName="pull" Jan 31 09:15:59 crc kubenswrapper[4732]: I0131 09:15:59.605021 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a9d87a5-c953-483d-8183-0a0b8d4abac9" containerName="pull" Jan 31 09:15:59 crc kubenswrapper[4732]: E0131 09:15:59.605029 4732 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="5a9d87a5-c953-483d-8183-0a0b8d4abac9" containerName="extract" Jan 31 09:15:59 crc kubenswrapper[4732]: I0131 09:15:59.605034 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a9d87a5-c953-483d-8183-0a0b8d4abac9" containerName="extract" Jan 31 09:15:59 crc kubenswrapper[4732]: E0131 09:15:59.605046 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a9d87a5-c953-483d-8183-0a0b8d4abac9" containerName="util" Jan 31 09:15:59 crc kubenswrapper[4732]: I0131 09:15:59.605052 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a9d87a5-c953-483d-8183-0a0b8d4abac9" containerName="util" Jan 31 09:15:59 crc kubenswrapper[4732]: E0131 09:15:59.605058 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33f21797-eef0-4dce-9f1f-d6b77a951924" containerName="extract-content" Jan 31 09:15:59 crc kubenswrapper[4732]: I0131 09:15:59.605065 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="33f21797-eef0-4dce-9f1f-d6b77a951924" containerName="extract-content" Jan 31 09:15:59 crc kubenswrapper[4732]: E0131 09:15:59.605073 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2febcab9-3048-4ea6-bd0d-ce40bf6bcda8" containerName="extract-utilities" Jan 31 09:15:59 crc kubenswrapper[4732]: I0131 09:15:59.605080 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="2febcab9-3048-4ea6-bd0d-ce40bf6bcda8" containerName="extract-utilities" Jan 31 09:15:59 crc kubenswrapper[4732]: E0131 09:15:59.605087 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33f21797-eef0-4dce-9f1f-d6b77a951924" containerName="extract-utilities" Jan 31 09:15:59 crc kubenswrapper[4732]: I0131 09:15:59.605092 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="33f21797-eef0-4dce-9f1f-d6b77a951924" containerName="extract-utilities" Jan 31 09:15:59 crc kubenswrapper[4732]: I0131 09:15:59.605194 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="2febcab9-3048-4ea6-bd0d-ce40bf6bcda8" containerName="registry-server" Jan 31 09:15:59 crc kubenswrapper[4732]: I0131 09:15:59.605205 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a9d87a5-c953-483d-8183-0a0b8d4abac9" containerName="extract" Jan 31 09:15:59 crc kubenswrapper[4732]: I0131 09:15:59.605215 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="33f21797-eef0-4dce-9f1f-d6b77a951924" containerName="registry-server" Jan 31 09:15:59 crc kubenswrapper[4732]: I0131 09:15:59.605613 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-665b569d9f-rjs9c" Jan 31 09:15:59 crc kubenswrapper[4732]: I0131 09:15:59.608830 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-service-cert" Jan 31 09:15:59 crc kubenswrapper[4732]: I0131 09:15:59.609638 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-k8l4b" Jan 31 09:15:59 crc kubenswrapper[4732]: I0131 09:15:59.620506 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-665b569d9f-rjs9c"] Jan 31 09:15:59 crc kubenswrapper[4732]: I0131 09:15:59.794639 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4ef2e0aa-0782-4814-9d5f-9a6a32fb121b-apiservice-cert\") pod \"barbican-operator-controller-manager-665b569d9f-rjs9c\" (UID: \"4ef2e0aa-0782-4814-9d5f-9a6a32fb121b\") " pod="openstack-operators/barbican-operator-controller-manager-665b569d9f-rjs9c" Jan 31 09:15:59 crc kubenswrapper[4732]: I0131 09:15:59.794877 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwr5w\" (UniqueName: \"kubernetes.io/projected/4ef2e0aa-0782-4814-9d5f-9a6a32fb121b-kube-api-access-vwr5w\") pod \"barbican-operator-controller-manager-665b569d9f-rjs9c\" (UID: \"4ef2e0aa-0782-4814-9d5f-9a6a32fb121b\") " pod="openstack-operators/barbican-operator-controller-manager-665b569d9f-rjs9c" Jan 31 09:15:59 crc kubenswrapper[4732]: I0131 09:15:59.794967 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4ef2e0aa-0782-4814-9d5f-9a6a32fb121b-webhook-cert\") pod \"barbican-operator-controller-manager-665b569d9f-rjs9c\" (UID: \"4ef2e0aa-0782-4814-9d5f-9a6a32fb121b\") " pod="openstack-operators/barbican-operator-controller-manager-665b569d9f-rjs9c" Jan 31 09:15:59 crc kubenswrapper[4732]: I0131 09:15:59.841872 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/keystone-db-sync-vj5z4" event={"ID":"1568a5da-d308-4b7e-94b6-99c846371cb8","Type":"ContainerStarted","Data":"c9325e278cfd071d34b0338c159d59fb628048d898065c645b245c3766ab28e7"} Jan 31 09:15:59 crc kubenswrapper[4732]: I0131 09:15:59.859935 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/keystone-db-sync-vj5z4" podStartSLOduration=2.086817483 podStartE2EDuration="30.859893374s" podCreationTimestamp="2026-01-31 09:15:29 +0000 UTC" firstStartedPulling="2026-01-31 09:15:30.204511756 +0000 UTC m=+868.510387960" lastFinishedPulling="2026-01-31 09:15:58.977587647 +0000 UTC m=+897.283463851" observedRunningTime="2026-01-31 09:15:59.856998192 +0000 UTC m=+898.162874416" watchObservedRunningTime="2026-01-31 09:15:59.859893374 +0000 UTC m=+898.165769588" Jan 31 09:15:59 crc kubenswrapper[4732]: I0131 09:15:59.909700 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4ef2e0aa-0782-4814-9d5f-9a6a32fb121b-webhook-cert\") pod \"barbican-operator-controller-manager-665b569d9f-rjs9c\" (UID: \"4ef2e0aa-0782-4814-9d5f-9a6a32fb121b\") " pod="openstack-operators/barbican-operator-controller-manager-665b569d9f-rjs9c" Jan 31 09:15:59 crc kubenswrapper[4732]: 
I0131 09:15:59.909845 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4ef2e0aa-0782-4814-9d5f-9a6a32fb121b-apiservice-cert\") pod \"barbican-operator-controller-manager-665b569d9f-rjs9c\" (UID: \"4ef2e0aa-0782-4814-9d5f-9a6a32fb121b\") " pod="openstack-operators/barbican-operator-controller-manager-665b569d9f-rjs9c" Jan 31 09:15:59 crc kubenswrapper[4732]: I0131 09:15:59.910058 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwr5w\" (UniqueName: \"kubernetes.io/projected/4ef2e0aa-0782-4814-9d5f-9a6a32fb121b-kube-api-access-vwr5w\") pod \"barbican-operator-controller-manager-665b569d9f-rjs9c\" (UID: \"4ef2e0aa-0782-4814-9d5f-9a6a32fb121b\") " pod="openstack-operators/barbican-operator-controller-manager-665b569d9f-rjs9c" Jan 31 09:15:59 crc kubenswrapper[4732]: I0131 09:15:59.916654 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4ef2e0aa-0782-4814-9d5f-9a6a32fb121b-apiservice-cert\") pod \"barbican-operator-controller-manager-665b569d9f-rjs9c\" (UID: \"4ef2e0aa-0782-4814-9d5f-9a6a32fb121b\") " pod="openstack-operators/barbican-operator-controller-manager-665b569d9f-rjs9c" Jan 31 09:15:59 crc kubenswrapper[4732]: I0131 09:15:59.921116 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4ef2e0aa-0782-4814-9d5f-9a6a32fb121b-webhook-cert\") pod \"barbican-operator-controller-manager-665b569d9f-rjs9c\" (UID: \"4ef2e0aa-0782-4814-9d5f-9a6a32fb121b\") " pod="openstack-operators/barbican-operator-controller-manager-665b569d9f-rjs9c" Jan 31 09:15:59 crc kubenswrapper[4732]: I0131 09:15:59.932101 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwr5w\" (UniqueName: \"kubernetes.io/projected/4ef2e0aa-0782-4814-9d5f-9a6a32fb121b-kube-api-access-vwr5w\") pod \"barbican-operator-controller-manager-665b569d9f-rjs9c\" (UID: \"4ef2e0aa-0782-4814-9d5f-9a6a32fb121b\") " pod="openstack-operators/barbican-operator-controller-manager-665b569d9f-rjs9c" Jan 31 09:16:00 crc kubenswrapper[4732]: I0131 09:16:00.223561 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-665b569d9f-rjs9c" Jan 31 09:16:00 crc kubenswrapper[4732]: I0131 09:16:00.675247 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-665b569d9f-rjs9c"] Jan 31 09:16:00 crc kubenswrapper[4732]: I0131 09:16:00.847719 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-665b569d9f-rjs9c" event={"ID":"4ef2e0aa-0782-4814-9d5f-9a6a32fb121b","Type":"ContainerStarted","Data":"f54629922c72c3da7a2c23ec9c364e1132ebace0c630660cc9bfe34f06a31d4f"} Jan 31 09:16:03 crc kubenswrapper[4732]: I0131 09:16:03.246745 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6qdc6"] Jan 31 09:16:03 crc kubenswrapper[4732]: I0131 09:16:03.248574 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6qdc6" Jan 31 09:16:03 crc kubenswrapper[4732]: I0131 09:16:03.256324 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6qdc6"] Jan 31 09:16:03 crc kubenswrapper[4732]: I0131 09:16:03.380150 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7wmh\" (UniqueName: \"kubernetes.io/projected/5d0de300-63ee-4f40-9750-7eb7d5d10466-kube-api-access-n7wmh\") pod \"certified-operators-6qdc6\" (UID: \"5d0de300-63ee-4f40-9750-7eb7d5d10466\") " pod="openshift-marketplace/certified-operators-6qdc6" Jan 31 09:16:03 crc kubenswrapper[4732]: I0131 09:16:03.380228 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d0de300-63ee-4f40-9750-7eb7d5d10466-utilities\") pod \"certified-operators-6qdc6\" (UID: \"5d0de300-63ee-4f40-9750-7eb7d5d10466\") " pod="openshift-marketplace/certified-operators-6qdc6" Jan 31 09:16:03 crc kubenswrapper[4732]: I0131 09:16:03.380352 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d0de300-63ee-4f40-9750-7eb7d5d10466-catalog-content\") pod \"certified-operators-6qdc6\" (UID: \"5d0de300-63ee-4f40-9750-7eb7d5d10466\") " pod="openshift-marketplace/certified-operators-6qdc6" Jan 31 09:16:03 crc kubenswrapper[4732]: I0131 09:16:03.482253 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7wmh\" (UniqueName: \"kubernetes.io/projected/5d0de300-63ee-4f40-9750-7eb7d5d10466-kube-api-access-n7wmh\") pod \"certified-operators-6qdc6\" (UID: \"5d0de300-63ee-4f40-9750-7eb7d5d10466\") " pod="openshift-marketplace/certified-operators-6qdc6" Jan 31 09:16:03 crc kubenswrapper[4732]: I0131 09:16:03.482347 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d0de300-63ee-4f40-9750-7eb7d5d10466-utilities\") pod \"certified-operators-6qdc6\" (UID: \"5d0de300-63ee-4f40-9750-7eb7d5d10466\") " pod="openshift-marketplace/certified-operators-6qdc6" Jan 31 09:16:03 crc kubenswrapper[4732]: I0131 09:16:03.482384 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d0de300-63ee-4f40-9750-7eb7d5d10466-catalog-content\") pod \"certified-operators-6qdc6\" (UID: \"5d0de300-63ee-4f40-9750-7eb7d5d10466\") " pod="openshift-marketplace/certified-operators-6qdc6" Jan 31 09:16:03 crc kubenswrapper[4732]: I0131 09:16:03.483121 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d0de300-63ee-4f40-9750-7eb7d5d10466-catalog-content\") pod \"certified-operators-6qdc6\" (UID: \"5d0de300-63ee-4f40-9750-7eb7d5d10466\") " pod="openshift-marketplace/certified-operators-6qdc6" Jan 31 09:16:03 crc kubenswrapper[4732]: I0131 09:16:03.485791 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d0de300-63ee-4f40-9750-7eb7d5d10466-utilities\") pod \"certified-operators-6qdc6\" (UID: \"5d0de300-63ee-4f40-9750-7eb7d5d10466\") " pod="openshift-marketplace/certified-operators-6qdc6" Jan 31 09:16:03 crc kubenswrapper[4732]: I0131 09:16:03.504012 4732 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-n7wmh\" (UniqueName: \"kubernetes.io/projected/5d0de300-63ee-4f40-9750-7eb7d5d10466-kube-api-access-n7wmh\") pod \"certified-operators-6qdc6\" (UID: \"5d0de300-63ee-4f40-9750-7eb7d5d10466\") " pod="openshift-marketplace/certified-operators-6qdc6" Jan 31 09:16:03 crc kubenswrapper[4732]: I0131 09:16:03.569024 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6qdc6" Jan 31 09:16:04 crc kubenswrapper[4732]: I0131 09:16:04.296401 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6qdc6"] Jan 31 09:16:04 crc kubenswrapper[4732]: I0131 09:16:04.875999 4732 generic.go:334] "Generic (PLEG): container finished" podID="5d0de300-63ee-4f40-9750-7eb7d5d10466" containerID="1fa43d1eeb8c60e26dee8d78abfd035753a71d50bc43bc0b9b3bff5eca6e2540" exitCode=0 Jan 31 09:16:04 crc kubenswrapper[4732]: I0131 09:16:04.876044 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6qdc6" event={"ID":"5d0de300-63ee-4f40-9750-7eb7d5d10466","Type":"ContainerDied","Data":"1fa43d1eeb8c60e26dee8d78abfd035753a71d50bc43bc0b9b3bff5eca6e2540"} Jan 31 09:16:04 crc kubenswrapper[4732]: I0131 09:16:04.876384 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6qdc6" event={"ID":"5d0de300-63ee-4f40-9750-7eb7d5d10466","Type":"ContainerStarted","Data":"a14dfd56c323ab432efc86e11648711e57a20ae7c23a10d2f26d38d578ad14dc"} Jan 31 09:16:04 crc kubenswrapper[4732]: I0131 09:16:04.879343 4732 generic.go:334] "Generic (PLEG): container finished" podID="1568a5da-d308-4b7e-94b6-99c846371cb8" containerID="c9325e278cfd071d34b0338c159d59fb628048d898065c645b245c3766ab28e7" exitCode=0 Jan 31 09:16:04 crc kubenswrapper[4732]: I0131 09:16:04.879390 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/keystone-db-sync-vj5z4" event={"ID":"1568a5da-d308-4b7e-94b6-99c846371cb8","Type":"ContainerDied","Data":"c9325e278cfd071d34b0338c159d59fb628048d898065c645b245c3766ab28e7"} Jan 31 09:16:04 crc kubenswrapper[4732]: I0131 09:16:04.881141 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-665b569d9f-rjs9c" event={"ID":"4ef2e0aa-0782-4814-9d5f-9a6a32fb121b","Type":"ContainerStarted","Data":"613e26fde16172e6df7958455ae9310666a808f14ae1a1a48a42003388cc5dc1"} Jan 31 09:16:04 crc kubenswrapper[4732]: I0131 09:16:04.881294 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-665b569d9f-rjs9c" Jan 31 09:16:04 crc kubenswrapper[4732]: I0131 09:16:04.930194 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-665b569d9f-rjs9c" podStartSLOduration=2.802392515 podStartE2EDuration="5.93017462s" podCreationTimestamp="2026-01-31 09:15:59 +0000 UTC" firstStartedPulling="2026-01-31 09:16:00.708012768 +0000 UTC m=+899.013888972" lastFinishedPulling="2026-01-31 09:16:03.835794873 +0000 UTC m=+902.141671077" observedRunningTime="2026-01-31 09:16:04.926806963 +0000 UTC m=+903.232683167" watchObservedRunningTime="2026-01-31 09:16:04.93017462 +0000 UTC m=+903.236050824" Jan 31 09:16:05 crc kubenswrapper[4732]: I0131 09:16:05.889064 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6qdc6" 
event={"ID":"5d0de300-63ee-4f40-9750-7eb7d5d10466","Type":"ContainerStarted","Data":"a84f6527d8e718a0c39029fc384c436e9dc135176afd05bca86ac5610704533f"} Jan 31 09:16:06 crc kubenswrapper[4732]: I0131 09:16:06.166241 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/keystone-db-sync-vj5z4" Jan 31 09:16:06 crc kubenswrapper[4732]: I0131 09:16:06.219310 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54rwm\" (UniqueName: \"kubernetes.io/projected/1568a5da-d308-4b7e-94b6-99c846371cb8-kube-api-access-54rwm\") pod \"1568a5da-d308-4b7e-94b6-99c846371cb8\" (UID: \"1568a5da-d308-4b7e-94b6-99c846371cb8\") " Jan 31 09:16:06 crc kubenswrapper[4732]: I0131 09:16:06.219400 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1568a5da-d308-4b7e-94b6-99c846371cb8-config-data\") pod \"1568a5da-d308-4b7e-94b6-99c846371cb8\" (UID: \"1568a5da-d308-4b7e-94b6-99c846371cb8\") " Jan 31 09:16:06 crc kubenswrapper[4732]: I0131 09:16:06.226314 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1568a5da-d308-4b7e-94b6-99c846371cb8-kube-api-access-54rwm" (OuterVolumeSpecName: "kube-api-access-54rwm") pod "1568a5da-d308-4b7e-94b6-99c846371cb8" (UID: "1568a5da-d308-4b7e-94b6-99c846371cb8"). InnerVolumeSpecName "kube-api-access-54rwm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:16:06 crc kubenswrapper[4732]: I0131 09:16:06.249631 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1568a5da-d308-4b7e-94b6-99c846371cb8-config-data" (OuterVolumeSpecName: "config-data") pod "1568a5da-d308-4b7e-94b6-99c846371cb8" (UID: "1568a5da-d308-4b7e-94b6-99c846371cb8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:16:06 crc kubenswrapper[4732]: I0131 09:16:06.321268 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-54rwm\" (UniqueName: \"kubernetes.io/projected/1568a5da-d308-4b7e-94b6-99c846371cb8-kube-api-access-54rwm\") on node \"crc\" DevicePath \"\"" Jan 31 09:16:06 crc kubenswrapper[4732]: I0131 09:16:06.321312 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1568a5da-d308-4b7e-94b6-99c846371cb8-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 09:16:06 crc kubenswrapper[4732]: I0131 09:16:06.901606 4732 generic.go:334] "Generic (PLEG): container finished" podID="5d0de300-63ee-4f40-9750-7eb7d5d10466" containerID="a84f6527d8e718a0c39029fc384c436e9dc135176afd05bca86ac5610704533f" exitCode=0 Jan 31 09:16:06 crc kubenswrapper[4732]: I0131 09:16:06.901691 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6qdc6" event={"ID":"5d0de300-63ee-4f40-9750-7eb7d5d10466","Type":"ContainerDied","Data":"a84f6527d8e718a0c39029fc384c436e9dc135176afd05bca86ac5610704533f"} Jan 31 09:16:06 crc kubenswrapper[4732]: I0131 09:16:06.904943 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/keystone-db-sync-vj5z4" event={"ID":"1568a5da-d308-4b7e-94b6-99c846371cb8","Type":"ContainerDied","Data":"c31583a1029ffae07fced86c20dea255707864cf3d8bba913ecc7e7f43951a8d"} Jan 31 09:16:06 crc kubenswrapper[4732]: I0131 09:16:06.904987 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c31583a1029ffae07fced86c20dea255707864cf3d8bba913ecc7e7f43951a8d" Jan 31 09:16:06 crc kubenswrapper[4732]: I0131 09:16:06.905036 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/keystone-db-sync-vj5z4" Jan 31 09:16:07 crc kubenswrapper[4732]: I0131 09:16:07.103165 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/keystone-bootstrap-r757b"] Jan 31 09:16:07 crc kubenswrapper[4732]: E0131 09:16:07.103701 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1568a5da-d308-4b7e-94b6-99c846371cb8" containerName="keystone-db-sync" Jan 31 09:16:07 crc kubenswrapper[4732]: I0131 09:16:07.103717 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="1568a5da-d308-4b7e-94b6-99c846371cb8" containerName="keystone-db-sync" Jan 31 09:16:07 crc kubenswrapper[4732]: I0131 09:16:07.103834 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="1568a5da-d308-4b7e-94b6-99c846371cb8" containerName="keystone-db-sync" Jan 31 09:16:07 crc kubenswrapper[4732]: I0131 09:16:07.104248 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/keystone-bootstrap-r757b" Jan 31 09:16:07 crc kubenswrapper[4732]: I0131 09:16:07.105804 4732 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"osp-secret" Jan 31 09:16:07 crc kubenswrapper[4732]: I0131 09:16:07.106358 4732 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"keystone-config-data" Jan 31 09:16:07 crc kubenswrapper[4732]: I0131 09:16:07.106443 4732 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"keystone" Jan 31 09:16:07 crc kubenswrapper[4732]: I0131 09:16:07.106618 4732 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"keystone-scripts" Jan 31 09:16:07 crc kubenswrapper[4732]: I0131 09:16:07.107190 4732 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"keystone-keystone-dockercfg-2sjhb" Jan 31 09:16:07 crc kubenswrapper[4732]: I0131 09:16:07.113226 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/keystone-bootstrap-r757b"] Jan 31 09:16:07 crc kubenswrapper[4732]: I0131 09:16:07.234636 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/65a47d3d-88f0-48b4-b672-9b224ead785f-fernet-keys\") pod \"keystone-bootstrap-r757b\" (UID: \"65a47d3d-88f0-48b4-b672-9b224ead785f\") " pod="swift-kuttl-tests/keystone-bootstrap-r757b" Jan 31 09:16:07 crc kubenswrapper[4732]: I0131 09:16:07.234707 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65a47d3d-88f0-48b4-b672-9b224ead785f-config-data\") pod \"keystone-bootstrap-r757b\" (UID: \"65a47d3d-88f0-48b4-b672-9b224ead785f\") " pod="swift-kuttl-tests/keystone-bootstrap-r757b" Jan 31 09:16:07 crc kubenswrapper[4732]: I0131 09:16:07.234732 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7dm2\" (UniqueName: \"kubernetes.io/projected/65a47d3d-88f0-48b4-b672-9b224ead785f-kube-api-access-s7dm2\") pod \"keystone-bootstrap-r757b\" (UID: \"65a47d3d-88f0-48b4-b672-9b224ead785f\") " pod="swift-kuttl-tests/keystone-bootstrap-r757b" Jan 31 09:16:07 crc kubenswrapper[4732]: I0131 09:16:07.234856 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65a47d3d-88f0-48b4-b672-9b224ead785f-scripts\") pod \"keystone-bootstrap-r757b\" (UID: \"65a47d3d-88f0-48b4-b672-9b224ead785f\") " pod="swift-kuttl-tests/keystone-bootstrap-r757b" Jan 31 09:16:07 crc kubenswrapper[4732]: I0131 09:16:07.234897 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/65a47d3d-88f0-48b4-b672-9b224ead785f-credential-keys\") pod \"keystone-bootstrap-r757b\" (UID: \"65a47d3d-88f0-48b4-b672-9b224ead785f\") " pod="swift-kuttl-tests/keystone-bootstrap-r757b" Jan 31 09:16:07 crc kubenswrapper[4732]: I0131 09:16:07.336465 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65a47d3d-88f0-48b4-b672-9b224ead785f-scripts\") pod \"keystone-bootstrap-r757b\" (UID: \"65a47d3d-88f0-48b4-b672-9b224ead785f\") " pod="swift-kuttl-tests/keystone-bootstrap-r757b" Jan 31 09:16:07 crc kubenswrapper[4732]: I0131 09:16:07.336527 4732 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/65a47d3d-88f0-48b4-b672-9b224ead785f-credential-keys\") pod \"keystone-bootstrap-r757b\" (UID: \"65a47d3d-88f0-48b4-b672-9b224ead785f\") " pod="swift-kuttl-tests/keystone-bootstrap-r757b" Jan 31 09:16:07 crc kubenswrapper[4732]: I0131 09:16:07.336561 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/65a47d3d-88f0-48b4-b672-9b224ead785f-fernet-keys\") pod \"keystone-bootstrap-r757b\" (UID: \"65a47d3d-88f0-48b4-b672-9b224ead785f\") " pod="swift-kuttl-tests/keystone-bootstrap-r757b" Jan 31 09:16:07 crc kubenswrapper[4732]: I0131 09:16:07.336578 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65a47d3d-88f0-48b4-b672-9b224ead785f-config-data\") pod \"keystone-bootstrap-r757b\" (UID: \"65a47d3d-88f0-48b4-b672-9b224ead785f\") " pod="swift-kuttl-tests/keystone-bootstrap-r757b" Jan 31 09:16:07 crc kubenswrapper[4732]: I0131 09:16:07.336601 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7dm2\" (UniqueName: \"kubernetes.io/projected/65a47d3d-88f0-48b4-b672-9b224ead785f-kube-api-access-s7dm2\") pod \"keystone-bootstrap-r757b\" (UID: \"65a47d3d-88f0-48b4-b672-9b224ead785f\") " pod="swift-kuttl-tests/keystone-bootstrap-r757b" Jan 31 09:16:07 crc kubenswrapper[4732]: I0131 09:16:07.342447 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/65a47d3d-88f0-48b4-b672-9b224ead785f-credential-keys\") pod \"keystone-bootstrap-r757b\" (UID: \"65a47d3d-88f0-48b4-b672-9b224ead785f\") " pod="swift-kuttl-tests/keystone-bootstrap-r757b" Jan 31 09:16:07 crc kubenswrapper[4732]: I0131 09:16:07.344138 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/65a47d3d-88f0-48b4-b672-9b224ead785f-fernet-keys\") pod \"keystone-bootstrap-r757b\" (UID: \"65a47d3d-88f0-48b4-b672-9b224ead785f\") " pod="swift-kuttl-tests/keystone-bootstrap-r757b" Jan 31 09:16:07 crc kubenswrapper[4732]: I0131 09:16:07.352070 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65a47d3d-88f0-48b4-b672-9b224ead785f-scripts\") pod \"keystone-bootstrap-r757b\" (UID: \"65a47d3d-88f0-48b4-b672-9b224ead785f\") " pod="swift-kuttl-tests/keystone-bootstrap-r757b" Jan 31 09:16:07 crc kubenswrapper[4732]: I0131 09:16:07.352491 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65a47d3d-88f0-48b4-b672-9b224ead785f-config-data\") pod \"keystone-bootstrap-r757b\" (UID: \"65a47d3d-88f0-48b4-b672-9b224ead785f\") " pod="swift-kuttl-tests/keystone-bootstrap-r757b" Jan 31 09:16:07 crc kubenswrapper[4732]: I0131 09:16:07.357322 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7dm2\" (UniqueName: \"kubernetes.io/projected/65a47d3d-88f0-48b4-b672-9b224ead785f-kube-api-access-s7dm2\") pod \"keystone-bootstrap-r757b\" (UID: \"65a47d3d-88f0-48b4-b672-9b224ead785f\") " pod="swift-kuttl-tests/keystone-bootstrap-r757b" Jan 31 09:16:07 crc kubenswrapper[4732]: I0131 09:16:07.428339 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/keystone-bootstrap-r757b" Jan 31 09:16:07 crc kubenswrapper[4732]: I0131 09:16:07.869472 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/keystone-bootstrap-r757b"] Jan 31 09:16:07 crc kubenswrapper[4732]: W0131 09:16:07.888949 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod65a47d3d_88f0_48b4_b672_9b224ead785f.slice/crio-2972059b4750f41cbe4b42855dfe50fb8a59b3da0d14b490d2a8c37cd4ca4736 WatchSource:0}: Error finding container 2972059b4750f41cbe4b42855dfe50fb8a59b3da0d14b490d2a8c37cd4ca4736: Status 404 returned error can't find the container with id 2972059b4750f41cbe4b42855dfe50fb8a59b3da0d14b490d2a8c37cd4ca4736 Jan 31 09:16:07 crc kubenswrapper[4732]: I0131 09:16:07.947897 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/keystone-bootstrap-r757b" event={"ID":"65a47d3d-88f0-48b4-b672-9b224ead785f","Type":"ContainerStarted","Data":"2972059b4750f41cbe4b42855dfe50fb8a59b3da0d14b490d2a8c37cd4ca4736"} Jan 31 09:16:08 crc kubenswrapper[4732]: I0131 09:16:08.959453 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6qdc6" event={"ID":"5d0de300-63ee-4f40-9750-7eb7d5d10466","Type":"ContainerStarted","Data":"ee0c5fb66ad6b1e6b4f42d6ec9cbbd6de131ce1fe2d36272c309841e4eaa9b41"} Jan 31 09:16:08 crc kubenswrapper[4732]: I0131 09:16:08.962466 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/keystone-bootstrap-r757b" event={"ID":"65a47d3d-88f0-48b4-b672-9b224ead785f","Type":"ContainerStarted","Data":"09dedf44a2244d0aa2f52cb5add6cbbb78a1014d9997dba9dbd36b2416c85acf"} Jan 31 09:16:08 crc kubenswrapper[4732]: I0131 09:16:08.981617 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6qdc6" podStartSLOduration=3.042276297 podStartE2EDuration="5.981602288s" podCreationTimestamp="2026-01-31 09:16:03 +0000 UTC" firstStartedPulling="2026-01-31 09:16:04.877454952 +0000 UTC m=+903.183331166" lastFinishedPulling="2026-01-31 09:16:07.816780953 +0000 UTC m=+906.122657157" observedRunningTime="2026-01-31 09:16:08.978711047 +0000 UTC m=+907.284587261" watchObservedRunningTime="2026-01-31 09:16:08.981602288 +0000 UTC m=+907.287478492" Jan 31 09:16:08 crc kubenswrapper[4732]: I0131 09:16:08.999873 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/keystone-bootstrap-r757b" podStartSLOduration=1.999857046 podStartE2EDuration="1.999857046s" podCreationTimestamp="2026-01-31 09:16:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:16:08.994038921 +0000 UTC m=+907.299915125" watchObservedRunningTime="2026-01-31 09:16:08.999857046 +0000 UTC m=+907.305733250" Jan 31 09:16:10 crc kubenswrapper[4732]: I0131 09:16:10.230703 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-665b569d9f-rjs9c" Jan 31 09:16:11 crc kubenswrapper[4732]: I0131 09:16:11.645336 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-b4r9l"] Jan 31 09:16:11 crc kubenswrapper[4732]: I0131 09:16:11.646949 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-b4r9l" Jan 31 09:16:11 crc kubenswrapper[4732]: I0131 09:16:11.660520 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b4r9l"] Jan 31 09:16:11 crc kubenswrapper[4732]: I0131 09:16:11.724316 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dd27r\" (UniqueName: \"kubernetes.io/projected/30d15a7c-e3f9-4280-b0d3-39264c464abe-kube-api-access-dd27r\") pod \"community-operators-b4r9l\" (UID: \"30d15a7c-e3f9-4280-b0d3-39264c464abe\") " pod="openshift-marketplace/community-operators-b4r9l" Jan 31 09:16:11 crc kubenswrapper[4732]: I0131 09:16:11.724443 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30d15a7c-e3f9-4280-b0d3-39264c464abe-utilities\") pod \"community-operators-b4r9l\" (UID: \"30d15a7c-e3f9-4280-b0d3-39264c464abe\") " pod="openshift-marketplace/community-operators-b4r9l" Jan 31 09:16:11 crc kubenswrapper[4732]: I0131 09:16:11.724506 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30d15a7c-e3f9-4280-b0d3-39264c464abe-catalog-content\") pod \"community-operators-b4r9l\" (UID: \"30d15a7c-e3f9-4280-b0d3-39264c464abe\") " pod="openshift-marketplace/community-operators-b4r9l" Jan 31 09:16:11 crc kubenswrapper[4732]: I0131 09:16:11.826282 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30d15a7c-e3f9-4280-b0d3-39264c464abe-utilities\") pod \"community-operators-b4r9l\" (UID: \"30d15a7c-e3f9-4280-b0d3-39264c464abe\") " pod="openshift-marketplace/community-operators-b4r9l" Jan 31 09:16:11 crc kubenswrapper[4732]: I0131 09:16:11.826582 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30d15a7c-e3f9-4280-b0d3-39264c464abe-catalog-content\") pod \"community-operators-b4r9l\" (UID: \"30d15a7c-e3f9-4280-b0d3-39264c464abe\") " pod="openshift-marketplace/community-operators-b4r9l" Jan 31 09:16:11 crc kubenswrapper[4732]: I0131 09:16:11.826612 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dd27r\" (UniqueName: \"kubernetes.io/projected/30d15a7c-e3f9-4280-b0d3-39264c464abe-kube-api-access-dd27r\") pod \"community-operators-b4r9l\" (UID: \"30d15a7c-e3f9-4280-b0d3-39264c464abe\") " pod="openshift-marketplace/community-operators-b4r9l" Jan 31 09:16:11 crc kubenswrapper[4732]: I0131 09:16:11.826923 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30d15a7c-e3f9-4280-b0d3-39264c464abe-utilities\") pod \"community-operators-b4r9l\" (UID: \"30d15a7c-e3f9-4280-b0d3-39264c464abe\") " pod="openshift-marketplace/community-operators-b4r9l" Jan 31 09:16:11 crc kubenswrapper[4732]: I0131 09:16:11.827182 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30d15a7c-e3f9-4280-b0d3-39264c464abe-catalog-content\") pod \"community-operators-b4r9l\" (UID: \"30d15a7c-e3f9-4280-b0d3-39264c464abe\") " pod="openshift-marketplace/community-operators-b4r9l" Jan 31 09:16:11 crc kubenswrapper[4732]: I0131 09:16:11.847965 4732 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-dd27r\" (UniqueName: \"kubernetes.io/projected/30d15a7c-e3f9-4280-b0d3-39264c464abe-kube-api-access-dd27r\") pod \"community-operators-b4r9l\" (UID: \"30d15a7c-e3f9-4280-b0d3-39264c464abe\") " pod="openshift-marketplace/community-operators-b4r9l" Jan 31 09:16:11 crc kubenswrapper[4732]: I0131 09:16:11.967117 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b4r9l" Jan 31 09:16:12 crc kubenswrapper[4732]: I0131 09:16:12.438173 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b4r9l"] Jan 31 09:16:12 crc kubenswrapper[4732]: I0131 09:16:12.991641 4732 generic.go:334] "Generic (PLEG): container finished" podID="30d15a7c-e3f9-4280-b0d3-39264c464abe" containerID="cc7bdcfd3c4cf42fbbd1013cc743457dedaafba9028f44513b712048f986520e" exitCode=0 Jan 31 09:16:12 crc kubenswrapper[4732]: I0131 09:16:12.991703 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b4r9l" event={"ID":"30d15a7c-e3f9-4280-b0d3-39264c464abe","Type":"ContainerDied","Data":"cc7bdcfd3c4cf42fbbd1013cc743457dedaafba9028f44513b712048f986520e"} Jan 31 09:16:12 crc kubenswrapper[4732]: I0131 09:16:12.991969 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b4r9l" event={"ID":"30d15a7c-e3f9-4280-b0d3-39264c464abe","Type":"ContainerStarted","Data":"d41d176201ea47fe3677b834f0031fb82b32fbeaf46af88d3969bcb67324887b"} Jan 31 09:16:13 crc kubenswrapper[4732]: I0131 09:16:13.569632 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6qdc6" Jan 31 09:16:13 crc kubenswrapper[4732]: I0131 09:16:13.569687 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6qdc6" Jan 31 09:16:13 crc kubenswrapper[4732]: I0131 09:16:13.679896 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6qdc6" Jan 31 09:16:14 crc kubenswrapper[4732]: I0131 09:16:13.999851 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b4r9l" event={"ID":"30d15a7c-e3f9-4280-b0d3-39264c464abe","Type":"ContainerStarted","Data":"f8fa3bcb0182e184d81f9b95c68a6ba376603e25fdaca53ae4f6c19daf283637"} Jan 31 09:16:14 crc kubenswrapper[4732]: I0131 09:16:14.041266 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6qdc6" Jan 31 09:16:15 crc kubenswrapper[4732]: I0131 09:16:15.010004 4732 generic.go:334] "Generic (PLEG): container finished" podID="65a47d3d-88f0-48b4-b672-9b224ead785f" containerID="09dedf44a2244d0aa2f52cb5add6cbbb78a1014d9997dba9dbd36b2416c85acf" exitCode=0 Jan 31 09:16:15 crc kubenswrapper[4732]: I0131 09:16:15.010094 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/keystone-bootstrap-r757b" event={"ID":"65a47d3d-88f0-48b4-b672-9b224ead785f","Type":"ContainerDied","Data":"09dedf44a2244d0aa2f52cb5add6cbbb78a1014d9997dba9dbd36b2416c85acf"} Jan 31 09:16:15 crc kubenswrapper[4732]: I0131 09:16:15.013790 4732 generic.go:334] "Generic (PLEG): container finished" podID="30d15a7c-e3f9-4280-b0d3-39264c464abe" containerID="f8fa3bcb0182e184d81f9b95c68a6ba376603e25fdaca53ae4f6c19daf283637" exitCode=0 Jan 31 09:16:15 crc kubenswrapper[4732]: I0131 09:16:15.013902 4732 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b4r9l" event={"ID":"30d15a7c-e3f9-4280-b0d3-39264c464abe","Type":"ContainerDied","Data":"f8fa3bcb0182e184d81f9b95c68a6ba376603e25fdaca53ae4f6c19daf283637"} Jan 31 09:16:15 crc kubenswrapper[4732]: I0131 09:16:15.835616 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/barbican-db-create-hffnv"] Jan 31 09:16:15 crc kubenswrapper[4732]: I0131 09:16:15.836913 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/barbican-db-create-hffnv" Jan 31 09:16:15 crc kubenswrapper[4732]: I0131 09:16:15.843850 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/barbican-f734-account-create-update-wkhs4"] Jan 31 09:16:15 crc kubenswrapper[4732]: I0131 09:16:15.844592 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/barbican-f734-account-create-update-wkhs4" Jan 31 09:16:15 crc kubenswrapper[4732]: I0131 09:16:15.854719 4732 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"barbican-db-secret" Jan 31 09:16:15 crc kubenswrapper[4732]: I0131 09:16:15.864167 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/barbican-db-create-hffnv"] Jan 31 09:16:15 crc kubenswrapper[4732]: I0131 09:16:15.883823 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/barbican-f734-account-create-update-wkhs4"] Jan 31 09:16:15 crc kubenswrapper[4732]: I0131 09:16:15.992192 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d54d8c7-230a-4831-97a8-d17aef7fa6eb-operator-scripts\") pod \"barbican-db-create-hffnv\" (UID: \"6d54d8c7-230a-4831-97a8-d17aef7fa6eb\") " pod="swift-kuttl-tests/barbican-db-create-hffnv" Jan 31 09:16:15 crc kubenswrapper[4732]: I0131 09:16:15.992788 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf9e4683-b288-4b28-9b9b-504461c55a4e-operator-scripts\") pod \"barbican-f734-account-create-update-wkhs4\" (UID: \"bf9e4683-b288-4b28-9b9b-504461c55a4e\") " pod="swift-kuttl-tests/barbican-f734-account-create-update-wkhs4" Jan 31 09:16:15 crc kubenswrapper[4732]: I0131 09:16:15.992943 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pb9kr\" (UniqueName: \"kubernetes.io/projected/6d54d8c7-230a-4831-97a8-d17aef7fa6eb-kube-api-access-pb9kr\") pod \"barbican-db-create-hffnv\" (UID: \"6d54d8c7-230a-4831-97a8-d17aef7fa6eb\") " pod="swift-kuttl-tests/barbican-db-create-hffnv" Jan 31 09:16:15 crc kubenswrapper[4732]: I0131 09:16:15.992984 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpxkb\" (UniqueName: \"kubernetes.io/projected/bf9e4683-b288-4b28-9b9b-504461c55a4e-kube-api-access-lpxkb\") pod \"barbican-f734-account-create-update-wkhs4\" (UID: \"bf9e4683-b288-4b28-9b9b-504461c55a4e\") " pod="swift-kuttl-tests/barbican-f734-account-create-update-wkhs4" Jan 31 09:16:16 crc kubenswrapper[4732]: I0131 09:16:16.021343 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b4r9l" 
event={"ID":"30d15a7c-e3f9-4280-b0d3-39264c464abe","Type":"ContainerStarted","Data":"aaaf66b396de81920559109560059a25ed5b958b5791039a68fba6ae2b9ed583"} Jan 31 09:16:16 crc kubenswrapper[4732]: I0131 09:16:16.052742 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-b4r9l" podStartSLOduration=2.624408368 podStartE2EDuration="5.052714321s" podCreationTimestamp="2026-01-31 09:16:11 +0000 UTC" firstStartedPulling="2026-01-31 09:16:12.992971039 +0000 UTC m=+911.298847243" lastFinishedPulling="2026-01-31 09:16:15.421276992 +0000 UTC m=+913.727153196" observedRunningTime="2026-01-31 09:16:16.047524737 +0000 UTC m=+914.353400941" watchObservedRunningTime="2026-01-31 09:16:16.052714321 +0000 UTC m=+914.358590535" Jan 31 09:16:16 crc kubenswrapper[4732]: I0131 09:16:16.094842 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pb9kr\" (UniqueName: \"kubernetes.io/projected/6d54d8c7-230a-4831-97a8-d17aef7fa6eb-kube-api-access-pb9kr\") pod \"barbican-db-create-hffnv\" (UID: \"6d54d8c7-230a-4831-97a8-d17aef7fa6eb\") " pod="swift-kuttl-tests/barbican-db-create-hffnv" Jan 31 09:16:16 crc kubenswrapper[4732]: I0131 09:16:16.094920 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpxkb\" (UniqueName: \"kubernetes.io/projected/bf9e4683-b288-4b28-9b9b-504461c55a4e-kube-api-access-lpxkb\") pod \"barbican-f734-account-create-update-wkhs4\" (UID: \"bf9e4683-b288-4b28-9b9b-504461c55a4e\") " pod="swift-kuttl-tests/barbican-f734-account-create-update-wkhs4" Jan 31 09:16:16 crc kubenswrapper[4732]: I0131 09:16:16.094987 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d54d8c7-230a-4831-97a8-d17aef7fa6eb-operator-scripts\") pod \"barbican-db-create-hffnv\" (UID: \"6d54d8c7-230a-4831-97a8-d17aef7fa6eb\") " pod="swift-kuttl-tests/barbican-db-create-hffnv" Jan 31 09:16:16 crc kubenswrapper[4732]: I0131 09:16:16.095111 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf9e4683-b288-4b28-9b9b-504461c55a4e-operator-scripts\") pod \"barbican-f734-account-create-update-wkhs4\" (UID: \"bf9e4683-b288-4b28-9b9b-504461c55a4e\") " pod="swift-kuttl-tests/barbican-f734-account-create-update-wkhs4" Jan 31 09:16:16 crc kubenswrapper[4732]: I0131 09:16:16.095926 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d54d8c7-230a-4831-97a8-d17aef7fa6eb-operator-scripts\") pod \"barbican-db-create-hffnv\" (UID: \"6d54d8c7-230a-4831-97a8-d17aef7fa6eb\") " pod="swift-kuttl-tests/barbican-db-create-hffnv" Jan 31 09:16:16 crc kubenswrapper[4732]: I0131 09:16:16.096053 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf9e4683-b288-4b28-9b9b-504461c55a4e-operator-scripts\") pod \"barbican-f734-account-create-update-wkhs4\" (UID: \"bf9e4683-b288-4b28-9b9b-504461c55a4e\") " pod="swift-kuttl-tests/barbican-f734-account-create-update-wkhs4" Jan 31 09:16:16 crc kubenswrapper[4732]: I0131 09:16:16.115388 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpxkb\" (UniqueName: \"kubernetes.io/projected/bf9e4683-b288-4b28-9b9b-504461c55a4e-kube-api-access-lpxkb\") pod 
\"barbican-f734-account-create-update-wkhs4\" (UID: \"bf9e4683-b288-4b28-9b9b-504461c55a4e\") " pod="swift-kuttl-tests/barbican-f734-account-create-update-wkhs4" Jan 31 09:16:16 crc kubenswrapper[4732]: I0131 09:16:16.121878 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pb9kr\" (UniqueName: \"kubernetes.io/projected/6d54d8c7-230a-4831-97a8-d17aef7fa6eb-kube-api-access-pb9kr\") pod \"barbican-db-create-hffnv\" (UID: \"6d54d8c7-230a-4831-97a8-d17aef7fa6eb\") " pod="swift-kuttl-tests/barbican-db-create-hffnv" Jan 31 09:16:16 crc kubenswrapper[4732]: I0131 09:16:16.153809 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/barbican-db-create-hffnv" Jan 31 09:16:16 crc kubenswrapper[4732]: I0131 09:16:16.161972 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/barbican-f734-account-create-update-wkhs4" Jan 31 09:16:16 crc kubenswrapper[4732]: I0131 09:16:16.333390 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/keystone-bootstrap-r757b" Jan 31 09:16:16 crc kubenswrapper[4732]: I0131 09:16:16.506547 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/65a47d3d-88f0-48b4-b672-9b224ead785f-fernet-keys\") pod \"65a47d3d-88f0-48b4-b672-9b224ead785f\" (UID: \"65a47d3d-88f0-48b4-b672-9b224ead785f\") " Jan 31 09:16:16 crc kubenswrapper[4732]: I0131 09:16:16.506592 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65a47d3d-88f0-48b4-b672-9b224ead785f-scripts\") pod \"65a47d3d-88f0-48b4-b672-9b224ead785f\" (UID: \"65a47d3d-88f0-48b4-b672-9b224ead785f\") " Jan 31 09:16:16 crc kubenswrapper[4732]: I0131 09:16:16.506611 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65a47d3d-88f0-48b4-b672-9b224ead785f-config-data\") pod \"65a47d3d-88f0-48b4-b672-9b224ead785f\" (UID: \"65a47d3d-88f0-48b4-b672-9b224ead785f\") " Jan 31 09:16:16 crc kubenswrapper[4732]: I0131 09:16:16.506699 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/65a47d3d-88f0-48b4-b672-9b224ead785f-credential-keys\") pod \"65a47d3d-88f0-48b4-b672-9b224ead785f\" (UID: \"65a47d3d-88f0-48b4-b672-9b224ead785f\") " Jan 31 09:16:16 crc kubenswrapper[4732]: I0131 09:16:16.506735 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7dm2\" (UniqueName: \"kubernetes.io/projected/65a47d3d-88f0-48b4-b672-9b224ead785f-kube-api-access-s7dm2\") pod \"65a47d3d-88f0-48b4-b672-9b224ead785f\" (UID: \"65a47d3d-88f0-48b4-b672-9b224ead785f\") " Jan 31 09:16:16 crc kubenswrapper[4732]: I0131 09:16:16.510991 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65a47d3d-88f0-48b4-b672-9b224ead785f-kube-api-access-s7dm2" (OuterVolumeSpecName: "kube-api-access-s7dm2") pod "65a47d3d-88f0-48b4-b672-9b224ead785f" (UID: "65a47d3d-88f0-48b4-b672-9b224ead785f"). InnerVolumeSpecName "kube-api-access-s7dm2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:16:16 crc kubenswrapper[4732]: I0131 09:16:16.511473 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65a47d3d-88f0-48b4-b672-9b224ead785f-scripts" (OuterVolumeSpecName: "scripts") pod "65a47d3d-88f0-48b4-b672-9b224ead785f" (UID: "65a47d3d-88f0-48b4-b672-9b224ead785f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:16:16 crc kubenswrapper[4732]: I0131 09:16:16.511590 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65a47d3d-88f0-48b4-b672-9b224ead785f-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "65a47d3d-88f0-48b4-b672-9b224ead785f" (UID: "65a47d3d-88f0-48b4-b672-9b224ead785f"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:16:16 crc kubenswrapper[4732]: I0131 09:16:16.511780 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65a47d3d-88f0-48b4-b672-9b224ead785f-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "65a47d3d-88f0-48b4-b672-9b224ead785f" (UID: "65a47d3d-88f0-48b4-b672-9b224ead785f"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:16:16 crc kubenswrapper[4732]: I0131 09:16:16.528258 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65a47d3d-88f0-48b4-b672-9b224ead785f-config-data" (OuterVolumeSpecName: "config-data") pod "65a47d3d-88f0-48b4-b672-9b224ead785f" (UID: "65a47d3d-88f0-48b4-b672-9b224ead785f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:16:16 crc kubenswrapper[4732]: I0131 09:16:16.608374 4732 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/65a47d3d-88f0-48b4-b672-9b224ead785f-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 31 09:16:16 crc kubenswrapper[4732]: I0131 09:16:16.608422 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65a47d3d-88f0-48b4-b672-9b224ead785f-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 09:16:16 crc kubenswrapper[4732]: I0131 09:16:16.608434 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65a47d3d-88f0-48b4-b672-9b224ead785f-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:16:16 crc kubenswrapper[4732]: I0131 09:16:16.608447 4732 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/65a47d3d-88f0-48b4-b672-9b224ead785f-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 31 09:16:16 crc kubenswrapper[4732]: I0131 09:16:16.608462 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s7dm2\" (UniqueName: \"kubernetes.io/projected/65a47d3d-88f0-48b4-b672-9b224ead785f-kube-api-access-s7dm2\") on node \"crc\" DevicePath \"\"" Jan 31 09:16:16 crc kubenswrapper[4732]: W0131 09:16:16.629962 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d54d8c7_230a_4831_97a8_d17aef7fa6eb.slice/crio-ba499657f2f30c6dcb7e8a9c8ad38f6d7bf22c0624c460b0a91dbca9a6242663 WatchSource:0}: Error finding container ba499657f2f30c6dcb7e8a9c8ad38f6d7bf22c0624c460b0a91dbca9a6242663: Status 404 returned error can't find the container with 
id ba499657f2f30c6dcb7e8a9c8ad38f6d7bf22c0624c460b0a91dbca9a6242663 Jan 31 09:16:16 crc kubenswrapper[4732]: I0131 09:16:16.632140 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/barbican-db-create-hffnv"] Jan 31 09:16:16 crc kubenswrapper[4732]: I0131 09:16:16.690306 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/barbican-f734-account-create-update-wkhs4"] Jan 31 09:16:17 crc kubenswrapper[4732]: I0131 09:16:17.032486 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/barbican-db-create-hffnv" event={"ID":"6d54d8c7-230a-4831-97a8-d17aef7fa6eb","Type":"ContainerStarted","Data":"fe93f24b1b1ca8fe4c671105ac0fbeb376225843e31d3614ced260876fce3216"} Jan 31 09:16:17 crc kubenswrapper[4732]: I0131 09:16:17.032535 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/barbican-db-create-hffnv" event={"ID":"6d54d8c7-230a-4831-97a8-d17aef7fa6eb","Type":"ContainerStarted","Data":"ba499657f2f30c6dcb7e8a9c8ad38f6d7bf22c0624c460b0a91dbca9a6242663"} Jan 31 09:16:17 crc kubenswrapper[4732]: I0131 09:16:17.035372 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/barbican-f734-account-create-update-wkhs4" event={"ID":"bf9e4683-b288-4b28-9b9b-504461c55a4e","Type":"ContainerStarted","Data":"8c02dfae66fd3a4f5b4725177171b29a50b8756093929474b400347824b72530"} Jan 31 09:16:17 crc kubenswrapper[4732]: I0131 09:16:17.035401 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/barbican-f734-account-create-update-wkhs4" event={"ID":"bf9e4683-b288-4b28-9b9b-504461c55a4e","Type":"ContainerStarted","Data":"3022480a202c2c38f4b2f13a0e2f0a94f0e4c039b5760a147d6502f6f41d8cef"} Jan 31 09:16:17 crc kubenswrapper[4732]: I0131 09:16:17.037717 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/keystone-bootstrap-r757b" Jan 31 09:16:17 crc kubenswrapper[4732]: I0131 09:16:17.037746 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/keystone-bootstrap-r757b" event={"ID":"65a47d3d-88f0-48b4-b672-9b224ead785f","Type":"ContainerDied","Data":"2972059b4750f41cbe4b42855dfe50fb8a59b3da0d14b490d2a8c37cd4ca4736"} Jan 31 09:16:17 crc kubenswrapper[4732]: I0131 09:16:17.037761 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2972059b4750f41cbe4b42855dfe50fb8a59b3da0d14b490d2a8c37cd4ca4736" Jan 31 09:16:17 crc kubenswrapper[4732]: I0131 09:16:17.063370 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/barbican-db-create-hffnv" podStartSLOduration=2.063349338 podStartE2EDuration="2.063349338s" podCreationTimestamp="2026-01-31 09:16:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:16:17.061068396 +0000 UTC m=+915.366944610" watchObservedRunningTime="2026-01-31 09:16:17.063349338 +0000 UTC m=+915.369225542" Jan 31 09:16:17 crc kubenswrapper[4732]: I0131 09:16:17.081448 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/barbican-f734-account-create-update-wkhs4" podStartSLOduration=2.08143095 podStartE2EDuration="2.08143095s" podCreationTimestamp="2026-01-31 09:16:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:16:17.078214538 +0000 UTC m=+915.384090742" watchObservedRunningTime="2026-01-31 09:16:17.08143095 +0000 UTC m=+915.387307154" Jan 31 09:16:17 crc kubenswrapper[4732]: I0131 09:16:17.183927 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/keystone-7959cb4f8b-4vxsg"] Jan 31 09:16:17 crc kubenswrapper[4732]: E0131 09:16:17.184238 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65a47d3d-88f0-48b4-b672-9b224ead785f" containerName="keystone-bootstrap" Jan 31 09:16:17 crc kubenswrapper[4732]: I0131 09:16:17.184263 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="65a47d3d-88f0-48b4-b672-9b224ead785f" containerName="keystone-bootstrap" Jan 31 09:16:17 crc kubenswrapper[4732]: I0131 09:16:17.184439 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="65a47d3d-88f0-48b4-b672-9b224ead785f" containerName="keystone-bootstrap" Jan 31 09:16:17 crc kubenswrapper[4732]: I0131 09:16:17.185035 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/keystone-7959cb4f8b-4vxsg" Jan 31 09:16:17 crc kubenswrapper[4732]: I0131 09:16:17.187622 4732 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"keystone-scripts" Jan 31 09:16:17 crc kubenswrapper[4732]: I0131 09:16:17.187818 4732 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"keystone-keystone-dockercfg-2sjhb" Jan 31 09:16:17 crc kubenswrapper[4732]: I0131 09:16:17.187929 4732 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"keystone-config-data" Jan 31 09:16:17 crc kubenswrapper[4732]: I0131 09:16:17.189381 4732 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"keystone" Jan 31 09:16:17 crc kubenswrapper[4732]: I0131 09:16:17.197703 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/keystone-7959cb4f8b-4vxsg"] Jan 31 09:16:17 crc kubenswrapper[4732]: I0131 09:16:17.329593 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ee84530-efd7-4d83-9aa2-fb9b8b178496-config-data\") pod \"keystone-7959cb4f8b-4vxsg\" (UID: \"1ee84530-efd7-4d83-9aa2-fb9b8b178496\") " pod="swift-kuttl-tests/keystone-7959cb4f8b-4vxsg" Jan 31 09:16:17 crc kubenswrapper[4732]: I0131 09:16:17.329649 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1ee84530-efd7-4d83-9aa2-fb9b8b178496-fernet-keys\") pod \"keystone-7959cb4f8b-4vxsg\" (UID: \"1ee84530-efd7-4d83-9aa2-fb9b8b178496\") " pod="swift-kuttl-tests/keystone-7959cb4f8b-4vxsg" Jan 31 09:16:17 crc kubenswrapper[4732]: I0131 09:16:17.329691 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ee84530-efd7-4d83-9aa2-fb9b8b178496-scripts\") pod \"keystone-7959cb4f8b-4vxsg\" (UID: \"1ee84530-efd7-4d83-9aa2-fb9b8b178496\") " pod="swift-kuttl-tests/keystone-7959cb4f8b-4vxsg" Jan 31 09:16:17 crc kubenswrapper[4732]: I0131 09:16:17.329954 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsj4z\" (UniqueName: \"kubernetes.io/projected/1ee84530-efd7-4d83-9aa2-fb9b8b178496-kube-api-access-jsj4z\") pod \"keystone-7959cb4f8b-4vxsg\" (UID: \"1ee84530-efd7-4d83-9aa2-fb9b8b178496\") " pod="swift-kuttl-tests/keystone-7959cb4f8b-4vxsg" Jan 31 09:16:17 crc kubenswrapper[4732]: I0131 09:16:17.330020 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1ee84530-efd7-4d83-9aa2-fb9b8b178496-credential-keys\") pod \"keystone-7959cb4f8b-4vxsg\" (UID: \"1ee84530-efd7-4d83-9aa2-fb9b8b178496\") " pod="swift-kuttl-tests/keystone-7959cb4f8b-4vxsg" Jan 31 09:16:17 crc kubenswrapper[4732]: I0131 09:16:17.430945 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsj4z\" (UniqueName: \"kubernetes.io/projected/1ee84530-efd7-4d83-9aa2-fb9b8b178496-kube-api-access-jsj4z\") pod \"keystone-7959cb4f8b-4vxsg\" (UID: \"1ee84530-efd7-4d83-9aa2-fb9b8b178496\") " pod="swift-kuttl-tests/keystone-7959cb4f8b-4vxsg" Jan 31 09:16:17 crc kubenswrapper[4732]: I0131 09:16:17.431005 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/1ee84530-efd7-4d83-9aa2-fb9b8b178496-credential-keys\") pod \"keystone-7959cb4f8b-4vxsg\" (UID: \"1ee84530-efd7-4d83-9aa2-fb9b8b178496\") " pod="swift-kuttl-tests/keystone-7959cb4f8b-4vxsg" Jan 31 09:16:17 crc kubenswrapper[4732]: I0131 09:16:17.431052 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ee84530-efd7-4d83-9aa2-fb9b8b178496-config-data\") pod \"keystone-7959cb4f8b-4vxsg\" (UID: \"1ee84530-efd7-4d83-9aa2-fb9b8b178496\") " pod="swift-kuttl-tests/keystone-7959cb4f8b-4vxsg" Jan 31 09:16:17 crc kubenswrapper[4732]: I0131 09:16:17.431094 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1ee84530-efd7-4d83-9aa2-fb9b8b178496-fernet-keys\") pod \"keystone-7959cb4f8b-4vxsg\" (UID: \"1ee84530-efd7-4d83-9aa2-fb9b8b178496\") " pod="swift-kuttl-tests/keystone-7959cb4f8b-4vxsg" Jan 31 09:16:17 crc kubenswrapper[4732]: I0131 09:16:17.431126 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ee84530-efd7-4d83-9aa2-fb9b8b178496-scripts\") pod \"keystone-7959cb4f8b-4vxsg\" (UID: \"1ee84530-efd7-4d83-9aa2-fb9b8b178496\") " pod="swift-kuttl-tests/keystone-7959cb4f8b-4vxsg" Jan 31 09:16:17 crc kubenswrapper[4732]: I0131 09:16:17.436051 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ee84530-efd7-4d83-9aa2-fb9b8b178496-scripts\") pod \"keystone-7959cb4f8b-4vxsg\" (UID: \"1ee84530-efd7-4d83-9aa2-fb9b8b178496\") " pod="swift-kuttl-tests/keystone-7959cb4f8b-4vxsg" Jan 31 09:16:17 crc kubenswrapper[4732]: I0131 09:16:17.436374 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1ee84530-efd7-4d83-9aa2-fb9b8b178496-fernet-keys\") pod \"keystone-7959cb4f8b-4vxsg\" (UID: \"1ee84530-efd7-4d83-9aa2-fb9b8b178496\") " pod="swift-kuttl-tests/keystone-7959cb4f8b-4vxsg" Jan 31 09:16:17 crc kubenswrapper[4732]: I0131 09:16:17.436417 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ee84530-efd7-4d83-9aa2-fb9b8b178496-config-data\") pod \"keystone-7959cb4f8b-4vxsg\" (UID: \"1ee84530-efd7-4d83-9aa2-fb9b8b178496\") " pod="swift-kuttl-tests/keystone-7959cb4f8b-4vxsg" Jan 31 09:16:17 crc kubenswrapper[4732]: I0131 09:16:17.438225 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1ee84530-efd7-4d83-9aa2-fb9b8b178496-credential-keys\") pod \"keystone-7959cb4f8b-4vxsg\" (UID: \"1ee84530-efd7-4d83-9aa2-fb9b8b178496\") " pod="swift-kuttl-tests/keystone-7959cb4f8b-4vxsg" Jan 31 09:16:17 crc kubenswrapper[4732]: I0131 09:16:17.450385 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsj4z\" (UniqueName: \"kubernetes.io/projected/1ee84530-efd7-4d83-9aa2-fb9b8b178496-kube-api-access-jsj4z\") pod \"keystone-7959cb4f8b-4vxsg\" (UID: \"1ee84530-efd7-4d83-9aa2-fb9b8b178496\") " pod="swift-kuttl-tests/keystone-7959cb4f8b-4vxsg" Jan 31 09:16:17 crc kubenswrapper[4732]: I0131 09:16:17.497683 4732 patch_prober.go:28] interesting pod/machine-config-daemon-jnbt8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" start-of-body= Jan 31 09:16:17 crc kubenswrapper[4732]: I0131 09:16:17.497978 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" podUID="7d790207-d357-4b47-87bf-5b505e061820" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 09:16:17 crc kubenswrapper[4732]: I0131 09:16:17.504703 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/keystone-7959cb4f8b-4vxsg" Jan 31 09:16:17 crc kubenswrapper[4732]: I0131 09:16:17.941221 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/keystone-7959cb4f8b-4vxsg"] Jan 31 09:16:18 crc kubenswrapper[4732]: I0131 09:16:18.050121 4732 generic.go:334] "Generic (PLEG): container finished" podID="6d54d8c7-230a-4831-97a8-d17aef7fa6eb" containerID="fe93f24b1b1ca8fe4c671105ac0fbeb376225843e31d3614ced260876fce3216" exitCode=0 Jan 31 09:16:18 crc kubenswrapper[4732]: I0131 09:16:18.050219 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/barbican-db-create-hffnv" event={"ID":"6d54d8c7-230a-4831-97a8-d17aef7fa6eb","Type":"ContainerDied","Data":"fe93f24b1b1ca8fe4c671105ac0fbeb376225843e31d3614ced260876fce3216"} Jan 31 09:16:18 crc kubenswrapper[4732]: I0131 09:16:18.052755 4732 generic.go:334] "Generic (PLEG): container finished" podID="bf9e4683-b288-4b28-9b9b-504461c55a4e" containerID="8c02dfae66fd3a4f5b4725177171b29a50b8756093929474b400347824b72530" exitCode=0 Jan 31 09:16:18 crc kubenswrapper[4732]: I0131 09:16:18.052803 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/barbican-f734-account-create-update-wkhs4" event={"ID":"bf9e4683-b288-4b28-9b9b-504461c55a4e","Type":"ContainerDied","Data":"8c02dfae66fd3a4f5b4725177171b29a50b8756093929474b400347824b72530"} Jan 31 09:16:18 crc kubenswrapper[4732]: I0131 09:16:18.054206 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/keystone-7959cb4f8b-4vxsg" event={"ID":"1ee84530-efd7-4d83-9aa2-fb9b8b178496","Type":"ContainerStarted","Data":"dab5aaa05d738eb5bfa3b09e12be7ced9a61a97b3de7389937699c76857d4ec7"} Jan 31 09:16:18 crc kubenswrapper[4732]: I0131 09:16:18.654898 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6qdc6"] Jan 31 09:16:18 crc kubenswrapper[4732]: I0131 09:16:18.655452 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6qdc6" podUID="5d0de300-63ee-4f40-9750-7eb7d5d10466" containerName="registry-server" containerID="cri-o://ee0c5fb66ad6b1e6b4f42d6ec9cbbd6de131ce1fe2d36272c309841e4eaa9b41" gracePeriod=2 Jan 31 09:16:19 crc kubenswrapper[4732]: I0131 09:16:19.061512 4732 generic.go:334] "Generic (PLEG): container finished" podID="5d0de300-63ee-4f40-9750-7eb7d5d10466" containerID="ee0c5fb66ad6b1e6b4f42d6ec9cbbd6de131ce1fe2d36272c309841e4eaa9b41" exitCode=0 Jan 31 09:16:19 crc kubenswrapper[4732]: I0131 09:16:19.061561 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6qdc6" event={"ID":"5d0de300-63ee-4f40-9750-7eb7d5d10466","Type":"ContainerDied","Data":"ee0c5fb66ad6b1e6b4f42d6ec9cbbd6de131ce1fe2d36272c309841e4eaa9b41"} Jan 31 09:16:19 crc kubenswrapper[4732]: I0131 09:16:19.061584 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-6qdc6" event={"ID":"5d0de300-63ee-4f40-9750-7eb7d5d10466","Type":"ContainerDied","Data":"a14dfd56c323ab432efc86e11648711e57a20ae7c23a10d2f26d38d578ad14dc"} Jan 31 09:16:19 crc kubenswrapper[4732]: I0131 09:16:19.061595 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a14dfd56c323ab432efc86e11648711e57a20ae7c23a10d2f26d38d578ad14dc" Jan 31 09:16:19 crc kubenswrapper[4732]: I0131 09:16:19.064244 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/keystone-7959cb4f8b-4vxsg" event={"ID":"1ee84530-efd7-4d83-9aa2-fb9b8b178496","Type":"ContainerStarted","Data":"2242e86748f2123e7c2371fd3207440c5237f2bf59a11e74432d9fe62c372932"} Jan 31 09:16:19 crc kubenswrapper[4732]: I0131 09:16:19.064346 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="swift-kuttl-tests/keystone-7959cb4f8b-4vxsg" Jan 31 09:16:19 crc kubenswrapper[4732]: I0131 09:16:19.078369 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6qdc6" Jan 31 09:16:19 crc kubenswrapper[4732]: I0131 09:16:19.109224 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/keystone-7959cb4f8b-4vxsg" podStartSLOduration=2.10916686 podStartE2EDuration="2.10916686s" podCreationTimestamp="2026-01-31 09:16:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:16:19.089058164 +0000 UTC m=+917.394934368" watchObservedRunningTime="2026-01-31 09:16:19.10916686 +0000 UTC m=+917.415043064" Jan 31 09:16:19 crc kubenswrapper[4732]: I0131 09:16:19.154356 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d0de300-63ee-4f40-9750-7eb7d5d10466-utilities\") pod \"5d0de300-63ee-4f40-9750-7eb7d5d10466\" (UID: \"5d0de300-63ee-4f40-9750-7eb7d5d10466\") " Jan 31 09:16:19 crc kubenswrapper[4732]: I0131 09:16:19.154789 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d0de300-63ee-4f40-9750-7eb7d5d10466-catalog-content\") pod \"5d0de300-63ee-4f40-9750-7eb7d5d10466\" (UID: \"5d0de300-63ee-4f40-9750-7eb7d5d10466\") " Jan 31 09:16:19 crc kubenswrapper[4732]: I0131 09:16:19.154816 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7wmh\" (UniqueName: \"kubernetes.io/projected/5d0de300-63ee-4f40-9750-7eb7d5d10466-kube-api-access-n7wmh\") pod \"5d0de300-63ee-4f40-9750-7eb7d5d10466\" (UID: \"5d0de300-63ee-4f40-9750-7eb7d5d10466\") " Jan 31 09:16:19 crc kubenswrapper[4732]: I0131 09:16:19.155228 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d0de300-63ee-4f40-9750-7eb7d5d10466-utilities" (OuterVolumeSpecName: "utilities") pod "5d0de300-63ee-4f40-9750-7eb7d5d10466" (UID: "5d0de300-63ee-4f40-9750-7eb7d5d10466"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:16:19 crc kubenswrapper[4732]: I0131 09:16:19.160642 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d0de300-63ee-4f40-9750-7eb7d5d10466-kube-api-access-n7wmh" (OuterVolumeSpecName: "kube-api-access-n7wmh") pod "5d0de300-63ee-4f40-9750-7eb7d5d10466" (UID: "5d0de300-63ee-4f40-9750-7eb7d5d10466"). 
InnerVolumeSpecName "kube-api-access-n7wmh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:16:19 crc kubenswrapper[4732]: I0131 09:16:19.213917 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d0de300-63ee-4f40-9750-7eb7d5d10466-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5d0de300-63ee-4f40-9750-7eb7d5d10466" (UID: "5d0de300-63ee-4f40-9750-7eb7d5d10466"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:16:19 crc kubenswrapper[4732]: I0131 09:16:19.256776 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d0de300-63ee-4f40-9750-7eb7d5d10466-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 09:16:19 crc kubenswrapper[4732]: I0131 09:16:19.256813 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n7wmh\" (UniqueName: \"kubernetes.io/projected/5d0de300-63ee-4f40-9750-7eb7d5d10466-kube-api-access-n7wmh\") on node \"crc\" DevicePath \"\"" Jan 31 09:16:19 crc kubenswrapper[4732]: I0131 09:16:19.256829 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d0de300-63ee-4f40-9750-7eb7d5d10466-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 09:16:19 crc kubenswrapper[4732]: I0131 09:16:19.385145 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/barbican-f734-account-create-update-wkhs4" Jan 31 09:16:19 crc kubenswrapper[4732]: I0131 09:16:19.426566 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/barbican-db-create-hffnv" Jan 31 09:16:19 crc kubenswrapper[4732]: I0131 09:16:19.459731 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf9e4683-b288-4b28-9b9b-504461c55a4e-operator-scripts\") pod \"bf9e4683-b288-4b28-9b9b-504461c55a4e\" (UID: \"bf9e4683-b288-4b28-9b9b-504461c55a4e\") " Jan 31 09:16:19 crc kubenswrapper[4732]: I0131 09:16:19.459818 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lpxkb\" (UniqueName: \"kubernetes.io/projected/bf9e4683-b288-4b28-9b9b-504461c55a4e-kube-api-access-lpxkb\") pod \"bf9e4683-b288-4b28-9b9b-504461c55a4e\" (UID: \"bf9e4683-b288-4b28-9b9b-504461c55a4e\") " Jan 31 09:16:19 crc kubenswrapper[4732]: I0131 09:16:19.460396 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf9e4683-b288-4b28-9b9b-504461c55a4e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bf9e4683-b288-4b28-9b9b-504461c55a4e" (UID: "bf9e4683-b288-4b28-9b9b-504461c55a4e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:16:19 crc kubenswrapper[4732]: I0131 09:16:19.465021 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf9e4683-b288-4b28-9b9b-504461c55a4e-kube-api-access-lpxkb" (OuterVolumeSpecName: "kube-api-access-lpxkb") pod "bf9e4683-b288-4b28-9b9b-504461c55a4e" (UID: "bf9e4683-b288-4b28-9b9b-504461c55a4e"). InnerVolumeSpecName "kube-api-access-lpxkb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:16:19 crc kubenswrapper[4732]: I0131 09:16:19.561142 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d54d8c7-230a-4831-97a8-d17aef7fa6eb-operator-scripts\") pod \"6d54d8c7-230a-4831-97a8-d17aef7fa6eb\" (UID: \"6d54d8c7-230a-4831-97a8-d17aef7fa6eb\") " Jan 31 09:16:19 crc kubenswrapper[4732]: I0131 09:16:19.561259 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pb9kr\" (UniqueName: \"kubernetes.io/projected/6d54d8c7-230a-4831-97a8-d17aef7fa6eb-kube-api-access-pb9kr\") pod \"6d54d8c7-230a-4831-97a8-d17aef7fa6eb\" (UID: \"6d54d8c7-230a-4831-97a8-d17aef7fa6eb\") " Jan 31 09:16:19 crc kubenswrapper[4732]: I0131 09:16:19.561615 4732 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf9e4683-b288-4b28-9b9b-504461c55a4e-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:16:19 crc kubenswrapper[4732]: I0131 09:16:19.561644 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lpxkb\" (UniqueName: \"kubernetes.io/projected/bf9e4683-b288-4b28-9b9b-504461c55a4e-kube-api-access-lpxkb\") on node \"crc\" DevicePath \"\"" Jan 31 09:16:19 crc kubenswrapper[4732]: I0131 09:16:19.561637 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d54d8c7-230a-4831-97a8-d17aef7fa6eb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6d54d8c7-230a-4831-97a8-d17aef7fa6eb" (UID: "6d54d8c7-230a-4831-97a8-d17aef7fa6eb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:16:19 crc kubenswrapper[4732]: I0131 09:16:19.564594 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d54d8c7-230a-4831-97a8-d17aef7fa6eb-kube-api-access-pb9kr" (OuterVolumeSpecName: "kube-api-access-pb9kr") pod "6d54d8c7-230a-4831-97a8-d17aef7fa6eb" (UID: "6d54d8c7-230a-4831-97a8-d17aef7fa6eb"). InnerVolumeSpecName "kube-api-access-pb9kr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:16:19 crc kubenswrapper[4732]: I0131 09:16:19.662412 4732 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d54d8c7-230a-4831-97a8-d17aef7fa6eb-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:16:19 crc kubenswrapper[4732]: I0131 09:16:19.662443 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pb9kr\" (UniqueName: \"kubernetes.io/projected/6d54d8c7-230a-4831-97a8-d17aef7fa6eb-kube-api-access-pb9kr\") on node \"crc\" DevicePath \"\"" Jan 31 09:16:19 crc kubenswrapper[4732]: I0131 09:16:19.849137 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-index-zhkcf"] Jan 31 09:16:19 crc kubenswrapper[4732]: E0131 09:16:19.849422 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d54d8c7-230a-4831-97a8-d17aef7fa6eb" containerName="mariadb-database-create" Jan 31 09:16:19 crc kubenswrapper[4732]: I0131 09:16:19.849438 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d54d8c7-230a-4831-97a8-d17aef7fa6eb" containerName="mariadb-database-create" Jan 31 09:16:19 crc kubenswrapper[4732]: E0131 09:16:19.849454 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d0de300-63ee-4f40-9750-7eb7d5d10466" containerName="registry-server" Jan 31 09:16:19 crc kubenswrapper[4732]: I0131 09:16:19.849461 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d0de300-63ee-4f40-9750-7eb7d5d10466" containerName="registry-server" Jan 31 09:16:19 crc kubenswrapper[4732]: E0131 09:16:19.849474 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d0de300-63ee-4f40-9750-7eb7d5d10466" containerName="extract-content" Jan 31 09:16:19 crc kubenswrapper[4732]: I0131 09:16:19.849482 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d0de300-63ee-4f40-9750-7eb7d5d10466" containerName="extract-content" Jan 31 09:16:19 crc kubenswrapper[4732]: E0131 09:16:19.849492 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf9e4683-b288-4b28-9b9b-504461c55a4e" containerName="mariadb-account-create-update" Jan 31 09:16:19 crc kubenswrapper[4732]: I0131 09:16:19.849499 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf9e4683-b288-4b28-9b9b-504461c55a4e" containerName="mariadb-account-create-update" Jan 31 09:16:19 crc kubenswrapper[4732]: E0131 09:16:19.849519 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d0de300-63ee-4f40-9750-7eb7d5d10466" containerName="extract-utilities" Jan 31 09:16:19 crc kubenswrapper[4732]: I0131 09:16:19.849524 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d0de300-63ee-4f40-9750-7eb7d5d10466" containerName="extract-utilities" Jan 31 09:16:19 crc kubenswrapper[4732]: I0131 09:16:19.849651 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d54d8c7-230a-4831-97a8-d17aef7fa6eb" containerName="mariadb-database-create" Jan 31 09:16:19 crc kubenswrapper[4732]: I0131 09:16:19.849681 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf9e4683-b288-4b28-9b9b-504461c55a4e" containerName="mariadb-account-create-update" Jan 31 09:16:19 crc kubenswrapper[4732]: I0131 09:16:19.849692 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d0de300-63ee-4f40-9750-7eb7d5d10466" containerName="registry-server" Jan 31 09:16:19 crc kubenswrapper[4732]: I0131 09:16:19.850101 4732 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack-operators/swift-operator-index-zhkcf" Jan 31 09:16:19 crc kubenswrapper[4732]: I0131 09:16:19.852531 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-index-dockercfg-lc87l" Jan 31 09:16:19 crc kubenswrapper[4732]: I0131 09:16:19.903706 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-index-zhkcf"] Jan 31 09:16:19 crc kubenswrapper[4732]: I0131 09:16:19.974365 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59577\" (UniqueName: \"kubernetes.io/projected/24dbbc5e-c4aa-4f6a-b6e6-52b3013443cf-kube-api-access-59577\") pod \"swift-operator-index-zhkcf\" (UID: \"24dbbc5e-c4aa-4f6a-b6e6-52b3013443cf\") " pod="openstack-operators/swift-operator-index-zhkcf" Jan 31 09:16:20 crc kubenswrapper[4732]: I0131 09:16:20.071948 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/barbican-db-create-hffnv" event={"ID":"6d54d8c7-230a-4831-97a8-d17aef7fa6eb","Type":"ContainerDied","Data":"ba499657f2f30c6dcb7e8a9c8ad38f6d7bf22c0624c460b0a91dbca9a6242663"} Jan 31 09:16:20 crc kubenswrapper[4732]: I0131 09:16:20.071980 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/barbican-db-create-hffnv" Jan 31 09:16:20 crc kubenswrapper[4732]: I0131 09:16:20.071987 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba499657f2f30c6dcb7e8a9c8ad38f6d7bf22c0624c460b0a91dbca9a6242663" Jan 31 09:16:20 crc kubenswrapper[4732]: I0131 09:16:20.073421 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/barbican-f734-account-create-update-wkhs4" event={"ID":"bf9e4683-b288-4b28-9b9b-504461c55a4e","Type":"ContainerDied","Data":"3022480a202c2c38f4b2f13a0e2f0a94f0e4c039b5760a147d6502f6f41d8cef"} Jan 31 09:16:20 crc kubenswrapper[4732]: I0131 09:16:20.073459 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6qdc6" Jan 31 09:16:20 crc kubenswrapper[4732]: I0131 09:16:20.073471 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3022480a202c2c38f4b2f13a0e2f0a94f0e4c039b5760a147d6502f6f41d8cef" Jan 31 09:16:20 crc kubenswrapper[4732]: I0131 09:16:20.073474 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/barbican-f734-account-create-update-wkhs4" Jan 31 09:16:20 crc kubenswrapper[4732]: I0131 09:16:20.076097 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59577\" (UniqueName: \"kubernetes.io/projected/24dbbc5e-c4aa-4f6a-b6e6-52b3013443cf-kube-api-access-59577\") pod \"swift-operator-index-zhkcf\" (UID: \"24dbbc5e-c4aa-4f6a-b6e6-52b3013443cf\") " pod="openstack-operators/swift-operator-index-zhkcf" Jan 31 09:16:20 crc kubenswrapper[4732]: I0131 09:16:20.116735 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59577\" (UniqueName: \"kubernetes.io/projected/24dbbc5e-c4aa-4f6a-b6e6-52b3013443cf-kube-api-access-59577\") pod \"swift-operator-index-zhkcf\" (UID: \"24dbbc5e-c4aa-4f6a-b6e6-52b3013443cf\") " pod="openstack-operators/swift-operator-index-zhkcf" Jan 31 09:16:20 crc kubenswrapper[4732]: I0131 09:16:20.118700 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6qdc6"] Jan 31 09:16:20 crc kubenswrapper[4732]: I0131 09:16:20.128698 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6qdc6"] Jan 31 09:16:20 crc kubenswrapper[4732]: I0131 09:16:20.185809 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-index-zhkcf" Jan 31 09:16:20 crc kubenswrapper[4732]: I0131 09:16:20.551246 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d0de300-63ee-4f40-9750-7eb7d5d10466" path="/var/lib/kubelet/pods/5d0de300-63ee-4f40-9750-7eb7d5d10466/volumes" Jan 31 09:16:20 crc kubenswrapper[4732]: I0131 09:16:20.593037 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-index-zhkcf"] Jan 31 09:16:21 crc kubenswrapper[4732]: I0131 09:16:21.083840 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-index-zhkcf" event={"ID":"24dbbc5e-c4aa-4f6a-b6e6-52b3013443cf","Type":"ContainerStarted","Data":"8514ba99bedbc8f5d369f906a000fdaa79e2f95c8cdc60f9e5782dca2dcdc8ab"} Jan 31 09:16:21 crc kubenswrapper[4732]: I0131 09:16:21.219225 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/barbican-db-sync-482fl"] Jan 31 09:16:21 crc kubenswrapper[4732]: I0131 09:16:21.220321 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/barbican-db-sync-482fl" Jan 31 09:16:21 crc kubenswrapper[4732]: I0131 09:16:21.229504 4732 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"barbican-barbican-dockercfg-vshjt" Jan 31 09:16:21 crc kubenswrapper[4732]: I0131 09:16:21.229798 4732 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"barbican-config-data" Jan 31 09:16:21 crc kubenswrapper[4732]: I0131 09:16:21.230277 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/barbican-db-sync-482fl"] Jan 31 09:16:21 crc kubenswrapper[4732]: I0131 09:16:21.294557 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/52cae0eb-413d-4365-a717-8039a3e3b99f-db-sync-config-data\") pod \"barbican-db-sync-482fl\" (UID: \"52cae0eb-413d-4365-a717-8039a3e3b99f\") " pod="swift-kuttl-tests/barbican-db-sync-482fl" Jan 31 09:16:21 crc kubenswrapper[4732]: I0131 09:16:21.295022 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-258tr\" (UniqueName: \"kubernetes.io/projected/52cae0eb-413d-4365-a717-8039a3e3b99f-kube-api-access-258tr\") pod \"barbican-db-sync-482fl\" (UID: \"52cae0eb-413d-4365-a717-8039a3e3b99f\") " pod="swift-kuttl-tests/barbican-db-sync-482fl" Jan 31 09:16:21 crc kubenswrapper[4732]: I0131 09:16:21.396156 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/52cae0eb-413d-4365-a717-8039a3e3b99f-db-sync-config-data\") pod \"barbican-db-sync-482fl\" (UID: \"52cae0eb-413d-4365-a717-8039a3e3b99f\") " pod="swift-kuttl-tests/barbican-db-sync-482fl" Jan 31 09:16:21 crc kubenswrapper[4732]: I0131 09:16:21.396843 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-258tr\" (UniqueName: \"kubernetes.io/projected/52cae0eb-413d-4365-a717-8039a3e3b99f-kube-api-access-258tr\") pod \"barbican-db-sync-482fl\" (UID: \"52cae0eb-413d-4365-a717-8039a3e3b99f\") " pod="swift-kuttl-tests/barbican-db-sync-482fl" Jan 31 09:16:21 crc kubenswrapper[4732]: I0131 09:16:21.401888 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/52cae0eb-413d-4365-a717-8039a3e3b99f-db-sync-config-data\") pod \"barbican-db-sync-482fl\" (UID: \"52cae0eb-413d-4365-a717-8039a3e3b99f\") " pod="swift-kuttl-tests/barbican-db-sync-482fl" Jan 31 09:16:21 crc kubenswrapper[4732]: I0131 09:16:21.415925 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-258tr\" (UniqueName: \"kubernetes.io/projected/52cae0eb-413d-4365-a717-8039a3e3b99f-kube-api-access-258tr\") pod \"barbican-db-sync-482fl\" (UID: \"52cae0eb-413d-4365-a717-8039a3e3b99f\") " pod="swift-kuttl-tests/barbican-db-sync-482fl" Jan 31 09:16:21 crc kubenswrapper[4732]: I0131 09:16:21.548263 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/barbican-db-sync-482fl" Jan 31 09:16:21 crc kubenswrapper[4732]: I0131 09:16:21.967916 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-b4r9l" Jan 31 09:16:21 crc kubenswrapper[4732]: I0131 09:16:21.967964 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-b4r9l" Jan 31 09:16:22 crc kubenswrapper[4732]: I0131 09:16:22.015310 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-b4r9l" Jan 31 09:16:22 crc kubenswrapper[4732]: I0131 09:16:22.127391 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-b4r9l" Jan 31 09:16:22 crc kubenswrapper[4732]: I0131 09:16:22.654584 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/barbican-db-sync-482fl"] Jan 31 09:16:22 crc kubenswrapper[4732]: W0131 09:16:22.944705 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52cae0eb_413d_4365_a717_8039a3e3b99f.slice/crio-fb6ccbaa8edb2b56ccc907d46d09438babcaeb68d1e54e31b63b92be8948762d WatchSource:0}: Error finding container fb6ccbaa8edb2b56ccc907d46d09438babcaeb68d1e54e31b63b92be8948762d: Status 404 returned error can't find the container with id fb6ccbaa8edb2b56ccc907d46d09438babcaeb68d1e54e31b63b92be8948762d Jan 31 09:16:23 crc kubenswrapper[4732]: I0131 09:16:23.101254 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/barbican-db-sync-482fl" event={"ID":"52cae0eb-413d-4365-a717-8039a3e3b99f","Type":"ContainerStarted","Data":"fb6ccbaa8edb2b56ccc907d46d09438babcaeb68d1e54e31b63b92be8948762d"} Jan 31 09:16:23 crc kubenswrapper[4732]: I0131 09:16:23.437320 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-b4r9l"] Jan 31 09:16:24 crc kubenswrapper[4732]: I0131 09:16:24.111928 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-index-zhkcf" event={"ID":"24dbbc5e-c4aa-4f6a-b6e6-52b3013443cf","Type":"ContainerStarted","Data":"27e7fcfe2e3b7ddaff4e021897bbeb8cb28014e689190c68e97e8b5269f0181f"} Jan 31 09:16:24 crc kubenswrapper[4732]: I0131 09:16:24.133170 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-index-zhkcf" podStartSLOduration=2.735526209 podStartE2EDuration="5.133138771s" podCreationTimestamp="2026-01-31 09:16:19 +0000 UTC" firstStartedPulling="2026-01-31 09:16:20.596597653 +0000 UTC m=+918.902473867" lastFinishedPulling="2026-01-31 09:16:22.994210225 +0000 UTC m=+921.300086429" observedRunningTime="2026-01-31 09:16:24.126206431 +0000 UTC m=+922.432082635" watchObservedRunningTime="2026-01-31 09:16:24.133138771 +0000 UTC m=+922.439015015" Jan 31 09:16:25 crc kubenswrapper[4732]: I0131 09:16:25.118054 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-b4r9l" podUID="30d15a7c-e3f9-4280-b0d3-39264c464abe" containerName="registry-server" containerID="cri-o://aaaf66b396de81920559109560059a25ed5b958b5791039a68fba6ae2b9ed583" gracePeriod=2 Jan 31 09:16:26 crc kubenswrapper[4732]: I0131 09:16:26.126199 4732 generic.go:334] "Generic (PLEG): container finished" podID="30d15a7c-e3f9-4280-b0d3-39264c464abe" 
containerID="aaaf66b396de81920559109560059a25ed5b958b5791039a68fba6ae2b9ed583" exitCode=0 Jan 31 09:16:26 crc kubenswrapper[4732]: I0131 09:16:26.126280 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b4r9l" event={"ID":"30d15a7c-e3f9-4280-b0d3-39264c464abe","Type":"ContainerDied","Data":"aaaf66b396de81920559109560059a25ed5b958b5791039a68fba6ae2b9ed583"} Jan 31 09:16:26 crc kubenswrapper[4732]: I0131 09:16:26.873912 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b4r9l" Jan 31 09:16:26 crc kubenswrapper[4732]: I0131 09:16:26.979043 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30d15a7c-e3f9-4280-b0d3-39264c464abe-utilities\") pod \"30d15a7c-e3f9-4280-b0d3-39264c464abe\" (UID: \"30d15a7c-e3f9-4280-b0d3-39264c464abe\") " Jan 31 09:16:26 crc kubenswrapper[4732]: I0131 09:16:26.979128 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30d15a7c-e3f9-4280-b0d3-39264c464abe-catalog-content\") pod \"30d15a7c-e3f9-4280-b0d3-39264c464abe\" (UID: \"30d15a7c-e3f9-4280-b0d3-39264c464abe\") " Jan 31 09:16:26 crc kubenswrapper[4732]: I0131 09:16:26.979296 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dd27r\" (UniqueName: \"kubernetes.io/projected/30d15a7c-e3f9-4280-b0d3-39264c464abe-kube-api-access-dd27r\") pod \"30d15a7c-e3f9-4280-b0d3-39264c464abe\" (UID: \"30d15a7c-e3f9-4280-b0d3-39264c464abe\") " Jan 31 09:16:26 crc kubenswrapper[4732]: I0131 09:16:26.980459 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30d15a7c-e3f9-4280-b0d3-39264c464abe-utilities" (OuterVolumeSpecName: "utilities") pod "30d15a7c-e3f9-4280-b0d3-39264c464abe" (UID: "30d15a7c-e3f9-4280-b0d3-39264c464abe"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:16:26 crc kubenswrapper[4732]: I0131 09:16:26.985785 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30d15a7c-e3f9-4280-b0d3-39264c464abe-kube-api-access-dd27r" (OuterVolumeSpecName: "kube-api-access-dd27r") pod "30d15a7c-e3f9-4280-b0d3-39264c464abe" (UID: "30d15a7c-e3f9-4280-b0d3-39264c464abe"). InnerVolumeSpecName "kube-api-access-dd27r". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:16:27 crc kubenswrapper[4732]: I0131 09:16:27.034934 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30d15a7c-e3f9-4280-b0d3-39264c464abe-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "30d15a7c-e3f9-4280-b0d3-39264c464abe" (UID: "30d15a7c-e3f9-4280-b0d3-39264c464abe"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:16:27 crc kubenswrapper[4732]: I0131 09:16:27.080594 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30d15a7c-e3f9-4280-b0d3-39264c464abe-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 09:16:27 crc kubenswrapper[4732]: I0131 09:16:27.080628 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30d15a7c-e3f9-4280-b0d3-39264c464abe-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 09:16:27 crc kubenswrapper[4732]: I0131 09:16:27.080642 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dd27r\" (UniqueName: \"kubernetes.io/projected/30d15a7c-e3f9-4280-b0d3-39264c464abe-kube-api-access-dd27r\") on node \"crc\" DevicePath \"\"" Jan 31 09:16:27 crc kubenswrapper[4732]: I0131 09:16:27.139425 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b4r9l" event={"ID":"30d15a7c-e3f9-4280-b0d3-39264c464abe","Type":"ContainerDied","Data":"d41d176201ea47fe3677b834f0031fb82b32fbeaf46af88d3969bcb67324887b"} Jan 31 09:16:27 crc kubenswrapper[4732]: I0131 09:16:27.139445 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b4r9l" Jan 31 09:16:27 crc kubenswrapper[4732]: I0131 09:16:27.139494 4732 scope.go:117] "RemoveContainer" containerID="aaaf66b396de81920559109560059a25ed5b958b5791039a68fba6ae2b9ed583" Jan 31 09:16:27 crc kubenswrapper[4732]: I0131 09:16:27.157073 4732 scope.go:117] "RemoveContainer" containerID="f8fa3bcb0182e184d81f9b95c68a6ba376603e25fdaca53ae4f6c19daf283637" Jan 31 09:16:27 crc kubenswrapper[4732]: I0131 09:16:27.177420 4732 scope.go:117] "RemoveContainer" containerID="cc7bdcfd3c4cf42fbbd1013cc743457dedaafba9028f44513b712048f986520e" Jan 31 09:16:27 crc kubenswrapper[4732]: I0131 09:16:27.178788 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-b4r9l"] Jan 31 09:16:27 crc kubenswrapper[4732]: I0131 09:16:27.184252 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-b4r9l"] Jan 31 09:16:28 crc kubenswrapper[4732]: I0131 09:16:28.150455 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/barbican-db-sync-482fl" event={"ID":"52cae0eb-413d-4365-a717-8039a3e3b99f","Type":"ContainerStarted","Data":"fecab232c8eea055dcb26c6531c6bec4d55c835988b0fcd4b2a823ee72e633d6"} Jan 31 09:16:28 crc kubenswrapper[4732]: I0131 09:16:28.552462 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30d15a7c-e3f9-4280-b0d3-39264c464abe" path="/var/lib/kubelet/pods/30d15a7c-e3f9-4280-b0d3-39264c464abe/volumes" Jan 31 09:16:30 crc kubenswrapper[4732]: I0131 09:16:30.163495 4732 generic.go:334] "Generic (PLEG): container finished" podID="52cae0eb-413d-4365-a717-8039a3e3b99f" containerID="fecab232c8eea055dcb26c6531c6bec4d55c835988b0fcd4b2a823ee72e633d6" exitCode=0 Jan 31 09:16:30 crc kubenswrapper[4732]: I0131 09:16:30.163540 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/barbican-db-sync-482fl" event={"ID":"52cae0eb-413d-4365-a717-8039a3e3b99f","Type":"ContainerDied","Data":"fecab232c8eea055dcb26c6531c6bec4d55c835988b0fcd4b2a823ee72e633d6"} Jan 31 09:16:30 crc kubenswrapper[4732]: I0131 09:16:30.223550 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/swift-operator-index-zhkcf" Jan 31 09:16:30 crc kubenswrapper[4732]: I0131 09:16:30.223978 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/swift-operator-index-zhkcf" Jan 31 09:16:30 crc kubenswrapper[4732]: I0131 09:16:30.266016 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/swift-operator-index-zhkcf" Jan 31 09:16:31 crc kubenswrapper[4732]: I0131 09:16:31.194038 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-index-zhkcf" Jan 31 09:16:31 crc kubenswrapper[4732]: I0131 09:16:31.490280 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/barbican-db-sync-482fl" Jan 31 09:16:31 crc kubenswrapper[4732]: I0131 09:16:31.571262 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/52cae0eb-413d-4365-a717-8039a3e3b99f-db-sync-config-data\") pod \"52cae0eb-413d-4365-a717-8039a3e3b99f\" (UID: \"52cae0eb-413d-4365-a717-8039a3e3b99f\") " Jan 31 09:16:31 crc kubenswrapper[4732]: I0131 09:16:31.571318 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-258tr\" (UniqueName: \"kubernetes.io/projected/52cae0eb-413d-4365-a717-8039a3e3b99f-kube-api-access-258tr\") pod \"52cae0eb-413d-4365-a717-8039a3e3b99f\" (UID: \"52cae0eb-413d-4365-a717-8039a3e3b99f\") " Jan 31 09:16:31 crc kubenswrapper[4732]: I0131 09:16:31.578692 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52cae0eb-413d-4365-a717-8039a3e3b99f-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "52cae0eb-413d-4365-a717-8039a3e3b99f" (UID: "52cae0eb-413d-4365-a717-8039a3e3b99f"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:16:31 crc kubenswrapper[4732]: I0131 09:16:31.578980 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52cae0eb-413d-4365-a717-8039a3e3b99f-kube-api-access-258tr" (OuterVolumeSpecName: "kube-api-access-258tr") pod "52cae0eb-413d-4365-a717-8039a3e3b99f" (UID: "52cae0eb-413d-4365-a717-8039a3e3b99f"). InnerVolumeSpecName "kube-api-access-258tr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:16:31 crc kubenswrapper[4732]: I0131 09:16:31.673613 4732 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/52cae0eb-413d-4365-a717-8039a3e3b99f-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 09:16:31 crc kubenswrapper[4732]: I0131 09:16:31.673673 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-258tr\" (UniqueName: \"kubernetes.io/projected/52cae0eb-413d-4365-a717-8039a3e3b99f-kube-api-access-258tr\") on node \"crc\" DevicePath \"\"" Jan 31 09:16:32 crc kubenswrapper[4732]: I0131 09:16:32.179102 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/barbican-db-sync-482fl" Jan 31 09:16:32 crc kubenswrapper[4732]: I0131 09:16:32.179079 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/barbican-db-sync-482fl" event={"ID":"52cae0eb-413d-4365-a717-8039a3e3b99f","Type":"ContainerDied","Data":"fb6ccbaa8edb2b56ccc907d46d09438babcaeb68d1e54e31b63b92be8948762d"} Jan 31 09:16:32 crc kubenswrapper[4732]: I0131 09:16:32.179162 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb6ccbaa8edb2b56ccc907d46d09438babcaeb68d1e54e31b63b92be8948762d" Jan 31 09:16:32 crc kubenswrapper[4732]: I0131 09:16:32.501735 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/barbican-worker-597f49d4f-8nh24"] Jan 31 09:16:32 crc kubenswrapper[4732]: E0131 09:16:32.502386 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52cae0eb-413d-4365-a717-8039a3e3b99f" containerName="barbican-db-sync" Jan 31 09:16:32 crc kubenswrapper[4732]: I0131 09:16:32.502400 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="52cae0eb-413d-4365-a717-8039a3e3b99f" containerName="barbican-db-sync" Jan 31 09:16:32 crc kubenswrapper[4732]: E0131 09:16:32.502417 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30d15a7c-e3f9-4280-b0d3-39264c464abe" containerName="registry-server" Jan 31 09:16:32 crc kubenswrapper[4732]: I0131 09:16:32.502423 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="30d15a7c-e3f9-4280-b0d3-39264c464abe" containerName="registry-server" Jan 31 09:16:32 crc kubenswrapper[4732]: E0131 09:16:32.502432 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30d15a7c-e3f9-4280-b0d3-39264c464abe" containerName="extract-content" Jan 31 09:16:32 crc kubenswrapper[4732]: I0131 09:16:32.502438 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="30d15a7c-e3f9-4280-b0d3-39264c464abe" containerName="extract-content" Jan 31 09:16:32 crc kubenswrapper[4732]: E0131 09:16:32.502455 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30d15a7c-e3f9-4280-b0d3-39264c464abe" containerName="extract-utilities" Jan 31 09:16:32 crc kubenswrapper[4732]: I0131 09:16:32.502461 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="30d15a7c-e3f9-4280-b0d3-39264c464abe" containerName="extract-utilities" Jan 31 09:16:32 crc kubenswrapper[4732]: I0131 09:16:32.502574 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="52cae0eb-413d-4365-a717-8039a3e3b99f" containerName="barbican-db-sync" Jan 31 09:16:32 crc kubenswrapper[4732]: I0131 09:16:32.502584 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="30d15a7c-e3f9-4280-b0d3-39264c464abe" containerName="registry-server" Jan 31 09:16:32 crc kubenswrapper[4732]: I0131 09:16:32.503373 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/barbican-worker-597f49d4f-8nh24" Jan 31 09:16:32 crc kubenswrapper[4732]: I0131 09:16:32.508569 4732 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"barbican-barbican-dockercfg-vshjt" Jan 31 09:16:32 crc kubenswrapper[4732]: I0131 09:16:32.508751 4732 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"barbican-config-data" Jan 31 09:16:32 crc kubenswrapper[4732]: I0131 09:16:32.508784 4732 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"barbican-worker-config-data" Jan 31 09:16:32 crc kubenswrapper[4732]: I0131 09:16:32.509617 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/barbican-keystone-listener-5f5b7fdb46-c8wb8"] Jan 31 09:16:32 crc kubenswrapper[4732]: I0131 09:16:32.510934 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/barbican-keystone-listener-5f5b7fdb46-c8wb8" Jan 31 09:16:32 crc kubenswrapper[4732]: I0131 09:16:32.512915 4732 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"barbican-keystone-listener-config-data" Jan 31 09:16:32 crc kubenswrapper[4732]: I0131 09:16:32.516899 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/barbican-worker-597f49d4f-8nh24"] Jan 31 09:16:32 crc kubenswrapper[4732]: I0131 09:16:32.530812 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/barbican-keystone-listener-5f5b7fdb46-c8wb8"] Jan 31 09:16:32 crc kubenswrapper[4732]: I0131 09:16:32.585476 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3b1cc40-9985-45d8-bb06-0676ff188c6c-config-data\") pod \"barbican-worker-597f49d4f-8nh24\" (UID: \"d3b1cc40-9985-45d8-bb06-0676ff188c6c\") " pod="swift-kuttl-tests/barbican-worker-597f49d4f-8nh24" Jan 31 09:16:32 crc kubenswrapper[4732]: I0131 09:16:32.585538 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bkhk\" (UniqueName: \"kubernetes.io/projected/7728f3b2-7258-444d-982b-10d416bb61f0-kube-api-access-6bkhk\") pod \"barbican-keystone-listener-5f5b7fdb46-c8wb8\" (UID: \"7728f3b2-7258-444d-982b-10d416bb61f0\") " pod="swift-kuttl-tests/barbican-keystone-listener-5f5b7fdb46-c8wb8" Jan 31 09:16:32 crc kubenswrapper[4732]: I0131 09:16:32.585557 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3b1cc40-9985-45d8-bb06-0676ff188c6c-logs\") pod \"barbican-worker-597f49d4f-8nh24\" (UID: \"d3b1cc40-9985-45d8-bb06-0676ff188c6c\") " pod="swift-kuttl-tests/barbican-worker-597f49d4f-8nh24" Jan 31 09:16:32 crc kubenswrapper[4732]: I0131 09:16:32.585590 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7728f3b2-7258-444d-982b-10d416bb61f0-config-data\") pod \"barbican-keystone-listener-5f5b7fdb46-c8wb8\" (UID: \"7728f3b2-7258-444d-982b-10d416bb61f0\") " pod="swift-kuttl-tests/barbican-keystone-listener-5f5b7fdb46-c8wb8" Jan 31 09:16:32 crc kubenswrapper[4732]: I0131 09:16:32.585620 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/7728f3b2-7258-444d-982b-10d416bb61f0-config-data-custom\") pod \"barbican-keystone-listener-5f5b7fdb46-c8wb8\" (UID: \"7728f3b2-7258-444d-982b-10d416bb61f0\") " pod="swift-kuttl-tests/barbican-keystone-listener-5f5b7fdb46-c8wb8" Jan 31 09:16:32 crc kubenswrapper[4732]: I0131 09:16:32.585641 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5c684\" (UniqueName: \"kubernetes.io/projected/d3b1cc40-9985-45d8-bb06-0676ff188c6c-kube-api-access-5c684\") pod \"barbican-worker-597f49d4f-8nh24\" (UID: \"d3b1cc40-9985-45d8-bb06-0676ff188c6c\") " pod="swift-kuttl-tests/barbican-worker-597f49d4f-8nh24" Jan 31 09:16:32 crc kubenswrapper[4732]: I0131 09:16:32.585690 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d3b1cc40-9985-45d8-bb06-0676ff188c6c-config-data-custom\") pod \"barbican-worker-597f49d4f-8nh24\" (UID: \"d3b1cc40-9985-45d8-bb06-0676ff188c6c\") " pod="swift-kuttl-tests/barbican-worker-597f49d4f-8nh24" Jan 31 09:16:32 crc kubenswrapper[4732]: I0131 09:16:32.585722 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7728f3b2-7258-444d-982b-10d416bb61f0-logs\") pod \"barbican-keystone-listener-5f5b7fdb46-c8wb8\" (UID: \"7728f3b2-7258-444d-982b-10d416bb61f0\") " pod="swift-kuttl-tests/barbican-keystone-listener-5f5b7fdb46-c8wb8" Jan 31 09:16:32 crc kubenswrapper[4732]: I0131 09:16:32.638815 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/barbican-api-5bb4486f46-rmsgq"] Jan 31 09:16:32 crc kubenswrapper[4732]: I0131 09:16:32.639787 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/barbican-api-5bb4486f46-rmsgq" Jan 31 09:16:32 crc kubenswrapper[4732]: I0131 09:16:32.641584 4732 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"barbican-api-config-data" Jan 31 09:16:32 crc kubenswrapper[4732]: I0131 09:16:32.650643 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/barbican-api-5bb4486f46-rmsgq"] Jan 31 09:16:32 crc kubenswrapper[4732]: I0131 09:16:32.687122 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7728f3b2-7258-444d-982b-10d416bb61f0-logs\") pod \"barbican-keystone-listener-5f5b7fdb46-c8wb8\" (UID: \"7728f3b2-7258-444d-982b-10d416bb61f0\") " pod="swift-kuttl-tests/barbican-keystone-listener-5f5b7fdb46-c8wb8" Jan 31 09:16:32 crc kubenswrapper[4732]: I0131 09:16:32.687202 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3b1cc40-9985-45d8-bb06-0676ff188c6c-config-data\") pod \"barbican-worker-597f49d4f-8nh24\" (UID: \"d3b1cc40-9985-45d8-bb06-0676ff188c6c\") " pod="swift-kuttl-tests/barbican-worker-597f49d4f-8nh24" Jan 31 09:16:32 crc kubenswrapper[4732]: I0131 09:16:32.687244 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bkhk\" (UniqueName: \"kubernetes.io/projected/7728f3b2-7258-444d-982b-10d416bb61f0-kube-api-access-6bkhk\") pod \"barbican-keystone-listener-5f5b7fdb46-c8wb8\" (UID: \"7728f3b2-7258-444d-982b-10d416bb61f0\") " pod="swift-kuttl-tests/barbican-keystone-listener-5f5b7fdb46-c8wb8" Jan 31 09:16:32 crc kubenswrapper[4732]: I0131 09:16:32.687271 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4981d9a9-898f-49ff-809d-58c7ca3bd2a3-logs\") pod \"barbican-api-5bb4486f46-rmsgq\" (UID: \"4981d9a9-898f-49ff-809d-58c7ca3bd2a3\") " pod="swift-kuttl-tests/barbican-api-5bb4486f46-rmsgq" Jan 31 09:16:32 crc kubenswrapper[4732]: I0131 09:16:32.687293 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3b1cc40-9985-45d8-bb06-0676ff188c6c-logs\") pod \"barbican-worker-597f49d4f-8nh24\" (UID: \"d3b1cc40-9985-45d8-bb06-0676ff188c6c\") " pod="swift-kuttl-tests/barbican-worker-597f49d4f-8nh24" Jan 31 09:16:32 crc kubenswrapper[4732]: I0131 09:16:32.687319 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4981d9a9-898f-49ff-809d-58c7ca3bd2a3-config-data\") pod \"barbican-api-5bb4486f46-rmsgq\" (UID: \"4981d9a9-898f-49ff-809d-58c7ca3bd2a3\") " pod="swift-kuttl-tests/barbican-api-5bb4486f46-rmsgq" Jan 31 09:16:32 crc kubenswrapper[4732]: I0131 09:16:32.687354 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7728f3b2-7258-444d-982b-10d416bb61f0-config-data\") pod \"barbican-keystone-listener-5f5b7fdb46-c8wb8\" (UID: \"7728f3b2-7258-444d-982b-10d416bb61f0\") " pod="swift-kuttl-tests/barbican-keystone-listener-5f5b7fdb46-c8wb8" Jan 31 09:16:32 crc kubenswrapper[4732]: I0131 09:16:32.687388 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7728f3b2-7258-444d-982b-10d416bb61f0-config-data-custom\") pod 
\"barbican-keystone-listener-5f5b7fdb46-c8wb8\" (UID: \"7728f3b2-7258-444d-982b-10d416bb61f0\") " pod="swift-kuttl-tests/barbican-keystone-listener-5f5b7fdb46-c8wb8" Jan 31 09:16:32 crc kubenswrapper[4732]: I0131 09:16:32.687417 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5c684\" (UniqueName: \"kubernetes.io/projected/d3b1cc40-9985-45d8-bb06-0676ff188c6c-kube-api-access-5c684\") pod \"barbican-worker-597f49d4f-8nh24\" (UID: \"d3b1cc40-9985-45d8-bb06-0676ff188c6c\") " pod="swift-kuttl-tests/barbican-worker-597f49d4f-8nh24" Jan 31 09:16:32 crc kubenswrapper[4732]: I0131 09:16:32.687453 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lp6p\" (UniqueName: \"kubernetes.io/projected/4981d9a9-898f-49ff-809d-58c7ca3bd2a3-kube-api-access-5lp6p\") pod \"barbican-api-5bb4486f46-rmsgq\" (UID: \"4981d9a9-898f-49ff-809d-58c7ca3bd2a3\") " pod="swift-kuttl-tests/barbican-api-5bb4486f46-rmsgq" Jan 31 09:16:32 crc kubenswrapper[4732]: I0131 09:16:32.687474 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4981d9a9-898f-49ff-809d-58c7ca3bd2a3-config-data-custom\") pod \"barbican-api-5bb4486f46-rmsgq\" (UID: \"4981d9a9-898f-49ff-809d-58c7ca3bd2a3\") " pod="swift-kuttl-tests/barbican-api-5bb4486f46-rmsgq" Jan 31 09:16:32 crc kubenswrapper[4732]: I0131 09:16:32.687537 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d3b1cc40-9985-45d8-bb06-0676ff188c6c-config-data-custom\") pod \"barbican-worker-597f49d4f-8nh24\" (UID: \"d3b1cc40-9985-45d8-bb06-0676ff188c6c\") " pod="swift-kuttl-tests/barbican-worker-597f49d4f-8nh24" Jan 31 09:16:32 crc kubenswrapper[4732]: I0131 09:16:32.687876 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3b1cc40-9985-45d8-bb06-0676ff188c6c-logs\") pod \"barbican-worker-597f49d4f-8nh24\" (UID: \"d3b1cc40-9985-45d8-bb06-0676ff188c6c\") " pod="swift-kuttl-tests/barbican-worker-597f49d4f-8nh24" Jan 31 09:16:32 crc kubenswrapper[4732]: I0131 09:16:32.688203 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7728f3b2-7258-444d-982b-10d416bb61f0-logs\") pod \"barbican-keystone-listener-5f5b7fdb46-c8wb8\" (UID: \"7728f3b2-7258-444d-982b-10d416bb61f0\") " pod="swift-kuttl-tests/barbican-keystone-listener-5f5b7fdb46-c8wb8" Jan 31 09:16:32 crc kubenswrapper[4732]: I0131 09:16:32.692189 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7728f3b2-7258-444d-982b-10d416bb61f0-config-data-custom\") pod \"barbican-keystone-listener-5f5b7fdb46-c8wb8\" (UID: \"7728f3b2-7258-444d-982b-10d416bb61f0\") " pod="swift-kuttl-tests/barbican-keystone-listener-5f5b7fdb46-c8wb8" Jan 31 09:16:32 crc kubenswrapper[4732]: I0131 09:16:32.692433 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7728f3b2-7258-444d-982b-10d416bb61f0-config-data\") pod \"barbican-keystone-listener-5f5b7fdb46-c8wb8\" (UID: \"7728f3b2-7258-444d-982b-10d416bb61f0\") " pod="swift-kuttl-tests/barbican-keystone-listener-5f5b7fdb46-c8wb8" Jan 31 09:16:32 crc kubenswrapper[4732]: I0131 09:16:32.693822 4732 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3b1cc40-9985-45d8-bb06-0676ff188c6c-config-data\") pod \"barbican-worker-597f49d4f-8nh24\" (UID: \"d3b1cc40-9985-45d8-bb06-0676ff188c6c\") " pod="swift-kuttl-tests/barbican-worker-597f49d4f-8nh24" Jan 31 09:16:32 crc kubenswrapper[4732]: I0131 09:16:32.700325 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d3b1cc40-9985-45d8-bb06-0676ff188c6c-config-data-custom\") pod \"barbican-worker-597f49d4f-8nh24\" (UID: \"d3b1cc40-9985-45d8-bb06-0676ff188c6c\") " pod="swift-kuttl-tests/barbican-worker-597f49d4f-8nh24" Jan 31 09:16:32 crc kubenswrapper[4732]: I0131 09:16:32.703115 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bkhk\" (UniqueName: \"kubernetes.io/projected/7728f3b2-7258-444d-982b-10d416bb61f0-kube-api-access-6bkhk\") pod \"barbican-keystone-listener-5f5b7fdb46-c8wb8\" (UID: \"7728f3b2-7258-444d-982b-10d416bb61f0\") " pod="swift-kuttl-tests/barbican-keystone-listener-5f5b7fdb46-c8wb8" Jan 31 09:16:32 crc kubenswrapper[4732]: I0131 09:16:32.712599 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5c684\" (UniqueName: \"kubernetes.io/projected/d3b1cc40-9985-45d8-bb06-0676ff188c6c-kube-api-access-5c684\") pod \"barbican-worker-597f49d4f-8nh24\" (UID: \"d3b1cc40-9985-45d8-bb06-0676ff188c6c\") " pod="swift-kuttl-tests/barbican-worker-597f49d4f-8nh24" Jan 31 09:16:32 crc kubenswrapper[4732]: I0131 09:16:32.788403 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4981d9a9-898f-49ff-809d-58c7ca3bd2a3-logs\") pod \"barbican-api-5bb4486f46-rmsgq\" (UID: \"4981d9a9-898f-49ff-809d-58c7ca3bd2a3\") " pod="swift-kuttl-tests/barbican-api-5bb4486f46-rmsgq" Jan 31 09:16:32 crc kubenswrapper[4732]: I0131 09:16:32.788465 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4981d9a9-898f-49ff-809d-58c7ca3bd2a3-config-data\") pod \"barbican-api-5bb4486f46-rmsgq\" (UID: \"4981d9a9-898f-49ff-809d-58c7ca3bd2a3\") " pod="swift-kuttl-tests/barbican-api-5bb4486f46-rmsgq" Jan 31 09:16:32 crc kubenswrapper[4732]: I0131 09:16:32.788546 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lp6p\" (UniqueName: \"kubernetes.io/projected/4981d9a9-898f-49ff-809d-58c7ca3bd2a3-kube-api-access-5lp6p\") pod \"barbican-api-5bb4486f46-rmsgq\" (UID: \"4981d9a9-898f-49ff-809d-58c7ca3bd2a3\") " pod="swift-kuttl-tests/barbican-api-5bb4486f46-rmsgq" Jan 31 09:16:32 crc kubenswrapper[4732]: I0131 09:16:32.788577 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4981d9a9-898f-49ff-809d-58c7ca3bd2a3-config-data-custom\") pod \"barbican-api-5bb4486f46-rmsgq\" (UID: \"4981d9a9-898f-49ff-809d-58c7ca3bd2a3\") " pod="swift-kuttl-tests/barbican-api-5bb4486f46-rmsgq" Jan 31 09:16:32 crc kubenswrapper[4732]: I0131 09:16:32.789459 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4981d9a9-898f-49ff-809d-58c7ca3bd2a3-logs\") pod \"barbican-api-5bb4486f46-rmsgq\" (UID: \"4981d9a9-898f-49ff-809d-58c7ca3bd2a3\") " pod="swift-kuttl-tests/barbican-api-5bb4486f46-rmsgq" Jan 31 09:16:32 crc kubenswrapper[4732]: I0131 09:16:32.794843 
4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4981d9a9-898f-49ff-809d-58c7ca3bd2a3-config-data-custom\") pod \"barbican-api-5bb4486f46-rmsgq\" (UID: \"4981d9a9-898f-49ff-809d-58c7ca3bd2a3\") " pod="swift-kuttl-tests/barbican-api-5bb4486f46-rmsgq" Jan 31 09:16:32 crc kubenswrapper[4732]: I0131 09:16:32.800976 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4981d9a9-898f-49ff-809d-58c7ca3bd2a3-config-data\") pod \"barbican-api-5bb4486f46-rmsgq\" (UID: \"4981d9a9-898f-49ff-809d-58c7ca3bd2a3\") " pod="swift-kuttl-tests/barbican-api-5bb4486f46-rmsgq" Jan 31 09:16:32 crc kubenswrapper[4732]: I0131 09:16:32.820602 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lp6p\" (UniqueName: \"kubernetes.io/projected/4981d9a9-898f-49ff-809d-58c7ca3bd2a3-kube-api-access-5lp6p\") pod \"barbican-api-5bb4486f46-rmsgq\" (UID: \"4981d9a9-898f-49ff-809d-58c7ca3bd2a3\") " pod="swift-kuttl-tests/barbican-api-5bb4486f46-rmsgq" Jan 31 09:16:32 crc kubenswrapper[4732]: I0131 09:16:32.824357 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/barbican-worker-597f49d4f-8nh24" Jan 31 09:16:32 crc kubenswrapper[4732]: I0131 09:16:32.833681 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/barbican-keystone-listener-5f5b7fdb46-c8wb8" Jan 31 09:16:32 crc kubenswrapper[4732]: I0131 09:16:32.959991 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/barbican-api-5bb4486f46-rmsgq" Jan 31 09:16:33 crc kubenswrapper[4732]: I0131 09:16:33.259718 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/barbican-worker-597f49d4f-8nh24"] Jan 31 09:16:33 crc kubenswrapper[4732]: W0131 09:16:33.263184 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd3b1cc40_9985_45d8_bb06_0676ff188c6c.slice/crio-349666c65ccee57c71b989263a2d6e9c2979da105d5b7e91fe8eeec62ef91646 WatchSource:0}: Error finding container 349666c65ccee57c71b989263a2d6e9c2979da105d5b7e91fe8eeec62ef91646: Status 404 returned error can't find the container with id 349666c65ccee57c71b989263a2d6e9c2979da105d5b7e91fe8eeec62ef91646 Jan 31 09:16:33 crc kubenswrapper[4732]: I0131 09:16:33.340259 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/barbican-keystone-listener-5f5b7fdb46-c8wb8"] Jan 31 09:16:33 crc kubenswrapper[4732]: W0131 09:16:33.347202 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7728f3b2_7258_444d_982b_10d416bb61f0.slice/crio-5117dba79725a077cd6ab2eb2327b8a7f198da1d9f337e252448199e446feb3a WatchSource:0}: Error finding container 5117dba79725a077cd6ab2eb2327b8a7f198da1d9f337e252448199e446feb3a: Status 404 returned error can't find the container with id 5117dba79725a077cd6ab2eb2327b8a7f198da1d9f337e252448199e446feb3a Jan 31 09:16:33 crc kubenswrapper[4732]: I0131 09:16:33.440027 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/barbican-api-5bb4486f46-rmsgq"] Jan 31 09:16:34 crc kubenswrapper[4732]: I0131 09:16:34.209222 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/barbican-api-5bb4486f46-rmsgq" 
event={"ID":"4981d9a9-898f-49ff-809d-58c7ca3bd2a3","Type":"ContainerStarted","Data":"341aa4e395d7929014e446a8184c4ab222c0434acdf71dbf1596f5fc5c67f4a8"} Jan 31 09:16:34 crc kubenswrapper[4732]: I0131 09:16:34.209802 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/barbican-api-5bb4486f46-rmsgq" event={"ID":"4981d9a9-898f-49ff-809d-58c7ca3bd2a3","Type":"ContainerStarted","Data":"2d6ada7bb6e0aa8ec000e73dc78802992d92ff017f59a57893472c54d628fcd1"} Jan 31 09:16:34 crc kubenswrapper[4732]: I0131 09:16:34.209837 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="swift-kuttl-tests/barbican-api-5bb4486f46-rmsgq" Jan 31 09:16:34 crc kubenswrapper[4732]: I0131 09:16:34.209847 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/barbican-api-5bb4486f46-rmsgq" event={"ID":"4981d9a9-898f-49ff-809d-58c7ca3bd2a3","Type":"ContainerStarted","Data":"dc8a708608424d3770f138159a52c830a7b371ad68ddd44083bf847d118f3337"} Jan 31 09:16:34 crc kubenswrapper[4732]: I0131 09:16:34.209858 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="swift-kuttl-tests/barbican-api-5bb4486f46-rmsgq" Jan 31 09:16:34 crc kubenswrapper[4732]: I0131 09:16:34.210301 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/barbican-worker-597f49d4f-8nh24" event={"ID":"d3b1cc40-9985-45d8-bb06-0676ff188c6c","Type":"ContainerStarted","Data":"349666c65ccee57c71b989263a2d6e9c2979da105d5b7e91fe8eeec62ef91646"} Jan 31 09:16:34 crc kubenswrapper[4732]: I0131 09:16:34.212933 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/barbican-keystone-listener-5f5b7fdb46-c8wb8" event={"ID":"7728f3b2-7258-444d-982b-10d416bb61f0","Type":"ContainerStarted","Data":"5117dba79725a077cd6ab2eb2327b8a7f198da1d9f337e252448199e446feb3a"} Jan 31 09:16:34 crc kubenswrapper[4732]: I0131 09:16:34.232290 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/barbican-api-5bb4486f46-rmsgq" podStartSLOduration=2.23227087 podStartE2EDuration="2.23227087s" podCreationTimestamp="2026-01-31 09:16:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:16:34.228839533 +0000 UTC m=+932.534715787" watchObservedRunningTime="2026-01-31 09:16:34.23227087 +0000 UTC m=+932.538147094" Jan 31 09:16:35 crc kubenswrapper[4732]: I0131 09:16:35.226763 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/barbican-worker-597f49d4f-8nh24" event={"ID":"d3b1cc40-9985-45d8-bb06-0676ff188c6c","Type":"ContainerStarted","Data":"61654a37d88891f02e083badeb8b5cfe93229dadee0e8b7d4de9237b19518962"} Jan 31 09:16:35 crc kubenswrapper[4732]: I0131 09:16:35.228544 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/barbican-keystone-listener-5f5b7fdb46-c8wb8" event={"ID":"7728f3b2-7258-444d-982b-10d416bb61f0","Type":"ContainerStarted","Data":"d4d07af4bd1a6391d336e9eb08f85f38b781a53ac9cbe6e99069d22ae367b8eb"} Jan 31 09:16:36 crc kubenswrapper[4732]: I0131 09:16:36.238158 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/barbican-worker-597f49d4f-8nh24" event={"ID":"d3b1cc40-9985-45d8-bb06-0676ff188c6c","Type":"ContainerStarted","Data":"9bf2a11eb61aee168266496fbc63cf87fd0863d3de9bd60d727904eb9b9f54bf"} Jan 31 09:16:36 crc kubenswrapper[4732]: I0131 09:16:36.241706 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="swift-kuttl-tests/barbican-keystone-listener-5f5b7fdb46-c8wb8" event={"ID":"7728f3b2-7258-444d-982b-10d416bb61f0","Type":"ContainerStarted","Data":"e43f0104f4b5c35da143cfe8b1821aeb80b3326882a44cc53de6938465a118a2"} Jan 31 09:16:36 crc kubenswrapper[4732]: I0131 09:16:36.288550 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/barbican-worker-597f49d4f-8nh24" podStartSLOduration=2.71358618 podStartE2EDuration="4.288531202s" podCreationTimestamp="2026-01-31 09:16:32 +0000 UTC" firstStartedPulling="2026-01-31 09:16:33.265351407 +0000 UTC m=+931.571227611" lastFinishedPulling="2026-01-31 09:16:34.840296429 +0000 UTC m=+933.146172633" observedRunningTime="2026-01-31 09:16:36.270971266 +0000 UTC m=+934.576847480" watchObservedRunningTime="2026-01-31 09:16:36.288531202 +0000 UTC m=+934.594407396" Jan 31 09:16:36 crc kubenswrapper[4732]: I0131 09:16:36.290216 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/barbican-keystone-listener-5f5b7fdb46-c8wb8" podStartSLOduration=2.796252546 podStartE2EDuration="4.290208795s" podCreationTimestamp="2026-01-31 09:16:32 +0000 UTC" firstStartedPulling="2026-01-31 09:16:33.348980913 +0000 UTC m=+931.654857117" lastFinishedPulling="2026-01-31 09:16:34.842937172 +0000 UTC m=+933.148813366" observedRunningTime="2026-01-31 09:16:36.288046496 +0000 UTC m=+934.593922700" watchObservedRunningTime="2026-01-31 09:16:36.290208795 +0000 UTC m=+934.596084999" Jan 31 09:16:36 crc kubenswrapper[4732]: I0131 09:16:36.688458 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/0e62aba39cc8fdb97a45279afc017a13ec54a5c105520c9343a09281d2km4rv"] Jan 31 09:16:36 crc kubenswrapper[4732]: I0131 09:16:36.689916 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/0e62aba39cc8fdb97a45279afc017a13ec54a5c105520c9343a09281d2km4rv" Jan 31 09:16:36 crc kubenswrapper[4732]: I0131 09:16:36.691992 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-tnztr" Jan 31 09:16:36 crc kubenswrapper[4732]: I0131 09:16:36.704199 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/0e62aba39cc8fdb97a45279afc017a13ec54a5c105520c9343a09281d2km4rv"] Jan 31 09:16:36 crc kubenswrapper[4732]: I0131 09:16:36.746325 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9x8fp\" (UniqueName: \"kubernetes.io/projected/a8db73f4-a3fd-4276-87eb-69db3df2adb6-kube-api-access-9x8fp\") pod \"0e62aba39cc8fdb97a45279afc017a13ec54a5c105520c9343a09281d2km4rv\" (UID: \"a8db73f4-a3fd-4276-87eb-69db3df2adb6\") " pod="openstack-operators/0e62aba39cc8fdb97a45279afc017a13ec54a5c105520c9343a09281d2km4rv" Jan 31 09:16:36 crc kubenswrapper[4732]: I0131 09:16:36.746482 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a8db73f4-a3fd-4276-87eb-69db3df2adb6-bundle\") pod \"0e62aba39cc8fdb97a45279afc017a13ec54a5c105520c9343a09281d2km4rv\" (UID: \"a8db73f4-a3fd-4276-87eb-69db3df2adb6\") " pod="openstack-operators/0e62aba39cc8fdb97a45279afc017a13ec54a5c105520c9343a09281d2km4rv" Jan 31 09:16:36 crc kubenswrapper[4732]: I0131 09:16:36.746517 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a8db73f4-a3fd-4276-87eb-69db3df2adb6-util\") pod \"0e62aba39cc8fdb97a45279afc017a13ec54a5c105520c9343a09281d2km4rv\" (UID: \"a8db73f4-a3fd-4276-87eb-69db3df2adb6\") " pod="openstack-operators/0e62aba39cc8fdb97a45279afc017a13ec54a5c105520c9343a09281d2km4rv" Jan 31 09:16:36 crc kubenswrapper[4732]: I0131 09:16:36.848367 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9x8fp\" (UniqueName: \"kubernetes.io/projected/a8db73f4-a3fd-4276-87eb-69db3df2adb6-kube-api-access-9x8fp\") pod \"0e62aba39cc8fdb97a45279afc017a13ec54a5c105520c9343a09281d2km4rv\" (UID: \"a8db73f4-a3fd-4276-87eb-69db3df2adb6\") " pod="openstack-operators/0e62aba39cc8fdb97a45279afc017a13ec54a5c105520c9343a09281d2km4rv" Jan 31 09:16:36 crc kubenswrapper[4732]: I0131 09:16:36.848494 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a8db73f4-a3fd-4276-87eb-69db3df2adb6-bundle\") pod \"0e62aba39cc8fdb97a45279afc017a13ec54a5c105520c9343a09281d2km4rv\" (UID: \"a8db73f4-a3fd-4276-87eb-69db3df2adb6\") " pod="openstack-operators/0e62aba39cc8fdb97a45279afc017a13ec54a5c105520c9343a09281d2km4rv" Jan 31 09:16:36 crc kubenswrapper[4732]: I0131 09:16:36.848524 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a8db73f4-a3fd-4276-87eb-69db3df2adb6-util\") pod \"0e62aba39cc8fdb97a45279afc017a13ec54a5c105520c9343a09281d2km4rv\" (UID: \"a8db73f4-a3fd-4276-87eb-69db3df2adb6\") " pod="openstack-operators/0e62aba39cc8fdb97a45279afc017a13ec54a5c105520c9343a09281d2km4rv" Jan 31 09:16:36 crc kubenswrapper[4732]: I0131 09:16:36.849322 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/a8db73f4-a3fd-4276-87eb-69db3df2adb6-util\") pod \"0e62aba39cc8fdb97a45279afc017a13ec54a5c105520c9343a09281d2km4rv\" (UID: \"a8db73f4-a3fd-4276-87eb-69db3df2adb6\") " pod="openstack-operators/0e62aba39cc8fdb97a45279afc017a13ec54a5c105520c9343a09281d2km4rv" Jan 31 09:16:36 crc kubenswrapper[4732]: I0131 09:16:36.849337 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a8db73f4-a3fd-4276-87eb-69db3df2adb6-bundle\") pod \"0e62aba39cc8fdb97a45279afc017a13ec54a5c105520c9343a09281d2km4rv\" (UID: \"a8db73f4-a3fd-4276-87eb-69db3df2adb6\") " pod="openstack-operators/0e62aba39cc8fdb97a45279afc017a13ec54a5c105520c9343a09281d2km4rv" Jan 31 09:16:36 crc kubenswrapper[4732]: I0131 09:16:36.867273 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9x8fp\" (UniqueName: \"kubernetes.io/projected/a8db73f4-a3fd-4276-87eb-69db3df2adb6-kube-api-access-9x8fp\") pod \"0e62aba39cc8fdb97a45279afc017a13ec54a5c105520c9343a09281d2km4rv\" (UID: \"a8db73f4-a3fd-4276-87eb-69db3df2adb6\") " pod="openstack-operators/0e62aba39cc8fdb97a45279afc017a13ec54a5c105520c9343a09281d2km4rv" Jan 31 09:16:37 crc kubenswrapper[4732]: I0131 09:16:37.039755 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/0e62aba39cc8fdb97a45279afc017a13ec54a5c105520c9343a09281d2km4rv" Jan 31 09:16:37 crc kubenswrapper[4732]: I0131 09:16:37.473462 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/0e62aba39cc8fdb97a45279afc017a13ec54a5c105520c9343a09281d2km4rv"] Jan 31 09:16:38 crc kubenswrapper[4732]: I0131 09:16:38.264301 4732 generic.go:334] "Generic (PLEG): container finished" podID="a8db73f4-a3fd-4276-87eb-69db3df2adb6" containerID="a86707b4de5b35e9337d2beb5deb8d861dead56d2dc194d07926deea0b9a63f5" exitCode=0 Jan 31 09:16:38 crc kubenswrapper[4732]: I0131 09:16:38.264421 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0e62aba39cc8fdb97a45279afc017a13ec54a5c105520c9343a09281d2km4rv" event={"ID":"a8db73f4-a3fd-4276-87eb-69db3df2adb6","Type":"ContainerDied","Data":"a86707b4de5b35e9337d2beb5deb8d861dead56d2dc194d07926deea0b9a63f5"} Jan 31 09:16:38 crc kubenswrapper[4732]: I0131 09:16:38.264633 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0e62aba39cc8fdb97a45279afc017a13ec54a5c105520c9343a09281d2km4rv" event={"ID":"a8db73f4-a3fd-4276-87eb-69db3df2adb6","Type":"ContainerStarted","Data":"77e123f88c883e88c388561aaf7732599ce5120cf4c9e11a5fc356e2e9d2a10f"} Jan 31 09:16:39 crc kubenswrapper[4732]: I0131 09:16:39.273324 4732 generic.go:334] "Generic (PLEG): container finished" podID="a8db73f4-a3fd-4276-87eb-69db3df2adb6" containerID="61b0a7c08542a13be8ba2b2cae47287416caa788b605c317245cad2aa213e84a" exitCode=0 Jan 31 09:16:39 crc kubenswrapper[4732]: I0131 09:16:39.273372 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0e62aba39cc8fdb97a45279afc017a13ec54a5c105520c9343a09281d2km4rv" event={"ID":"a8db73f4-a3fd-4276-87eb-69db3df2adb6","Type":"ContainerDied","Data":"61b0a7c08542a13be8ba2b2cae47287416caa788b605c317245cad2aa213e84a"} Jan 31 09:16:40 crc kubenswrapper[4732]: I0131 09:16:40.283164 4732 generic.go:334] "Generic (PLEG): container finished" podID="a8db73f4-a3fd-4276-87eb-69db3df2adb6" containerID="7a57753b9625c7248d75214d3dccdb94e632617b9f7c21be3bc87c9177d4ca52" exitCode=0 Jan 31 09:16:40 crc kubenswrapper[4732]: I0131 09:16:40.283239 4732 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0e62aba39cc8fdb97a45279afc017a13ec54a5c105520c9343a09281d2km4rv" event={"ID":"a8db73f4-a3fd-4276-87eb-69db3df2adb6","Type":"ContainerDied","Data":"7a57753b9625c7248d75214d3dccdb94e632617b9f7c21be3bc87c9177d4ca52"} Jan 31 09:16:41 crc kubenswrapper[4732]: I0131 09:16:41.649635 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/0e62aba39cc8fdb97a45279afc017a13ec54a5c105520c9343a09281d2km4rv" Jan 31 09:16:41 crc kubenswrapper[4732]: I0131 09:16:41.837793 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9x8fp\" (UniqueName: \"kubernetes.io/projected/a8db73f4-a3fd-4276-87eb-69db3df2adb6-kube-api-access-9x8fp\") pod \"a8db73f4-a3fd-4276-87eb-69db3df2adb6\" (UID: \"a8db73f4-a3fd-4276-87eb-69db3df2adb6\") " Jan 31 09:16:41 crc kubenswrapper[4732]: I0131 09:16:41.838218 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a8db73f4-a3fd-4276-87eb-69db3df2adb6-bundle\") pod \"a8db73f4-a3fd-4276-87eb-69db3df2adb6\" (UID: \"a8db73f4-a3fd-4276-87eb-69db3df2adb6\") " Jan 31 09:16:41 crc kubenswrapper[4732]: I0131 09:16:41.838317 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a8db73f4-a3fd-4276-87eb-69db3df2adb6-util\") pod \"a8db73f4-a3fd-4276-87eb-69db3df2adb6\" (UID: \"a8db73f4-a3fd-4276-87eb-69db3df2adb6\") " Jan 31 09:16:41 crc kubenswrapper[4732]: I0131 09:16:41.839553 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8db73f4-a3fd-4276-87eb-69db3df2adb6-bundle" (OuterVolumeSpecName: "bundle") pod "a8db73f4-a3fd-4276-87eb-69db3df2adb6" (UID: "a8db73f4-a3fd-4276-87eb-69db3df2adb6"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:16:41 crc kubenswrapper[4732]: I0131 09:16:41.847278 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8db73f4-a3fd-4276-87eb-69db3df2adb6-kube-api-access-9x8fp" (OuterVolumeSpecName: "kube-api-access-9x8fp") pod "a8db73f4-a3fd-4276-87eb-69db3df2adb6" (UID: "a8db73f4-a3fd-4276-87eb-69db3df2adb6"). InnerVolumeSpecName "kube-api-access-9x8fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:16:41 crc kubenswrapper[4732]: I0131 09:16:41.862044 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8db73f4-a3fd-4276-87eb-69db3df2adb6-util" (OuterVolumeSpecName: "util") pod "a8db73f4-a3fd-4276-87eb-69db3df2adb6" (UID: "a8db73f4-a3fd-4276-87eb-69db3df2adb6"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:16:41 crc kubenswrapper[4732]: I0131 09:16:41.940146 4732 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a8db73f4-a3fd-4276-87eb-69db3df2adb6-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:16:41 crc kubenswrapper[4732]: I0131 09:16:41.940181 4732 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a8db73f4-a3fd-4276-87eb-69db3df2adb6-util\") on node \"crc\" DevicePath \"\"" Jan 31 09:16:41 crc kubenswrapper[4732]: I0131 09:16:41.940190 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9x8fp\" (UniqueName: \"kubernetes.io/projected/a8db73f4-a3fd-4276-87eb-69db3df2adb6-kube-api-access-9x8fp\") on node \"crc\" DevicePath \"\"" Jan 31 09:16:42 crc kubenswrapper[4732]: I0131 09:16:42.300761 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0e62aba39cc8fdb97a45279afc017a13ec54a5c105520c9343a09281d2km4rv" event={"ID":"a8db73f4-a3fd-4276-87eb-69db3df2adb6","Type":"ContainerDied","Data":"77e123f88c883e88c388561aaf7732599ce5120cf4c9e11a5fc356e2e9d2a10f"} Jan 31 09:16:42 crc kubenswrapper[4732]: I0131 09:16:42.300808 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="77e123f88c883e88c388561aaf7732599ce5120cf4c9e11a5fc356e2e9d2a10f" Jan 31 09:16:42 crc kubenswrapper[4732]: I0131 09:16:42.300807 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/0e62aba39cc8fdb97a45279afc017a13ec54a5c105520c9343a09281d2km4rv" Jan 31 09:16:44 crc kubenswrapper[4732]: I0131 09:16:44.497621 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="swift-kuttl-tests/barbican-api-5bb4486f46-rmsgq" Jan 31 09:16:44 crc kubenswrapper[4732]: I0131 09:16:44.517608 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="swift-kuttl-tests/barbican-api-5bb4486f46-rmsgq" Jan 31 09:16:47 crc kubenswrapper[4732]: I0131 09:16:47.498177 4732 patch_prober.go:28] interesting pod/machine-config-daemon-jnbt8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 09:16:47 crc kubenswrapper[4732]: I0131 09:16:47.498476 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" podUID="7d790207-d357-4b47-87bf-5b505e061820" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 09:16:49 crc kubenswrapper[4732]: I0131 09:16:49.010417 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="swift-kuttl-tests/keystone-7959cb4f8b-4vxsg" Jan 31 09:16:52 crc kubenswrapper[4732]: I0131 09:16:52.581875 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-6786c8c4f8-x5zqc"] Jan 31 09:16:52 crc kubenswrapper[4732]: E0131 09:16:52.582597 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8db73f4-a3fd-4276-87eb-69db3df2adb6" containerName="util" Jan 31 09:16:52 crc kubenswrapper[4732]: I0131 09:16:52.582610 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8db73f4-a3fd-4276-87eb-69db3df2adb6" containerName="util" Jan 31 09:16:52 crc 
kubenswrapper[4732]: E0131 09:16:52.582622 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8db73f4-a3fd-4276-87eb-69db3df2adb6" containerName="extract" Jan 31 09:16:52 crc kubenswrapper[4732]: I0131 09:16:52.582627 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8db73f4-a3fd-4276-87eb-69db3df2adb6" containerName="extract" Jan 31 09:16:52 crc kubenswrapper[4732]: E0131 09:16:52.582650 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8db73f4-a3fd-4276-87eb-69db3df2adb6" containerName="pull" Jan 31 09:16:52 crc kubenswrapper[4732]: I0131 09:16:52.582656 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8db73f4-a3fd-4276-87eb-69db3df2adb6" containerName="pull" Jan 31 09:16:52 crc kubenswrapper[4732]: I0131 09:16:52.582802 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8db73f4-a3fd-4276-87eb-69db3df2adb6" containerName="extract" Jan 31 09:16:52 crc kubenswrapper[4732]: I0131 09:16:52.583218 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-6786c8c4f8-x5zqc" Jan 31 09:16:52 crc kubenswrapper[4732]: I0131 09:16:52.586434 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-service-cert" Jan 31 09:16:52 crc kubenswrapper[4732]: I0131 09:16:52.586926 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-zqjgk" Jan 31 09:16:52 crc kubenswrapper[4732]: I0131 09:16:52.601156 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-6786c8c4f8-x5zqc"] Jan 31 09:16:52 crc kubenswrapper[4732]: I0131 09:16:52.609018 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/17f93e90-1e9a-439c-a130-487ebf54ad10-apiservice-cert\") pod \"swift-operator-controller-manager-6786c8c4f8-x5zqc\" (UID: \"17f93e90-1e9a-439c-a130-487ebf54ad10\") " pod="openstack-operators/swift-operator-controller-manager-6786c8c4f8-x5zqc" Jan 31 09:16:52 crc kubenswrapper[4732]: I0131 09:16:52.609082 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/17f93e90-1e9a-439c-a130-487ebf54ad10-webhook-cert\") pod \"swift-operator-controller-manager-6786c8c4f8-x5zqc\" (UID: \"17f93e90-1e9a-439c-a130-487ebf54ad10\") " pod="openstack-operators/swift-operator-controller-manager-6786c8c4f8-x5zqc" Jan 31 09:16:52 crc kubenswrapper[4732]: I0131 09:16:52.609115 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28txj\" (UniqueName: \"kubernetes.io/projected/17f93e90-1e9a-439c-a130-487ebf54ad10-kube-api-access-28txj\") pod \"swift-operator-controller-manager-6786c8c4f8-x5zqc\" (UID: \"17f93e90-1e9a-439c-a130-487ebf54ad10\") " pod="openstack-operators/swift-operator-controller-manager-6786c8c4f8-x5zqc" Jan 31 09:16:52 crc kubenswrapper[4732]: I0131 09:16:52.710608 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/17f93e90-1e9a-439c-a130-487ebf54ad10-apiservice-cert\") pod \"swift-operator-controller-manager-6786c8c4f8-x5zqc\" (UID: \"17f93e90-1e9a-439c-a130-487ebf54ad10\") " 
pod="openstack-operators/swift-operator-controller-manager-6786c8c4f8-x5zqc" Jan 31 09:16:52 crc kubenswrapper[4732]: I0131 09:16:52.710908 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/17f93e90-1e9a-439c-a130-487ebf54ad10-webhook-cert\") pod \"swift-operator-controller-manager-6786c8c4f8-x5zqc\" (UID: \"17f93e90-1e9a-439c-a130-487ebf54ad10\") " pod="openstack-operators/swift-operator-controller-manager-6786c8c4f8-x5zqc" Jan 31 09:16:52 crc kubenswrapper[4732]: I0131 09:16:52.711015 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28txj\" (UniqueName: \"kubernetes.io/projected/17f93e90-1e9a-439c-a130-487ebf54ad10-kube-api-access-28txj\") pod \"swift-operator-controller-manager-6786c8c4f8-x5zqc\" (UID: \"17f93e90-1e9a-439c-a130-487ebf54ad10\") " pod="openstack-operators/swift-operator-controller-manager-6786c8c4f8-x5zqc" Jan 31 09:16:52 crc kubenswrapper[4732]: I0131 09:16:52.716533 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/17f93e90-1e9a-439c-a130-487ebf54ad10-apiservice-cert\") pod \"swift-operator-controller-manager-6786c8c4f8-x5zqc\" (UID: \"17f93e90-1e9a-439c-a130-487ebf54ad10\") " pod="openstack-operators/swift-operator-controller-manager-6786c8c4f8-x5zqc" Jan 31 09:16:52 crc kubenswrapper[4732]: I0131 09:16:52.729454 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28txj\" (UniqueName: \"kubernetes.io/projected/17f93e90-1e9a-439c-a130-487ebf54ad10-kube-api-access-28txj\") pod \"swift-operator-controller-manager-6786c8c4f8-x5zqc\" (UID: \"17f93e90-1e9a-439c-a130-487ebf54ad10\") " pod="openstack-operators/swift-operator-controller-manager-6786c8c4f8-x5zqc" Jan 31 09:16:52 crc kubenswrapper[4732]: I0131 09:16:52.732162 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/17f93e90-1e9a-439c-a130-487ebf54ad10-webhook-cert\") pod \"swift-operator-controller-manager-6786c8c4f8-x5zqc\" (UID: \"17f93e90-1e9a-439c-a130-487ebf54ad10\") " pod="openstack-operators/swift-operator-controller-manager-6786c8c4f8-x5zqc" Jan 31 09:16:52 crc kubenswrapper[4732]: I0131 09:16:52.901097 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-6786c8c4f8-x5zqc" Jan 31 09:16:53 crc kubenswrapper[4732]: I0131 09:16:53.337409 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-6786c8c4f8-x5zqc"] Jan 31 09:16:53 crc kubenswrapper[4732]: I0131 09:16:53.379320 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-6786c8c4f8-x5zqc" event={"ID":"17f93e90-1e9a-439c-a130-487ebf54ad10","Type":"ContainerStarted","Data":"3190f0d353720c75142ebd0bfb06e439e4a2407802386eda349902b5c0a59659"} Jan 31 09:16:55 crc kubenswrapper[4732]: I0131 09:16:55.393170 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-6786c8c4f8-x5zqc" event={"ID":"17f93e90-1e9a-439c-a130-487ebf54ad10","Type":"ContainerStarted","Data":"dc0300becefa39e137a97b82074b1aa44c6492acb1b746336056c96e36e23697"} Jan 31 09:16:55 crc kubenswrapper[4732]: I0131 09:16:55.393640 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-6786c8c4f8-x5zqc" Jan 31 09:16:55 crc kubenswrapper[4732]: I0131 09:16:55.415355 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-6786c8c4f8-x5zqc" podStartSLOduration=1.92508974 podStartE2EDuration="3.415334882s" podCreationTimestamp="2026-01-31 09:16:52 +0000 UTC" firstStartedPulling="2026-01-31 09:16:53.350261002 +0000 UTC m=+951.656137206" lastFinishedPulling="2026-01-31 09:16:54.840506144 +0000 UTC m=+953.146382348" observedRunningTime="2026-01-31 09:16:55.41401769 +0000 UTC m=+953.719893894" watchObservedRunningTime="2026-01-31 09:16:55.415334882 +0000 UTC m=+953.721211086" Jan 31 09:17:02 crc kubenswrapper[4732]: I0131 09:17:02.905392 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-6786c8c4f8-x5zqc" Jan 31 09:17:07 crc kubenswrapper[4732]: I0131 09:17:07.258480 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-storage-0"] Jan 31 09:17:07 crc kubenswrapper[4732]: I0131 09:17:07.262899 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:17:07 crc kubenswrapper[4732]: I0131 09:17:07.265688 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-storage-config-data" Jan 31 09:17:07 crc kubenswrapper[4732]: I0131 09:17:07.265756 4732 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"swift-swift-dockercfg-54dnj" Jan 31 09:17:07 crc kubenswrapper[4732]: I0131 09:17:07.265994 4732 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"swift-conf" Jan 31 09:17:07 crc kubenswrapper[4732]: I0131 09:17:07.266210 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-files" Jan 31 09:17:07 crc kubenswrapper[4732]: I0131 09:17:07.276806 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-storage-0"] Jan 31 09:17:07 crc kubenswrapper[4732]: I0131 09:17:07.414385 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pt7ch\" (UniqueName: \"kubernetes.io/projected/410ee08c-4c6c-4012-aa46-264179923617-kube-api-access-pt7ch\") pod \"swift-storage-0\" (UID: \"410ee08c-4c6c-4012-aa46-264179923617\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:17:07 crc kubenswrapper[4732]: I0131 09:17:07.414459 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/410ee08c-4c6c-4012-aa46-264179923617-etc-swift\") pod \"swift-storage-0\" (UID: \"410ee08c-4c6c-4012-aa46-264179923617\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:17:07 crc kubenswrapper[4732]: I0131 09:17:07.414514 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"swift-storage-0\" (UID: \"410ee08c-4c6c-4012-aa46-264179923617\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:17:07 crc kubenswrapper[4732]: I0131 09:17:07.414566 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/410ee08c-4c6c-4012-aa46-264179923617-lock\") pod \"swift-storage-0\" (UID: \"410ee08c-4c6c-4012-aa46-264179923617\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:17:07 crc kubenswrapper[4732]: I0131 09:17:07.414604 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/410ee08c-4c6c-4012-aa46-264179923617-cache\") pod \"swift-storage-0\" (UID: \"410ee08c-4c6c-4012-aa46-264179923617\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:17:07 crc kubenswrapper[4732]: I0131 09:17:07.515901 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/410ee08c-4c6c-4012-aa46-264179923617-etc-swift\") pod \"swift-storage-0\" (UID: \"410ee08c-4c6c-4012-aa46-264179923617\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:17:07 crc kubenswrapper[4732]: I0131 09:17:07.515965 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"swift-storage-0\" (UID: \"410ee08c-4c6c-4012-aa46-264179923617\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:17:07 crc 
kubenswrapper[4732]: I0131 09:17:07.516008 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/410ee08c-4c6c-4012-aa46-264179923617-lock\") pod \"swift-storage-0\" (UID: \"410ee08c-4c6c-4012-aa46-264179923617\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:17:07 crc kubenswrapper[4732]: I0131 09:17:07.516034 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/410ee08c-4c6c-4012-aa46-264179923617-cache\") pod \"swift-storage-0\" (UID: \"410ee08c-4c6c-4012-aa46-264179923617\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:17:07 crc kubenswrapper[4732]: I0131 09:17:07.516078 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pt7ch\" (UniqueName: \"kubernetes.io/projected/410ee08c-4c6c-4012-aa46-264179923617-kube-api-access-pt7ch\") pod \"swift-storage-0\" (UID: \"410ee08c-4c6c-4012-aa46-264179923617\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:17:07 crc kubenswrapper[4732]: E0131 09:17:07.516151 4732 projected.go:288] Couldn't get configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 31 09:17:07 crc kubenswrapper[4732]: E0131 09:17:07.516202 4732 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 31 09:17:07 crc kubenswrapper[4732]: E0131 09:17:07.516290 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/410ee08c-4c6c-4012-aa46-264179923617-etc-swift podName:410ee08c-4c6c-4012-aa46-264179923617 nodeName:}" failed. No retries permitted until 2026-01-31 09:17:08.01626953 +0000 UTC m=+966.322145794 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/410ee08c-4c6c-4012-aa46-264179923617-etc-swift") pod "swift-storage-0" (UID: "410ee08c-4c6c-4012-aa46-264179923617") : configmap "swift-ring-files" not found Jan 31 09:17:07 crc kubenswrapper[4732]: I0131 09:17:07.516381 4732 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"swift-storage-0\" (UID: \"410ee08c-4c6c-4012-aa46-264179923617\") device mount path \"/mnt/openstack/pv06\"" pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:17:07 crc kubenswrapper[4732]: I0131 09:17:07.516523 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/410ee08c-4c6c-4012-aa46-264179923617-lock\") pod \"swift-storage-0\" (UID: \"410ee08c-4c6c-4012-aa46-264179923617\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:17:07 crc kubenswrapper[4732]: I0131 09:17:07.516741 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/410ee08c-4c6c-4012-aa46-264179923617-cache\") pod \"swift-storage-0\" (UID: \"410ee08c-4c6c-4012-aa46-264179923617\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:17:07 crc kubenswrapper[4732]: I0131 09:17:07.536068 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"swift-storage-0\" (UID: \"410ee08c-4c6c-4012-aa46-264179923617\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:17:07 crc kubenswrapper[4732]: I0131 09:17:07.545697 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pt7ch\" (UniqueName: \"kubernetes.io/projected/410ee08c-4c6c-4012-aa46-264179923617-kube-api-access-pt7ch\") pod \"swift-storage-0\" (UID: \"410ee08c-4c6c-4012-aa46-264179923617\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:17:07 crc kubenswrapper[4732]: I0131 09:17:07.802128 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-thntz"] Jan 31 09:17:07 crc kubenswrapper[4732]: I0131 09:17:07.805043 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-thntz" Jan 31 09:17:07 crc kubenswrapper[4732]: I0131 09:17:07.810126 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Jan 31 09:17:07 crc kubenswrapper[4732]: I0131 09:17:07.810222 4732 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"swift-proxy-config-data" Jan 31 09:17:07 crc kubenswrapper[4732]: I0131 09:17:07.810368 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Jan 31 09:17:07 crc kubenswrapper[4732]: I0131 09:17:07.816266 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-thntz"] Jan 31 09:17:07 crc kubenswrapper[4732]: I0131 09:17:07.921478 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/98f46fa4-b924-478d-a6d5-6070a4a75aee-ring-data-devices\") pod \"swift-ring-rebalance-thntz\" (UID: \"98f46fa4-b924-478d-a6d5-6070a4a75aee\") " pod="swift-kuttl-tests/swift-ring-rebalance-thntz" Jan 31 09:17:07 crc kubenswrapper[4732]: I0131 09:17:07.921756 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/98f46fa4-b924-478d-a6d5-6070a4a75aee-etc-swift\") pod \"swift-ring-rebalance-thntz\" (UID: \"98f46fa4-b924-478d-a6d5-6070a4a75aee\") " pod="swift-kuttl-tests/swift-ring-rebalance-thntz" Jan 31 09:17:07 crc kubenswrapper[4732]: I0131 09:17:07.921891 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/98f46fa4-b924-478d-a6d5-6070a4a75aee-swiftconf\") pod \"swift-ring-rebalance-thntz\" (UID: \"98f46fa4-b924-478d-a6d5-6070a4a75aee\") " pod="swift-kuttl-tests/swift-ring-rebalance-thntz" Jan 31 09:17:07 crc kubenswrapper[4732]: I0131 09:17:07.921986 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/98f46fa4-b924-478d-a6d5-6070a4a75aee-scripts\") pod \"swift-ring-rebalance-thntz\" (UID: \"98f46fa4-b924-478d-a6d5-6070a4a75aee\") " pod="swift-kuttl-tests/swift-ring-rebalance-thntz" Jan 31 09:17:07 crc kubenswrapper[4732]: I0131 09:17:07.922085 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/98f46fa4-b924-478d-a6d5-6070a4a75aee-dispersionconf\") pod \"swift-ring-rebalance-thntz\" (UID: \"98f46fa4-b924-478d-a6d5-6070a4a75aee\") " pod="swift-kuttl-tests/swift-ring-rebalance-thntz" Jan 31 09:17:07 crc kubenswrapper[4732]: I0131 09:17:07.922160 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtbpn\" (UniqueName: \"kubernetes.io/projected/98f46fa4-b924-478d-a6d5-6070a4a75aee-kube-api-access-dtbpn\") pod \"swift-ring-rebalance-thntz\" (UID: \"98f46fa4-b924-478d-a6d5-6070a4a75aee\") " pod="swift-kuttl-tests/swift-ring-rebalance-thntz" Jan 31 09:17:08 crc kubenswrapper[4732]: I0131 09:17:08.023374 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/98f46fa4-b924-478d-a6d5-6070a4a75aee-ring-data-devices\") pod \"swift-ring-rebalance-thntz\" (UID: 
\"98f46fa4-b924-478d-a6d5-6070a4a75aee\") " pod="swift-kuttl-tests/swift-ring-rebalance-thntz" Jan 31 09:17:08 crc kubenswrapper[4732]: I0131 09:17:08.023463 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/98f46fa4-b924-478d-a6d5-6070a4a75aee-etc-swift\") pod \"swift-ring-rebalance-thntz\" (UID: \"98f46fa4-b924-478d-a6d5-6070a4a75aee\") " pod="swift-kuttl-tests/swift-ring-rebalance-thntz" Jan 31 09:17:08 crc kubenswrapper[4732]: I0131 09:17:08.023493 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/98f46fa4-b924-478d-a6d5-6070a4a75aee-swiftconf\") pod \"swift-ring-rebalance-thntz\" (UID: \"98f46fa4-b924-478d-a6d5-6070a4a75aee\") " pod="swift-kuttl-tests/swift-ring-rebalance-thntz" Jan 31 09:17:08 crc kubenswrapper[4732]: I0131 09:17:08.023517 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/98f46fa4-b924-478d-a6d5-6070a4a75aee-scripts\") pod \"swift-ring-rebalance-thntz\" (UID: \"98f46fa4-b924-478d-a6d5-6070a4a75aee\") " pod="swift-kuttl-tests/swift-ring-rebalance-thntz" Jan 31 09:17:08 crc kubenswrapper[4732]: I0131 09:17:08.023548 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/98f46fa4-b924-478d-a6d5-6070a4a75aee-dispersionconf\") pod \"swift-ring-rebalance-thntz\" (UID: \"98f46fa4-b924-478d-a6d5-6070a4a75aee\") " pod="swift-kuttl-tests/swift-ring-rebalance-thntz" Jan 31 09:17:08 crc kubenswrapper[4732]: I0131 09:17:08.023571 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtbpn\" (UniqueName: \"kubernetes.io/projected/98f46fa4-b924-478d-a6d5-6070a4a75aee-kube-api-access-dtbpn\") pod \"swift-ring-rebalance-thntz\" (UID: \"98f46fa4-b924-478d-a6d5-6070a4a75aee\") " pod="swift-kuttl-tests/swift-ring-rebalance-thntz" Jan 31 09:17:08 crc kubenswrapper[4732]: I0131 09:17:08.023631 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/410ee08c-4c6c-4012-aa46-264179923617-etc-swift\") pod \"swift-storage-0\" (UID: \"410ee08c-4c6c-4012-aa46-264179923617\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:17:08 crc kubenswrapper[4732]: E0131 09:17:08.023798 4732 projected.go:288] Couldn't get configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 31 09:17:08 crc kubenswrapper[4732]: E0131 09:17:08.023811 4732 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 31 09:17:08 crc kubenswrapper[4732]: E0131 09:17:08.023852 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/410ee08c-4c6c-4012-aa46-264179923617-etc-swift podName:410ee08c-4c6c-4012-aa46-264179923617 nodeName:}" failed. No retries permitted until 2026-01-31 09:17:09.02383847 +0000 UTC m=+967.329714674 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/410ee08c-4c6c-4012-aa46-264179923617-etc-swift") pod "swift-storage-0" (UID: "410ee08c-4c6c-4012-aa46-264179923617") : configmap "swift-ring-files" not found Jan 31 09:17:08 crc kubenswrapper[4732]: I0131 09:17:08.024229 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/98f46fa4-b924-478d-a6d5-6070a4a75aee-ring-data-devices\") pod \"swift-ring-rebalance-thntz\" (UID: \"98f46fa4-b924-478d-a6d5-6070a4a75aee\") " pod="swift-kuttl-tests/swift-ring-rebalance-thntz" Jan 31 09:17:08 crc kubenswrapper[4732]: I0131 09:17:08.024489 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/98f46fa4-b924-478d-a6d5-6070a4a75aee-etc-swift\") pod \"swift-ring-rebalance-thntz\" (UID: \"98f46fa4-b924-478d-a6d5-6070a4a75aee\") " pod="swift-kuttl-tests/swift-ring-rebalance-thntz" Jan 31 09:17:08 crc kubenswrapper[4732]: I0131 09:17:08.025446 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/98f46fa4-b924-478d-a6d5-6070a4a75aee-scripts\") pod \"swift-ring-rebalance-thntz\" (UID: \"98f46fa4-b924-478d-a6d5-6070a4a75aee\") " pod="swift-kuttl-tests/swift-ring-rebalance-thntz" Jan 31 09:17:08 crc kubenswrapper[4732]: I0131 09:17:08.045228 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/98f46fa4-b924-478d-a6d5-6070a4a75aee-dispersionconf\") pod \"swift-ring-rebalance-thntz\" (UID: \"98f46fa4-b924-478d-a6d5-6070a4a75aee\") " pod="swift-kuttl-tests/swift-ring-rebalance-thntz" Jan 31 09:17:08 crc kubenswrapper[4732]: I0131 09:17:08.050132 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/98f46fa4-b924-478d-a6d5-6070a4a75aee-swiftconf\") pod \"swift-ring-rebalance-thntz\" (UID: \"98f46fa4-b924-478d-a6d5-6070a4a75aee\") " pod="swift-kuttl-tests/swift-ring-rebalance-thntz" Jan 31 09:17:08 crc kubenswrapper[4732]: I0131 09:17:08.058191 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtbpn\" (UniqueName: \"kubernetes.io/projected/98f46fa4-b924-478d-a6d5-6070a4a75aee-kube-api-access-dtbpn\") pod \"swift-ring-rebalance-thntz\" (UID: \"98f46fa4-b924-478d-a6d5-6070a4a75aee\") " pod="swift-kuttl-tests/swift-ring-rebalance-thntz" Jan 31 09:17:08 crc kubenswrapper[4732]: I0131 09:17:08.127824 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-thntz" Jan 31 09:17:08 crc kubenswrapper[4732]: I0131 09:17:08.369789 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-proxy-7d8cf99555-f2jx8"] Jan 31 09:17:08 crc kubenswrapper[4732]: I0131 09:17:08.372323 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-proxy-7d8cf99555-f2jx8" Jan 31 09:17:08 crc kubenswrapper[4732]: I0131 09:17:08.377328 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-proxy-7d8cf99555-f2jx8"] Jan 31 09:17:08 crc kubenswrapper[4732]: I0131 09:17:08.531440 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9e00e76d-2f89-454c-be3b-855e8186c78e-run-httpd\") pod \"swift-proxy-7d8cf99555-f2jx8\" (UID: \"9e00e76d-2f89-454c-be3b-855e8186c78e\") " pod="swift-kuttl-tests/swift-proxy-7d8cf99555-f2jx8" Jan 31 09:17:08 crc kubenswrapper[4732]: I0131 09:17:08.531505 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9e00e76d-2f89-454c-be3b-855e8186c78e-log-httpd\") pod \"swift-proxy-7d8cf99555-f2jx8\" (UID: \"9e00e76d-2f89-454c-be3b-855e8186c78e\") " pod="swift-kuttl-tests/swift-proxy-7d8cf99555-f2jx8" Jan 31 09:17:08 crc kubenswrapper[4732]: I0131 09:17:08.531707 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5k5xz\" (UniqueName: \"kubernetes.io/projected/9e00e76d-2f89-454c-be3b-855e8186c78e-kube-api-access-5k5xz\") pod \"swift-proxy-7d8cf99555-f2jx8\" (UID: \"9e00e76d-2f89-454c-be3b-855e8186c78e\") " pod="swift-kuttl-tests/swift-proxy-7d8cf99555-f2jx8" Jan 31 09:17:08 crc kubenswrapper[4732]: I0131 09:17:08.531804 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9e00e76d-2f89-454c-be3b-855e8186c78e-etc-swift\") pod \"swift-proxy-7d8cf99555-f2jx8\" (UID: \"9e00e76d-2f89-454c-be3b-855e8186c78e\") " pod="swift-kuttl-tests/swift-proxy-7d8cf99555-f2jx8" Jan 31 09:17:08 crc kubenswrapper[4732]: I0131 09:17:08.531892 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e00e76d-2f89-454c-be3b-855e8186c78e-config-data\") pod \"swift-proxy-7d8cf99555-f2jx8\" (UID: \"9e00e76d-2f89-454c-be3b-855e8186c78e\") " pod="swift-kuttl-tests/swift-proxy-7d8cf99555-f2jx8" Jan 31 09:17:08 crc kubenswrapper[4732]: I0131 09:17:08.585471 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-thntz"] Jan 31 09:17:08 crc kubenswrapper[4732]: W0131 09:17:08.590865 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod98f46fa4_b924_478d_a6d5_6070a4a75aee.slice/crio-d4531d5f1fc9a03a2055376e4d89a22630b0cf87424acef5b26aef666e5742d5 WatchSource:0}: Error finding container d4531d5f1fc9a03a2055376e4d89a22630b0cf87424acef5b26aef666e5742d5: Status 404 returned error can't find the container with id d4531d5f1fc9a03a2055376e4d89a22630b0cf87424acef5b26aef666e5742d5 Jan 31 09:17:08 crc kubenswrapper[4732]: I0131 09:17:08.633034 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9e00e76d-2f89-454c-be3b-855e8186c78e-run-httpd\") pod \"swift-proxy-7d8cf99555-f2jx8\" (UID: \"9e00e76d-2f89-454c-be3b-855e8186c78e\") " pod="swift-kuttl-tests/swift-proxy-7d8cf99555-f2jx8" Jan 31 09:17:08 crc kubenswrapper[4732]: I0131 09:17:08.633281 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9e00e76d-2f89-454c-be3b-855e8186c78e-log-httpd\") pod \"swift-proxy-7d8cf99555-f2jx8\" (UID: \"9e00e76d-2f89-454c-be3b-855e8186c78e\") " pod="swift-kuttl-tests/swift-proxy-7d8cf99555-f2jx8" Jan 31 09:17:08 crc kubenswrapper[4732]: I0131 09:17:08.633386 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5k5xz\" (UniqueName: \"kubernetes.io/projected/9e00e76d-2f89-454c-be3b-855e8186c78e-kube-api-access-5k5xz\") pod \"swift-proxy-7d8cf99555-f2jx8\" (UID: \"9e00e76d-2f89-454c-be3b-855e8186c78e\") " pod="swift-kuttl-tests/swift-proxy-7d8cf99555-f2jx8" Jan 31 09:17:08 crc kubenswrapper[4732]: I0131 09:17:08.633481 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9e00e76d-2f89-454c-be3b-855e8186c78e-etc-swift\") pod \"swift-proxy-7d8cf99555-f2jx8\" (UID: \"9e00e76d-2f89-454c-be3b-855e8186c78e\") " pod="swift-kuttl-tests/swift-proxy-7d8cf99555-f2jx8" Jan 31 09:17:08 crc kubenswrapper[4732]: I0131 09:17:08.633569 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9e00e76d-2f89-454c-be3b-855e8186c78e-run-httpd\") pod \"swift-proxy-7d8cf99555-f2jx8\" (UID: \"9e00e76d-2f89-454c-be3b-855e8186c78e\") " pod="swift-kuttl-tests/swift-proxy-7d8cf99555-f2jx8" Jan 31 09:17:08 crc kubenswrapper[4732]: I0131 09:17:08.633723 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e00e76d-2f89-454c-be3b-855e8186c78e-config-data\") pod \"swift-proxy-7d8cf99555-f2jx8\" (UID: \"9e00e76d-2f89-454c-be3b-855e8186c78e\") " pod="swift-kuttl-tests/swift-proxy-7d8cf99555-f2jx8" Jan 31 09:17:08 crc kubenswrapper[4732]: E0131 09:17:08.633750 4732 projected.go:288] Couldn't get configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 31 09:17:08 crc kubenswrapper[4732]: E0131 09:17:08.633954 4732 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-proxy-7d8cf99555-f2jx8: configmap "swift-ring-files" not found Jan 31 09:17:08 crc kubenswrapper[4732]: E0131 09:17:08.634078 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9e00e76d-2f89-454c-be3b-855e8186c78e-etc-swift podName:9e00e76d-2f89-454c-be3b-855e8186c78e nodeName:}" failed. No retries permitted until 2026-01-31 09:17:09.134055907 +0000 UTC m=+967.439932111 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/9e00e76d-2f89-454c-be3b-855e8186c78e-etc-swift") pod "swift-proxy-7d8cf99555-f2jx8" (UID: "9e00e76d-2f89-454c-be3b-855e8186c78e") : configmap "swift-ring-files" not found
Jan 31 09:17:08 crc kubenswrapper[4732]: I0131 09:17:08.633986 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9e00e76d-2f89-454c-be3b-855e8186c78e-log-httpd\") pod \"swift-proxy-7d8cf99555-f2jx8\" (UID: \"9e00e76d-2f89-454c-be3b-855e8186c78e\") " pod="swift-kuttl-tests/swift-proxy-7d8cf99555-f2jx8"
Jan 31 09:17:08 crc kubenswrapper[4732]: I0131 09:17:08.642339 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e00e76d-2f89-454c-be3b-855e8186c78e-config-data\") pod \"swift-proxy-7d8cf99555-f2jx8\" (UID: \"9e00e76d-2f89-454c-be3b-855e8186c78e\") " pod="swift-kuttl-tests/swift-proxy-7d8cf99555-f2jx8"
Jan 31 09:17:08 crc kubenswrapper[4732]: I0131 09:17:08.651773 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5k5xz\" (UniqueName: \"kubernetes.io/projected/9e00e76d-2f89-454c-be3b-855e8186c78e-kube-api-access-5k5xz\") pod \"swift-proxy-7d8cf99555-f2jx8\" (UID: \"9e00e76d-2f89-454c-be3b-855e8186c78e\") " pod="swift-kuttl-tests/swift-proxy-7d8cf99555-f2jx8"
Jan 31 09:17:09 crc kubenswrapper[4732]: I0131 09:17:09.039791 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/410ee08c-4c6c-4012-aa46-264179923617-etc-swift\") pod \"swift-storage-0\" (UID: \"410ee08c-4c6c-4012-aa46-264179923617\") " pod="swift-kuttl-tests/swift-storage-0"
Jan 31 09:17:09 crc kubenswrapper[4732]: E0131 09:17:09.040032 4732 projected.go:288] Couldn't get configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found
Jan 31 09:17:09 crc kubenswrapper[4732]: E0131 09:17:09.040051 4732 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found
Jan 31 09:17:09 crc kubenswrapper[4732]: E0131 09:17:09.040115 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/410ee08c-4c6c-4012-aa46-264179923617-etc-swift podName:410ee08c-4c6c-4012-aa46-264179923617 nodeName:}" failed. No retries permitted until 2026-01-31 09:17:11.040095615 +0000 UTC m=+969.345971819 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/410ee08c-4c6c-4012-aa46-264179923617-etc-swift") pod "swift-storage-0" (UID: "410ee08c-4c6c-4012-aa46-264179923617") : configmap "swift-ring-files" not found
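
Every etc-swift mount failure above has the same root cause: the projected volume sources the swift-ring-files ConfigMap, which the swift-ring-rebalance job has not published yet. A minimal client-go sketch for confirming whether that ConfigMap exists (the kubeconfig handling and output strings are illustrative assumptions, not part of the tooling that produced this log):

    package main

    import (
        "context"
        "fmt"
        "os"
        "path/filepath"

        apierrors "k8s.io/apimachinery/pkg/api/errors"
        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/tools/clientcmd"
    )

    func main() {
        // Assumption: run from outside the cluster with a local kubeconfig.
        kubeconfig := filepath.Join(os.Getenv("HOME"), ".kube", "config")
        cfg, err := clientcmd.BuildConfigFromFlags("", kubeconfig)
        if err != nil {
            panic(err)
        }
        client, err := kubernetes.NewForConfig(cfg)
        if err != nil {
            panic(err)
        }
        // Namespace and name are taken from the failing records above.
        _, err = client.CoreV1().ConfigMaps("swift-kuttl-tests").
            Get(context.TODO(), "swift-ring-files", metav1.GetOptions{})
        switch {
        case apierrors.IsNotFound(err):
            fmt.Println("configmap swift-ring-files not found; etc-swift mounts will keep failing")
        case err != nil:
            panic(err)
        default:
            fmt.Println("configmap present; the projected volume can now be set up")
        }
    }

Once the rebalance job finishes and the ConfigMap appears, the very same mounts succeed (see the MountVolume.SetUp succeeded records at 09:17:23-09:17:24 below).
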
Jan 31 09:17:09 crc kubenswrapper[4732]: I0131 09:17:09.140961 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9e00e76d-2f89-454c-be3b-855e8186c78e-etc-swift\") pod \"swift-proxy-7d8cf99555-f2jx8\" (UID: \"9e00e76d-2f89-454c-be3b-855e8186c78e\") " pod="swift-kuttl-tests/swift-proxy-7d8cf99555-f2jx8"
Jan 31 09:17:09 crc kubenswrapper[4732]: E0131 09:17:09.141191 4732 projected.go:288] Couldn't get configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found
Jan 31 09:17:09 crc kubenswrapper[4732]: E0131 09:17:09.141388 4732 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-proxy-7d8cf99555-f2jx8: configmap "swift-ring-files" not found
Jan 31 09:17:09 crc kubenswrapper[4732]: E0131 09:17:09.141448 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9e00e76d-2f89-454c-be3b-855e8186c78e-etc-swift podName:9e00e76d-2f89-454c-be3b-855e8186c78e nodeName:}" failed. No retries permitted until 2026-01-31 09:17:10.141430531 +0000 UTC m=+968.447306735 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/9e00e76d-2f89-454c-be3b-855e8186c78e-etc-swift") pod "swift-proxy-7d8cf99555-f2jx8" (UID: "9e00e76d-2f89-454c-be3b-855e8186c78e") : configmap "swift-ring-files" not found
Jan 31 09:17:09 crc kubenswrapper[4732]: I0131 09:17:09.510332 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-thntz" event={"ID":"98f46fa4-b924-478d-a6d5-6070a4a75aee","Type":"ContainerStarted","Data":"d4531d5f1fc9a03a2055376e4d89a22630b0cf87424acef5b26aef666e5742d5"}
Jan 31 09:17:10 crc kubenswrapper[4732]: I0131 09:17:10.155336 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9e00e76d-2f89-454c-be3b-855e8186c78e-etc-swift\") pod \"swift-proxy-7d8cf99555-f2jx8\" (UID: \"9e00e76d-2f89-454c-be3b-855e8186c78e\") " pod="swift-kuttl-tests/swift-proxy-7d8cf99555-f2jx8"
Jan 31 09:17:10 crc kubenswrapper[4732]: E0131 09:17:10.155480 4732 projected.go:288] Couldn't get configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found
Jan 31 09:17:10 crc kubenswrapper[4732]: E0131 09:17:10.155496 4732 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-proxy-7d8cf99555-f2jx8: configmap "swift-ring-files" not found
Jan 31 09:17:10 crc kubenswrapper[4732]: E0131 09:17:10.155547 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9e00e76d-2f89-454c-be3b-855e8186c78e-etc-swift podName:9e00e76d-2f89-454c-be3b-855e8186c78e nodeName:}" failed. No retries permitted until 2026-01-31 09:17:12.155531488 +0000 UTC m=+970.461407692 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/9e00e76d-2f89-454c-be3b-855e8186c78e-etc-swift") pod "swift-proxy-7d8cf99555-f2jx8" (UID: "9e00e76d-2f89-454c-be3b-855e8186c78e") : configmap "swift-ring-files" not found Jan 31 09:17:11 crc kubenswrapper[4732]: I0131 09:17:11.068350 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/410ee08c-4c6c-4012-aa46-264179923617-etc-swift\") pod \"swift-storage-0\" (UID: \"410ee08c-4c6c-4012-aa46-264179923617\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:17:11 crc kubenswrapper[4732]: E0131 09:17:11.068543 4732 projected.go:288] Couldn't get configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 31 09:17:11 crc kubenswrapper[4732]: E0131 09:17:11.068854 4732 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 31 09:17:11 crc kubenswrapper[4732]: E0131 09:17:11.068906 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/410ee08c-4c6c-4012-aa46-264179923617-etc-swift podName:410ee08c-4c6c-4012-aa46-264179923617 nodeName:}" failed. No retries permitted until 2026-01-31 09:17:15.068890986 +0000 UTC m=+973.374767190 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/410ee08c-4c6c-4012-aa46-264179923617-etc-swift") pod "swift-storage-0" (UID: "410ee08c-4c6c-4012-aa46-264179923617") : configmap "swift-ring-files" not found Jan 31 09:17:12 crc kubenswrapper[4732]: I0131 09:17:12.186262 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9e00e76d-2f89-454c-be3b-855e8186c78e-etc-swift\") pod \"swift-proxy-7d8cf99555-f2jx8\" (UID: \"9e00e76d-2f89-454c-be3b-855e8186c78e\") " pod="swift-kuttl-tests/swift-proxy-7d8cf99555-f2jx8" Jan 31 09:17:12 crc kubenswrapper[4732]: E0131 09:17:12.187303 4732 projected.go:288] Couldn't get configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 31 09:17:12 crc kubenswrapper[4732]: E0131 09:17:12.187336 4732 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-proxy-7d8cf99555-f2jx8: configmap "swift-ring-files" not found Jan 31 09:17:12 crc kubenswrapper[4732]: E0131 09:17:12.187419 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9e00e76d-2f89-454c-be3b-855e8186c78e-etc-swift podName:9e00e76d-2f89-454c-be3b-855e8186c78e nodeName:}" failed. No retries permitted until 2026-01-31 09:17:16.187390467 +0000 UTC m=+974.493266711 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/9e00e76d-2f89-454c-be3b-855e8186c78e-etc-swift") pod "swift-proxy-7d8cf99555-f2jx8" (UID: "9e00e76d-2f89-454c-be3b-855e8186c78e") : configmap "swift-ring-files" not found
Jan 31 09:17:12 crc kubenswrapper[4732]: I0131 09:17:12.536356 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-thntz" event={"ID":"98f46fa4-b924-478d-a6d5-6070a4a75aee","Type":"ContainerStarted","Data":"90c15d5ab54a0813e658db668ab02d84908650e855cd0db9d18d9e71000d2b80"}
Jan 31 09:17:12 crc kubenswrapper[4732]: I0131 09:17:12.563052 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-thntz" podStartSLOduration=2.7230280430000002 podStartE2EDuration="5.563035202s" podCreationTimestamp="2026-01-31 09:17:07 +0000 UTC" firstStartedPulling="2026-01-31 09:17:08.598901925 +0000 UTC m=+966.904778129" lastFinishedPulling="2026-01-31 09:17:11.438909034 +0000 UTC m=+969.744785288" observedRunningTime="2026-01-31 09:17:12.558596801 +0000 UTC m=+970.864473045" watchObservedRunningTime="2026-01-31 09:17:12.563035202 +0000 UTC m=+970.868911406"
Jan 31 09:17:15 crc kubenswrapper[4732]: I0131 09:17:15.133516 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/410ee08c-4c6c-4012-aa46-264179923617-etc-swift\") pod \"swift-storage-0\" (UID: \"410ee08c-4c6c-4012-aa46-264179923617\") " pod="swift-kuttl-tests/swift-storage-0"
Jan 31 09:17:15 crc kubenswrapper[4732]: E0131 09:17:15.133824 4732 projected.go:288] Couldn't get configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found
Jan 31 09:17:15 crc kubenswrapper[4732]: E0131 09:17:15.134122 4732 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found
Jan 31 09:17:15 crc kubenswrapper[4732]: E0131 09:17:15.134213 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/410ee08c-4c6c-4012-aa46-264179923617-etc-swift podName:410ee08c-4c6c-4012-aa46-264179923617 nodeName:}" failed. No retries permitted until 2026-01-31 09:17:23.134183944 +0000 UTC m=+981.440060178 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/410ee08c-4c6c-4012-aa46-264179923617-etc-swift") pod "swift-storage-0" (UID: "410ee08c-4c6c-4012-aa46-264179923617") : configmap "swift-ring-files" not found
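
Note the durationBeforeRetry progression for the same volume across these records: 500ms, 1s, 2s, 4s, and now 8s. That is the kubelet's per-operation exponential backoff for failed volume mounts. A self-contained sketch of the doubling pattern (the cap value is an assumption; the real constants live in nestedpendingoperations and may differ):

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        // Initial delay mirrors the first retry seen above (500ms).
        delay := 500 * time.Millisecond
        const maxDelay = 2 * time.Minute // assumed cap
        for attempt := 1; attempt <= 6; attempt++ {
            fmt.Printf("attempt %d: no retries permitted for %v\n", attempt, delay)
            delay *= 2 // double after every failure, as the log shows
            if delay > maxDelay {
                delay = maxDelay
            }
        }
    }
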
Jan 31 09:17:16 crc kubenswrapper[4732]: I0131 09:17:16.251605 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9e00e76d-2f89-454c-be3b-855e8186c78e-etc-swift\") pod \"swift-proxy-7d8cf99555-f2jx8\" (UID: \"9e00e76d-2f89-454c-be3b-855e8186c78e\") " pod="swift-kuttl-tests/swift-proxy-7d8cf99555-f2jx8"
Jan 31 09:17:16 crc kubenswrapper[4732]: E0131 09:17:16.251827 4732 projected.go:288] Couldn't get configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found
Jan 31 09:17:16 crc kubenswrapper[4732]: E0131 09:17:16.251842 4732 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-proxy-7d8cf99555-f2jx8: configmap "swift-ring-files" not found
Jan 31 09:17:16 crc kubenswrapper[4732]: E0131 09:17:16.251896 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9e00e76d-2f89-454c-be3b-855e8186c78e-etc-swift podName:9e00e76d-2f89-454c-be3b-855e8186c78e nodeName:}" failed. No retries permitted until 2026-01-31 09:17:24.251878124 +0000 UTC m=+982.557754328 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/9e00e76d-2f89-454c-be3b-855e8186c78e-etc-swift") pod "swift-proxy-7d8cf99555-f2jx8" (UID: "9e00e76d-2f89-454c-be3b-855e8186c78e") : configmap "swift-ring-files" not found
Jan 31 09:17:17 crc kubenswrapper[4732]: I0131 09:17:17.497723 4732 patch_prober.go:28] interesting pod/machine-config-daemon-jnbt8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 31 09:17:17 crc kubenswrapper[4732]: I0131 09:17:17.498050 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" podUID="7d790207-d357-4b47-87bf-5b505e061820" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 31 09:17:17 crc kubenswrapper[4732]: I0131 09:17:17.498098 4732 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8"
Jan 31 09:17:17 crc kubenswrapper[4732]: I0131 09:17:17.498721 4732 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7aa4af54c816ede2288939b5eacffccc23edb9afc7e2a36ef42fe01d52b4ae91"} pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Jan 31 09:17:17 crc kubenswrapper[4732]: I0131 09:17:17.498777 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" podUID="7d790207-d357-4b47-87bf-5b505e061820" containerName="machine-config-daemon" containerID="cri-o://7aa4af54c816ede2288939b5eacffccc23edb9afc7e2a36ef42fe01d52b4ae91" gracePeriod=600
Jan 31 09:17:18 crc kubenswrapper[4732]: I0131 09:17:18.586163 4732 generic.go:334] "Generic (PLEG): container finished" podID="98f46fa4-b924-478d-a6d5-6070a4a75aee" containerID="90c15d5ab54a0813e658db668ab02d84908650e855cd0db9d18d9e71000d2b80" exitCode=0
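
The liveness records above show the kubelet issuing GET http://127.0.0.1:8798/health, receiving connection refused, and killing the container with a 600s grace period so it can be restarted. A rough stand-in for that HTTP check (the timeout here is an assumption; the kubelet's prober honors the probe spec's own timeoutSeconds):

    package main

    import (
        "fmt"
        "net/http"
        "time"
    )

    // probe issues one HTTP liveness check the way the failed probe above is
    // reported: a transport error (here "connection refused") or a status of
    // 400+ counts as unhealthy.
    func probe(url string) error {
        client := &http.Client{Timeout: time.Second} // assumed timeout
        resp, err := client.Get(url)
        if err != nil {
            return err
        }
        defer resp.Body.Close()
        if resp.StatusCode < 200 || resp.StatusCode >= 400 {
            return fmt.Errorf("unexpected status %d", resp.StatusCode)
        }
        return nil
    }

    func main() {
        if err := probe("http://127.0.0.1:8798/health"); err != nil {
            fmt.Println("liveness probe failed:", err)
        }
    }
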
Jan 31 09:17:18 crc kubenswrapper[4732]: I0131 09:17:18.586242 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-thntz" event={"ID":"98f46fa4-b924-478d-a6d5-6070a4a75aee","Type":"ContainerDied","Data":"90c15d5ab54a0813e658db668ab02d84908650e855cd0db9d18d9e71000d2b80"}
Jan 31 09:17:18 crc kubenswrapper[4732]: I0131 09:17:18.597768 4732 generic.go:334] "Generic (PLEG): container finished" podID="7d790207-d357-4b47-87bf-5b505e061820" containerID="7aa4af54c816ede2288939b5eacffccc23edb9afc7e2a36ef42fe01d52b4ae91" exitCode=0
Jan 31 09:17:18 crc kubenswrapper[4732]: I0131 09:17:18.597822 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" event={"ID":"7d790207-d357-4b47-87bf-5b505e061820","Type":"ContainerDied","Data":"7aa4af54c816ede2288939b5eacffccc23edb9afc7e2a36ef42fe01d52b4ae91"}
Jan 31 09:17:18 crc kubenswrapper[4732]: I0131 09:17:18.597855 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" event={"ID":"7d790207-d357-4b47-87bf-5b505e061820","Type":"ContainerStarted","Data":"99271a603de3a603b9be8d8f0bb791de0f202646de27403a5e3efc59790f637f"}
Jan 31 09:17:18 crc kubenswrapper[4732]: I0131 09:17:18.597872 4732 scope.go:117] "RemoveContainer" containerID="e8d3fd1eb561cfd678a2a0df1de54d984c12dc8e05f74e816b693d4b18b74a20"
Jan 31 09:17:20 crc kubenswrapper[4732]: I0131 09:17:20.000100 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-thntz"
Jan 31 09:17:20 crc kubenswrapper[4732]: I0131 09:17:20.030102 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/98f46fa4-b924-478d-a6d5-6070a4a75aee-etc-swift\") pod \"98f46fa4-b924-478d-a6d5-6070a4a75aee\" (UID: \"98f46fa4-b924-478d-a6d5-6070a4a75aee\") "
Jan 31 09:17:20 crc kubenswrapper[4732]: I0131 09:17:20.030164 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/98f46fa4-b924-478d-a6d5-6070a4a75aee-dispersionconf\") pod \"98f46fa4-b924-478d-a6d5-6070a4a75aee\" (UID: \"98f46fa4-b924-478d-a6d5-6070a4a75aee\") "
Jan 31 09:17:20 crc kubenswrapper[4732]: I0131 09:17:20.030259 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/98f46fa4-b924-478d-a6d5-6070a4a75aee-swiftconf\") pod \"98f46fa4-b924-478d-a6d5-6070a4a75aee\" (UID: \"98f46fa4-b924-478d-a6d5-6070a4a75aee\") "
Jan 31 09:17:20 crc kubenswrapper[4732]: I0131 09:17:20.030295 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/98f46fa4-b924-478d-a6d5-6070a4a75aee-scripts\") pod \"98f46fa4-b924-478d-a6d5-6070a4a75aee\" (UID: \"98f46fa4-b924-478d-a6d5-6070a4a75aee\") "
Jan 31 09:17:20 crc kubenswrapper[4732]: I0131 09:17:20.030318 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/98f46fa4-b924-478d-a6d5-6070a4a75aee-ring-data-devices\") pod \"98f46fa4-b924-478d-a6d5-6070a4a75aee\" (UID: \"98f46fa4-b924-478d-a6d5-6070a4a75aee\") "
Jan 31
09:17:20 crc kubenswrapper[4732]: I0131 09:17:20.030353 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dtbpn\" (UniqueName: \"kubernetes.io/projected/98f46fa4-b924-478d-a6d5-6070a4a75aee-kube-api-access-dtbpn\") pod \"98f46fa4-b924-478d-a6d5-6070a4a75aee\" (UID: \"98f46fa4-b924-478d-a6d5-6070a4a75aee\") " Jan 31 09:17:20 crc kubenswrapper[4732]: I0131 09:17:20.032708 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98f46fa4-b924-478d-a6d5-6070a4a75aee-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "98f46fa4-b924-478d-a6d5-6070a4a75aee" (UID: "98f46fa4-b924-478d-a6d5-6070a4a75aee"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:17:20 crc kubenswrapper[4732]: I0131 09:17:20.036586 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98f46fa4-b924-478d-a6d5-6070a4a75aee-kube-api-access-dtbpn" (OuterVolumeSpecName: "kube-api-access-dtbpn") pod "98f46fa4-b924-478d-a6d5-6070a4a75aee" (UID: "98f46fa4-b924-478d-a6d5-6070a4a75aee"). InnerVolumeSpecName "kube-api-access-dtbpn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:17:20 crc kubenswrapper[4732]: I0131 09:17:20.033175 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98f46fa4-b924-478d-a6d5-6070a4a75aee-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "98f46fa4-b924-478d-a6d5-6070a4a75aee" (UID: "98f46fa4-b924-478d-a6d5-6070a4a75aee"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:17:20 crc kubenswrapper[4732]: I0131 09:17:20.051205 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98f46fa4-b924-478d-a6d5-6070a4a75aee-scripts" (OuterVolumeSpecName: "scripts") pod "98f46fa4-b924-478d-a6d5-6070a4a75aee" (UID: "98f46fa4-b924-478d-a6d5-6070a4a75aee"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:17:20 crc kubenswrapper[4732]: I0131 09:17:20.054195 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98f46fa4-b924-478d-a6d5-6070a4a75aee-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "98f46fa4-b924-478d-a6d5-6070a4a75aee" (UID: "98f46fa4-b924-478d-a6d5-6070a4a75aee"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:17:20 crc kubenswrapper[4732]: I0131 09:17:20.060818 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98f46fa4-b924-478d-a6d5-6070a4a75aee-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "98f46fa4-b924-478d-a6d5-6070a4a75aee" (UID: "98f46fa4-b924-478d-a6d5-6070a4a75aee"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:17:20 crc kubenswrapper[4732]: I0131 09:17:20.132078 4732 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/98f46fa4-b924-478d-a6d5-6070a4a75aee-dispersionconf\") on node \"crc\" DevicePath \"\"" Jan 31 09:17:20 crc kubenswrapper[4732]: I0131 09:17:20.132107 4732 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/98f46fa4-b924-478d-a6d5-6070a4a75aee-swiftconf\") on node \"crc\" DevicePath \"\"" Jan 31 09:17:20 crc kubenswrapper[4732]: I0131 09:17:20.132117 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/98f46fa4-b924-478d-a6d5-6070a4a75aee-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:17:20 crc kubenswrapper[4732]: I0131 09:17:20.132125 4732 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/98f46fa4-b924-478d-a6d5-6070a4a75aee-ring-data-devices\") on node \"crc\" DevicePath \"\"" Jan 31 09:17:20 crc kubenswrapper[4732]: I0131 09:17:20.132134 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dtbpn\" (UniqueName: \"kubernetes.io/projected/98f46fa4-b924-478d-a6d5-6070a4a75aee-kube-api-access-dtbpn\") on node \"crc\" DevicePath \"\"" Jan 31 09:17:20 crc kubenswrapper[4732]: I0131 09:17:20.132145 4732 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/98f46fa4-b924-478d-a6d5-6070a4a75aee-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 31 09:17:20 crc kubenswrapper[4732]: I0131 09:17:20.616716 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-thntz" event={"ID":"98f46fa4-b924-478d-a6d5-6070a4a75aee","Type":"ContainerDied","Data":"d4531d5f1fc9a03a2055376e4d89a22630b0cf87424acef5b26aef666e5742d5"} Jan 31 09:17:20 crc kubenswrapper[4732]: I0131 09:17:20.616775 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d4531d5f1fc9a03a2055376e4d89a22630b0cf87424acef5b26aef666e5742d5" Jan 31 09:17:20 crc kubenswrapper[4732]: I0131 09:17:20.616739 4732 util.go:48] "No ready sandbox for pod can be found. 
Jan 31 09:17:20 crc kubenswrapper[4732]: I0131 09:17:20.864692 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-ring-rebalance-thntz_98f46fa4-b924-478d-a6d5-6070a4a75aee/swift-ring-rebalance/0.log"
Jan 31 09:17:22 crc kubenswrapper[4732]: I0131 09:17:22.433846 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-ring-rebalance-thntz_98f46fa4-b924-478d-a6d5-6070a4a75aee/swift-ring-rebalance/0.log"
Jan 31 09:17:23 crc kubenswrapper[4732]: I0131 09:17:23.173155 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/410ee08c-4c6c-4012-aa46-264179923617-etc-swift\") pod \"swift-storage-0\" (UID: \"410ee08c-4c6c-4012-aa46-264179923617\") " pod="swift-kuttl-tests/swift-storage-0"
Jan 31 09:17:23 crc kubenswrapper[4732]: I0131 09:17:23.180196 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/410ee08c-4c6c-4012-aa46-264179923617-etc-swift\") pod \"swift-storage-0\" (UID: \"410ee08c-4c6c-4012-aa46-264179923617\") " pod="swift-kuttl-tests/swift-storage-0"
Jan 31 09:17:23 crc kubenswrapper[4732]: I0131 09:17:23.478198 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-storage-0"
Jan 31 09:17:24 crc kubenswrapper[4732]: I0131 09:17:23.999892 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-ring-rebalance-thntz_98f46fa4-b924-478d-a6d5-6070a4a75aee/swift-ring-rebalance/0.log"
Jan 31 09:17:24 crc kubenswrapper[4732]: I0131 09:17:24.005843 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-storage-0"]
Jan 31 09:17:24 crc kubenswrapper[4732]: I0131 09:17:24.294478 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9e00e76d-2f89-454c-be3b-855e8186c78e-etc-swift\") pod \"swift-proxy-7d8cf99555-f2jx8\" (UID: \"9e00e76d-2f89-454c-be3b-855e8186c78e\") " pod="swift-kuttl-tests/swift-proxy-7d8cf99555-f2jx8"
Jan 31 09:17:24 crc kubenswrapper[4732]: I0131 09:17:24.301069 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9e00e76d-2f89-454c-be3b-855e8186c78e-etc-swift\") pod \"swift-proxy-7d8cf99555-f2jx8\" (UID: \"9e00e76d-2f89-454c-be3b-855e8186c78e\") " pod="swift-kuttl-tests/swift-proxy-7d8cf99555-f2jx8"
Jan 31 09:17:24 crc kubenswrapper[4732]: I0131 09:17:24.587845 4732 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="swift-kuttl-tests/swift-proxy-7d8cf99555-f2jx8" Jan 31 09:17:24 crc kubenswrapper[4732]: I0131 09:17:24.669798 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"410ee08c-4c6c-4012-aa46-264179923617","Type":"ContainerStarted","Data":"bd56bd5520be61343a090a3833bf69ab0722a78470f84b63a8f6a8b06d85cd3e"} Jan 31 09:17:25 crc kubenswrapper[4732]: I0131 09:17:25.063483 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-proxy-7d8cf99555-f2jx8"] Jan 31 09:17:25 crc kubenswrapper[4732]: I0131 09:17:25.540216 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-ring-rebalance-thntz_98f46fa4-b924-478d-a6d5-6070a4a75aee/swift-ring-rebalance/0.log" Jan 31 09:17:25 crc kubenswrapper[4732]: I0131 09:17:25.690292 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-proxy-7d8cf99555-f2jx8" event={"ID":"9e00e76d-2f89-454c-be3b-855e8186c78e","Type":"ContainerStarted","Data":"9fc5b88fc650585c4cf7595095cbcdfc58d2b17c1bb6d6f6621c5213c78ab1f0"} Jan 31 09:17:25 crc kubenswrapper[4732]: I0131 09:17:25.690377 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-proxy-7d8cf99555-f2jx8" event={"ID":"9e00e76d-2f89-454c-be3b-855e8186c78e","Type":"ContainerStarted","Data":"5603cb6f58751f09a186156f4f600f7ee6585d6eeac0b6ca57c68aa2fa8a4f68"} Jan 31 09:17:25 crc kubenswrapper[4732]: I0131 09:17:25.690409 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-proxy-7d8cf99555-f2jx8" event={"ID":"9e00e76d-2f89-454c-be3b-855e8186c78e","Type":"ContainerStarted","Data":"70a5df12fcc55bf7e0349357dc9e7a70341a9c65f4cf5241e077476ae04d6820"} Jan 31 09:17:25 crc kubenswrapper[4732]: I0131 09:17:25.691820 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="swift-kuttl-tests/swift-proxy-7d8cf99555-f2jx8" Jan 31 09:17:25 crc kubenswrapper[4732]: I0131 09:17:25.691871 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="swift-kuttl-tests/swift-proxy-7d8cf99555-f2jx8" Jan 31 09:17:25 crc kubenswrapper[4732]: I0131 09:17:25.697153 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"410ee08c-4c6c-4012-aa46-264179923617","Type":"ContainerStarted","Data":"dbfd2c11334695f732524b533be449dd61ebf7d7637f32b3bcfe1d1941b37863"} Jan 31 09:17:25 crc kubenswrapper[4732]: I0131 09:17:25.697195 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"410ee08c-4c6c-4012-aa46-264179923617","Type":"ContainerStarted","Data":"b63fa3de54dc547aaba5e3c717d85f707b81c7144e50e21b9e827db634951724"} Jan 31 09:17:25 crc kubenswrapper[4732]: I0131 09:17:25.697211 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"410ee08c-4c6c-4012-aa46-264179923617","Type":"ContainerStarted","Data":"61cb3b1d16a1d1d363ee3738835e89a0f2f6d3986bf10918c00e484f7f8c9ef7"} Jan 31 09:17:25 crc kubenswrapper[4732]: I0131 09:17:25.716741 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-proxy-7d8cf99555-f2jx8" podStartSLOduration=17.716657599 podStartE2EDuration="17.716657599s" podCreationTimestamp="2026-01-31 09:17:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:17:25.709561409 +0000 UTC 
m=+984.015437633" watchObservedRunningTime="2026-01-31 09:17:25.716657599 +0000 UTC m=+984.022533843" Jan 31 09:17:26 crc kubenswrapper[4732]: I0131 09:17:26.705256 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"410ee08c-4c6c-4012-aa46-264179923617","Type":"ContainerStarted","Data":"7f58b9ade96eae3a4c9b405f187c769232309e0c1fb241bf5444c4c304ab649f"} Jan 31 09:17:26 crc kubenswrapper[4732]: I0131 09:17:26.705569 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"410ee08c-4c6c-4012-aa46-264179923617","Type":"ContainerStarted","Data":"6a55e95bf79b05700a43bb7800d35ec7f862e81feb1caac0721521392f5f8e7f"} Jan 31 09:17:27 crc kubenswrapper[4732]: I0131 09:17:27.134828 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-ring-rebalance-thntz_98f46fa4-b924-478d-a6d5-6070a4a75aee/swift-ring-rebalance/0.log" Jan 31 09:17:27 crc kubenswrapper[4732]: I0131 09:17:27.717623 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"410ee08c-4c6c-4012-aa46-264179923617","Type":"ContainerStarted","Data":"20b5017b8e043011a79ad2661ea73a50a94d52bc115728a4ab154b985c7430df"} Jan 31 09:17:27 crc kubenswrapper[4732]: I0131 09:17:27.717709 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"410ee08c-4c6c-4012-aa46-264179923617","Type":"ContainerStarted","Data":"73d8c40ae242fa0366cc532a47eefd15e47b7f5aeeb79eb1c8113e496beaf6e2"} Jan 31 09:17:27 crc kubenswrapper[4732]: I0131 09:17:27.717729 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"410ee08c-4c6c-4012-aa46-264179923617","Type":"ContainerStarted","Data":"2ab9eec16ab6fbc15e432a8ef644dbb519f0a4dd256590fb6f9676ffceba4611"} Jan 31 09:17:28 crc kubenswrapper[4732]: I0131 09:17:28.731721 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"410ee08c-4c6c-4012-aa46-264179923617","Type":"ContainerStarted","Data":"5e6d65fa1c26ce7ebf7ff25c277b074ca08845044a94dbdd6b32ff300ac68b7e"} Jan 31 09:17:28 crc kubenswrapper[4732]: I0131 09:17:28.732003 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"410ee08c-4c6c-4012-aa46-264179923617","Type":"ContainerStarted","Data":"b045d34930a290c72208028dc6d1d6b3535e4b8818d51998647df06b3f7c2bdc"} Jan 31 09:17:28 crc kubenswrapper[4732]: I0131 09:17:28.732016 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"410ee08c-4c6c-4012-aa46-264179923617","Type":"ContainerStarted","Data":"288678dcae1157f1c7ebe15c590533bc3accdd196dbe20808122f33180c2092e"} Jan 31 09:17:28 crc kubenswrapper[4732]: I0131 09:17:28.732025 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"410ee08c-4c6c-4012-aa46-264179923617","Type":"ContainerStarted","Data":"7907616b4b6ec30fa97533949de2bee3dfcee1ddac0b222cb83f6c75835a1755"} Jan 31 09:17:28 crc kubenswrapper[4732]: I0131 09:17:28.745346 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-ring-rebalance-thntz_98f46fa4-b924-478d-a6d5-6070a4a75aee/swift-ring-rebalance/0.log" Jan 31 09:17:29 crc kubenswrapper[4732]: I0131 09:17:29.748017 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" 
event={"ID":"410ee08c-4c6c-4012-aa46-264179923617","Type":"ContainerStarted","Data":"81f782a2d54156b4990bcbb2dab091aeb970934a10ac85e7abc67c813255ffee"} Jan 31 09:17:29 crc kubenswrapper[4732]: I0131 09:17:29.748060 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"410ee08c-4c6c-4012-aa46-264179923617","Type":"ContainerStarted","Data":"db1b9074ef0795f714675af65f50c4b87204369e4a7ad86e330547b5ab8f05fb"} Jan 31 09:17:29 crc kubenswrapper[4732]: I0131 09:17:29.748070 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"410ee08c-4c6c-4012-aa46-264179923617","Type":"ContainerStarted","Data":"7f3c24b1866708a685d278257ae506a6fd79ff0289067bb9f41d1ad991336cc6"} Jan 31 09:17:29 crc kubenswrapper[4732]: I0131 09:17:29.787436 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-storage-0" podStartSLOduration=19.813508143 podStartE2EDuration="23.787415409s" podCreationTimestamp="2026-01-31 09:17:06 +0000 UTC" firstStartedPulling="2026-01-31 09:17:24.006800774 +0000 UTC m=+982.312676978" lastFinishedPulling="2026-01-31 09:17:27.98070804 +0000 UTC m=+986.286584244" observedRunningTime="2026-01-31 09:17:29.78136795 +0000 UTC m=+988.087244194" watchObservedRunningTime="2026-01-31 09:17:29.787415409 +0000 UTC m=+988.093291613" Jan 31 09:17:30 crc kubenswrapper[4732]: I0131 09:17:30.278365 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-ring-rebalance-thntz_98f46fa4-b924-478d-a6d5-6070a4a75aee/swift-ring-rebalance/0.log" Jan 31 09:17:31 crc kubenswrapper[4732]: I0131 09:17:31.817340 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-ring-rebalance-thntz_98f46fa4-b924-478d-a6d5-6070a4a75aee/swift-ring-rebalance/0.log" Jan 31 09:17:33 crc kubenswrapper[4732]: I0131 09:17:33.424969 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-ring-rebalance-thntz_98f46fa4-b924-478d-a6d5-6070a4a75aee/swift-ring-rebalance/0.log" Jan 31 09:17:34 crc kubenswrapper[4732]: I0131 09:17:34.590230 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="swift-kuttl-tests/swift-proxy-7d8cf99555-f2jx8" Jan 31 09:17:34 crc kubenswrapper[4732]: I0131 09:17:34.591693 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="swift-kuttl-tests/swift-proxy-7d8cf99555-f2jx8" Jan 31 09:17:34 crc kubenswrapper[4732]: I0131 09:17:34.979111 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-ring-rebalance-thntz_98f46fa4-b924-478d-a6d5-6070a4a75aee/swift-ring-rebalance/0.log" Jan 31 09:17:36 crc kubenswrapper[4732]: I0131 09:17:36.538219 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-ring-rebalance-thntz_98f46fa4-b924-478d-a6d5-6070a4a75aee/swift-ring-rebalance/0.log" Jan 31 09:17:37 crc kubenswrapper[4732]: I0131 09:17:37.904122 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-storage-1"] Jan 31 09:17:37 crc kubenswrapper[4732]: E0131 09:17:37.904514 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98f46fa4-b924-478d-a6d5-6070a4a75aee" containerName="swift-ring-rebalance" Jan 31 09:17:37 crc kubenswrapper[4732]: I0131 09:17:37.904529 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="98f46fa4-b924-478d-a6d5-6070a4a75aee" containerName="swift-ring-rebalance" Jan 31 09:17:37 crc 
Jan 31 09:17:37 crc kubenswrapper[4732]: I0131 09:17:37.908718 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-storage-1"
Jan 31 09:17:37 crc kubenswrapper[4732]: I0131 09:17:37.910161 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-storage-2"]
Jan 31 09:17:37 crc kubenswrapper[4732]: I0131 09:17:37.915453 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-storage-2"
Jan 31 09:17:37 crc kubenswrapper[4732]: I0131 09:17:37.924464 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-storage-1"]
Jan 31 09:17:37 crc kubenswrapper[4732]: I0131 09:17:37.934047 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-storage-2"]
Jan 31 09:17:38 crc kubenswrapper[4732]: I0131 09:17:38.023924 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"swift-storage-2\" (UID: \"20196d3e-600c-4a25-97ef-86f81bfae43b\") " pod="swift-kuttl-tests/swift-storage-2"
Jan 31 09:17:38 crc kubenswrapper[4732]: I0131 09:17:38.024012 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/485e2c17-77f1-4b13-ad2a-1afe1034b82e-etc-swift\") pod \"swift-storage-1\" (UID: \"485e2c17-77f1-4b13-ad2a-1afe1034b82e\") " pod="swift-kuttl-tests/swift-storage-1"
Jan 31 09:17:38 crc kubenswrapper[4732]: I0131 09:17:38.024049 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gtpt\" (UniqueName: \"kubernetes.io/projected/20196d3e-600c-4a25-97ef-86f81bfae43b-kube-api-access-5gtpt\") pod \"swift-storage-2\" (UID: \"20196d3e-600c-4a25-97ef-86f81bfae43b\") " pod="swift-kuttl-tests/swift-storage-2"
Jan 31 09:17:38 crc kubenswrapper[4732]: I0131 09:17:38.024066 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/20196d3e-600c-4a25-97ef-86f81bfae43b-cache\") pod \"swift-storage-2\" (UID: \"20196d3e-600c-4a25-97ef-86f81bfae43b\") " pod="swift-kuttl-tests/swift-storage-2"
Jan 31 09:17:38 crc kubenswrapper[4732]: I0131 09:17:38.024083 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/20196d3e-600c-4a25-97ef-86f81bfae43b-etc-swift\") pod \"swift-storage-2\" (UID: \"20196d3e-600c-4a25-97ef-86f81bfae43b\") " pod="swift-kuttl-tests/swift-storage-2"
Jan 31 09:17:38 crc kubenswrapper[4732]: I0131 09:17:38.024096 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/20196d3e-600c-4a25-97ef-86f81bfae43b-lock\") pod \"swift-storage-2\" (UID: \"20196d3e-600c-4a25-97ef-86f81bfae43b\") " pod="swift-kuttl-tests/swift-storage-2"
Jan 31 09:17:38 crc kubenswrapper[4732]: I0131 09:17:38.024120 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/485e2c17-77f1-4b13-ad2a-1afe1034b82e-lock\") pod
\"swift-storage-1\" (UID: \"485e2c17-77f1-4b13-ad2a-1afe1034b82e\") " pod="swift-kuttl-tests/swift-storage-1" Jan 31 09:17:38 crc kubenswrapper[4732]: I0131 09:17:38.024160 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/485e2c17-77f1-4b13-ad2a-1afe1034b82e-cache\") pod \"swift-storage-1\" (UID: \"485e2c17-77f1-4b13-ad2a-1afe1034b82e\") " pod="swift-kuttl-tests/swift-storage-1" Jan 31 09:17:38 crc kubenswrapper[4732]: I0131 09:17:38.024184 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqsrn\" (UniqueName: \"kubernetes.io/projected/485e2c17-77f1-4b13-ad2a-1afe1034b82e-kube-api-access-kqsrn\") pod \"swift-storage-1\" (UID: \"485e2c17-77f1-4b13-ad2a-1afe1034b82e\") " pod="swift-kuttl-tests/swift-storage-1" Jan 31 09:17:38 crc kubenswrapper[4732]: I0131 09:17:38.024222 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-1\" (UID: \"485e2c17-77f1-4b13-ad2a-1afe1034b82e\") " pod="swift-kuttl-tests/swift-storage-1" Jan 31 09:17:38 crc kubenswrapper[4732]: I0131 09:17:38.125989 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"swift-storage-2\" (UID: \"20196d3e-600c-4a25-97ef-86f81bfae43b\") " pod="swift-kuttl-tests/swift-storage-2" Jan 31 09:17:38 crc kubenswrapper[4732]: I0131 09:17:38.126044 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/485e2c17-77f1-4b13-ad2a-1afe1034b82e-etc-swift\") pod \"swift-storage-1\" (UID: \"485e2c17-77f1-4b13-ad2a-1afe1034b82e\") " pod="swift-kuttl-tests/swift-storage-1" Jan 31 09:17:38 crc kubenswrapper[4732]: I0131 09:17:38.126085 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gtpt\" (UniqueName: \"kubernetes.io/projected/20196d3e-600c-4a25-97ef-86f81bfae43b-kube-api-access-5gtpt\") pod \"swift-storage-2\" (UID: \"20196d3e-600c-4a25-97ef-86f81bfae43b\") " pod="swift-kuttl-tests/swift-storage-2" Jan 31 09:17:38 crc kubenswrapper[4732]: I0131 09:17:38.126121 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/20196d3e-600c-4a25-97ef-86f81bfae43b-cache\") pod \"swift-storage-2\" (UID: \"20196d3e-600c-4a25-97ef-86f81bfae43b\") " pod="swift-kuttl-tests/swift-storage-2" Jan 31 09:17:38 crc kubenswrapper[4732]: I0131 09:17:38.126142 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/20196d3e-600c-4a25-97ef-86f81bfae43b-etc-swift\") pod \"swift-storage-2\" (UID: \"20196d3e-600c-4a25-97ef-86f81bfae43b\") " pod="swift-kuttl-tests/swift-storage-2" Jan 31 09:17:38 crc kubenswrapper[4732]: I0131 09:17:38.126159 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/20196d3e-600c-4a25-97ef-86f81bfae43b-lock\") pod \"swift-storage-2\" (UID: \"20196d3e-600c-4a25-97ef-86f81bfae43b\") " pod="swift-kuttl-tests/swift-storage-2" Jan 31 09:17:38 crc kubenswrapper[4732]: I0131 09:17:38.126190 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"lock\" (UniqueName: \"kubernetes.io/empty-dir/485e2c17-77f1-4b13-ad2a-1afe1034b82e-lock\") pod \"swift-storage-1\" (UID: \"485e2c17-77f1-4b13-ad2a-1afe1034b82e\") " pod="swift-kuttl-tests/swift-storage-1" Jan 31 09:17:38 crc kubenswrapper[4732]: I0131 09:17:38.126232 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/485e2c17-77f1-4b13-ad2a-1afe1034b82e-cache\") pod \"swift-storage-1\" (UID: \"485e2c17-77f1-4b13-ad2a-1afe1034b82e\") " pod="swift-kuttl-tests/swift-storage-1" Jan 31 09:17:38 crc kubenswrapper[4732]: I0131 09:17:38.126260 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqsrn\" (UniqueName: \"kubernetes.io/projected/485e2c17-77f1-4b13-ad2a-1afe1034b82e-kube-api-access-kqsrn\") pod \"swift-storage-1\" (UID: \"485e2c17-77f1-4b13-ad2a-1afe1034b82e\") " pod="swift-kuttl-tests/swift-storage-1" Jan 31 09:17:38 crc kubenswrapper[4732]: I0131 09:17:38.126304 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-1\" (UID: \"485e2c17-77f1-4b13-ad2a-1afe1034b82e\") " pod="swift-kuttl-tests/swift-storage-1" Jan 31 09:17:38 crc kubenswrapper[4732]: I0131 09:17:38.126423 4732 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"swift-storage-2\" (UID: \"20196d3e-600c-4a25-97ef-86f81bfae43b\") device mount path \"/mnt/openstack/pv03\"" pod="swift-kuttl-tests/swift-storage-2" Jan 31 09:17:38 crc kubenswrapper[4732]: I0131 09:17:38.126502 4732 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-1\" (UID: \"485e2c17-77f1-4b13-ad2a-1afe1034b82e\") device mount path \"/mnt/openstack/pv10\"" pod="swift-kuttl-tests/swift-storage-1" Jan 31 09:17:38 crc kubenswrapper[4732]: I0131 09:17:38.127241 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/20196d3e-600c-4a25-97ef-86f81bfae43b-lock\") pod \"swift-storage-2\" (UID: \"20196d3e-600c-4a25-97ef-86f81bfae43b\") " pod="swift-kuttl-tests/swift-storage-2" Jan 31 09:17:38 crc kubenswrapper[4732]: I0131 09:17:38.127242 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/485e2c17-77f1-4b13-ad2a-1afe1034b82e-cache\") pod \"swift-storage-1\" (UID: \"485e2c17-77f1-4b13-ad2a-1afe1034b82e\") " pod="swift-kuttl-tests/swift-storage-1" Jan 31 09:17:38 crc kubenswrapper[4732]: I0131 09:17:38.127263 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/20196d3e-600c-4a25-97ef-86f81bfae43b-cache\") pod \"swift-storage-2\" (UID: \"20196d3e-600c-4a25-97ef-86f81bfae43b\") " pod="swift-kuttl-tests/swift-storage-2" Jan 31 09:17:38 crc kubenswrapper[4732]: I0131 09:17:38.127309 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/485e2c17-77f1-4b13-ad2a-1afe1034b82e-lock\") pod \"swift-storage-1\" (UID: \"485e2c17-77f1-4b13-ad2a-1afe1034b82e\") " pod="swift-kuttl-tests/swift-storage-1" Jan 31 09:17:38 crc kubenswrapper[4732]: I0131 09:17:38.145690 4732 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/485e2c17-77f1-4b13-ad2a-1afe1034b82e-etc-swift\") pod \"swift-storage-1\" (UID: \"485e2c17-77f1-4b13-ad2a-1afe1034b82e\") " pod="swift-kuttl-tests/swift-storage-1" Jan 31 09:17:38 crc kubenswrapper[4732]: I0131 09:17:38.145971 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/20196d3e-600c-4a25-97ef-86f81bfae43b-etc-swift\") pod \"swift-storage-2\" (UID: \"20196d3e-600c-4a25-97ef-86f81bfae43b\") " pod="swift-kuttl-tests/swift-storage-2" Jan 31 09:17:38 crc kubenswrapper[4732]: I0131 09:17:38.150023 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqsrn\" (UniqueName: \"kubernetes.io/projected/485e2c17-77f1-4b13-ad2a-1afe1034b82e-kube-api-access-kqsrn\") pod \"swift-storage-1\" (UID: \"485e2c17-77f1-4b13-ad2a-1afe1034b82e\") " pod="swift-kuttl-tests/swift-storage-1" Jan 31 09:17:38 crc kubenswrapper[4732]: I0131 09:17:38.150149 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gtpt\" (UniqueName: \"kubernetes.io/projected/20196d3e-600c-4a25-97ef-86f81bfae43b-kube-api-access-5gtpt\") pod \"swift-storage-2\" (UID: \"20196d3e-600c-4a25-97ef-86f81bfae43b\") " pod="swift-kuttl-tests/swift-storage-2" Jan 31 09:17:38 crc kubenswrapper[4732]: I0131 09:17:38.150993 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-1\" (UID: \"485e2c17-77f1-4b13-ad2a-1afe1034b82e\") " pod="swift-kuttl-tests/swift-storage-1" Jan 31 09:17:38 crc kubenswrapper[4732]: I0131 09:17:38.154886 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"swift-storage-2\" (UID: \"20196d3e-600c-4a25-97ef-86f81bfae43b\") " pod="swift-kuttl-tests/swift-storage-2" Jan 31 09:17:38 crc kubenswrapper[4732]: I0131 09:17:38.248789 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-storage-1" Jan 31 09:17:38 crc kubenswrapper[4732]: I0131 09:17:38.260069 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-storage-2" Jan 31 09:17:38 crc kubenswrapper[4732]: I0131 09:17:38.726330 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-storage-1"] Jan 31 09:17:38 crc kubenswrapper[4732]: W0131 09:17:38.735063 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod485e2c17_77f1_4b13_ad2a_1afe1034b82e.slice/crio-74e38e927b7f9f5222a587afffa38799ef87d240853e804c1d4e908457454209 WatchSource:0}: Error finding container 74e38e927b7f9f5222a587afffa38799ef87d240853e804c1d4e908457454209: Status 404 returned error can't find the container with id 74e38e927b7f9f5222a587afffa38799ef87d240853e804c1d4e908457454209 Jan 31 09:17:38 crc kubenswrapper[4732]: I0131 09:17:38.803071 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-storage-2"] Jan 31 09:17:38 crc kubenswrapper[4732]: W0131 09:17:38.804500 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod20196d3e_600c_4a25_97ef_86f81bfae43b.slice/crio-87f5d0b315407f4d9f600594a48f694f95c4330a184853c9d3591930eb2df4a1 WatchSource:0}: Error finding container 87f5d0b315407f4d9f600594a48f694f95c4330a184853c9d3591930eb2df4a1: Status 404 returned error can't find the container with id 87f5d0b315407f4d9f600594a48f694f95c4330a184853c9d3591930eb2df4a1 Jan 31 09:17:38 crc kubenswrapper[4732]: I0131 09:17:38.810117 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"485e2c17-77f1-4b13-ad2a-1afe1034b82e","Type":"ContainerStarted","Data":"74e38e927b7f9f5222a587afffa38799ef87d240853e804c1d4e908457454209"} Jan 31 09:17:39 crc kubenswrapper[4732]: I0131 09:17:39.820697 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"20196d3e-600c-4a25-97ef-86f81bfae43b","Type":"ContainerStarted","Data":"3ca9c58e45467d768a6b9a7ce40fa6bc0ba4cec3710eae69487dc0106ba42c67"} Jan 31 09:17:39 crc kubenswrapper[4732]: I0131 09:17:39.821069 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"20196d3e-600c-4a25-97ef-86f81bfae43b","Type":"ContainerStarted","Data":"f6d112eeaf369a7cae9a04806380ef040d7989006ab740a5b0d945ef5f7f317b"} Jan 31 09:17:39 crc kubenswrapper[4732]: I0131 09:17:39.821085 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"20196d3e-600c-4a25-97ef-86f81bfae43b","Type":"ContainerStarted","Data":"5d118436256aaad4b93c230d8fa0456e8cc6f3c5ca6ce4c9d7f30f8db3c450ff"} Jan 31 09:17:39 crc kubenswrapper[4732]: I0131 09:17:39.821098 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"20196d3e-600c-4a25-97ef-86f81bfae43b","Type":"ContainerStarted","Data":"bcb9d265c1a8c03fc26ba3b2f05fa7fcceb6cc59b711a2e0dd620cf846db5469"} Jan 31 09:17:39 crc kubenswrapper[4732]: I0131 09:17:39.821112 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"20196d3e-600c-4a25-97ef-86f81bfae43b","Type":"ContainerStarted","Data":"f47b66d5b1e0a89ad9b955fbf6dde3f0b21e522e86e33f379a1e2379b7919989"} Jan 31 09:17:39 crc kubenswrapper[4732]: I0131 09:17:39.821123 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" 
event={"ID":"20196d3e-600c-4a25-97ef-86f81bfae43b","Type":"ContainerStarted","Data":"87f5d0b315407f4d9f600594a48f694f95c4330a184853c9d3591930eb2df4a1"} Jan 31 09:17:39 crc kubenswrapper[4732]: I0131 09:17:39.825858 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"485e2c17-77f1-4b13-ad2a-1afe1034b82e","Type":"ContainerStarted","Data":"0630d2ca7e09803aac756e9568f7669b86f328b502c8fbec24c1ab333009da4d"} Jan 31 09:17:39 crc kubenswrapper[4732]: I0131 09:17:39.825907 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"485e2c17-77f1-4b13-ad2a-1afe1034b82e","Type":"ContainerStarted","Data":"f72f8dedea2940a6aab6b6081a6ac32f4fe8a817a636c7292b3740e840aacaf8"} Jan 31 09:17:39 crc kubenswrapper[4732]: I0131 09:17:39.825921 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"485e2c17-77f1-4b13-ad2a-1afe1034b82e","Type":"ContainerStarted","Data":"069a866bbc2edc4e0416ac1351e656e79b93160d557be633a57aa05933c9d37a"} Jan 31 09:17:39 crc kubenswrapper[4732]: I0131 09:17:39.825934 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"485e2c17-77f1-4b13-ad2a-1afe1034b82e","Type":"ContainerStarted","Data":"0eb6a32653dab2879fd0e39ad42f712d86e6b8054d3507176f2d9dfda652811b"} Jan 31 09:17:39 crc kubenswrapper[4732]: I0131 09:17:39.825945 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"485e2c17-77f1-4b13-ad2a-1afe1034b82e","Type":"ContainerStarted","Data":"ef329846cba5f043851b0fac85462ce13c21ac2e48891f331c0b61745c33b6de"} Jan 31 09:17:40 crc kubenswrapper[4732]: I0131 09:17:40.842145 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"485e2c17-77f1-4b13-ad2a-1afe1034b82e","Type":"ContainerStarted","Data":"0939cdb203f97f24dcd9e0d0769010bdb9148423fa723be6da835b8c3fcd94a3"} Jan 31 09:17:40 crc kubenswrapper[4732]: I0131 09:17:40.842467 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"485e2c17-77f1-4b13-ad2a-1afe1034b82e","Type":"ContainerStarted","Data":"f59d6e8e79e5f65eaab0514cd4a4f2da1508f44198f23637971d0e31f2b0f2be"} Jan 31 09:17:40 crc kubenswrapper[4732]: I0131 09:17:40.842478 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"485e2c17-77f1-4b13-ad2a-1afe1034b82e","Type":"ContainerStarted","Data":"6c6f43f8261cf2448edf43a43564636ba04ed0b128e0df16f2584d3972a973a9"} Jan 31 09:17:40 crc kubenswrapper[4732]: I0131 09:17:40.842487 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"485e2c17-77f1-4b13-ad2a-1afe1034b82e","Type":"ContainerStarted","Data":"b26aaf7029934891de5d0e92f8463ca1f8e106c401d898590351656c24bbc6e8"} Jan 31 09:17:40 crc kubenswrapper[4732]: I0131 09:17:40.842495 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"485e2c17-77f1-4b13-ad2a-1afe1034b82e","Type":"ContainerStarted","Data":"ecfb0c165d8f537d03e0e64cf12e24a1575f070da3670ea4769b182ee59dcc20"} Jan 31 09:17:40 crc kubenswrapper[4732]: I0131 09:17:40.847881 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" 
event={"ID":"20196d3e-600c-4a25-97ef-86f81bfae43b","Type":"ContainerStarted","Data":"23a3dbdcb5e8c0d9b5e2b66e270ff60b28361fccb19ac39e68496d454544c773"} Jan 31 09:17:40 crc kubenswrapper[4732]: I0131 09:17:40.847920 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"20196d3e-600c-4a25-97ef-86f81bfae43b","Type":"ContainerStarted","Data":"95aedeb15c9b618726f462f7211b9b97027622f3a2b198dc9717394e81868db5"} Jan 31 09:17:40 crc kubenswrapper[4732]: I0131 09:17:40.847931 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"20196d3e-600c-4a25-97ef-86f81bfae43b","Type":"ContainerStarted","Data":"dcca573b69212cf173983d6090be0a12978f5decb7c01cbd3779f1d4d2a6b0d0"} Jan 31 09:17:40 crc kubenswrapper[4732]: I0131 09:17:40.847942 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"20196d3e-600c-4a25-97ef-86f81bfae43b","Type":"ContainerStarted","Data":"de878ffff4963739e745def44cb56f453584bcd3f75e8cb3d9b85f0ca8d4e512"} Jan 31 09:17:40 crc kubenswrapper[4732]: I0131 09:17:40.847951 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"20196d3e-600c-4a25-97ef-86f81bfae43b","Type":"ContainerStarted","Data":"750de579c99e37a98c19df2a11fee436a72f515c72747a3e91a0abc97cd07385"} Jan 31 09:17:41 crc kubenswrapper[4732]: I0131 09:17:41.852227 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-thntz"] Jan 31 09:17:41 crc kubenswrapper[4732]: I0131 09:17:41.858254 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-thntz"] Jan 31 09:17:41 crc kubenswrapper[4732]: I0131 09:17:41.864431 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"20196d3e-600c-4a25-97ef-86f81bfae43b","Type":"ContainerStarted","Data":"07ba04fcdc461974d0c6b09ac9da6b0c6610f36749f50f5eb47e063be68f5291"} Jan 31 09:17:41 crc kubenswrapper[4732]: I0131 09:17:41.864648 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"20196d3e-600c-4a25-97ef-86f81bfae43b","Type":"ContainerStarted","Data":"3791aa9fdd8ee5b8255c509c99988f621034603e232b3c13e0287a092538c4df"} Jan 31 09:17:41 crc kubenswrapper[4732]: I0131 09:17:41.864747 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"20196d3e-600c-4a25-97ef-86f81bfae43b","Type":"ContainerStarted","Data":"bd9e9b782c4457e3bdc1029eec72a89f862eb48d009287030b3d15352e63d945"} Jan 31 09:17:41 crc kubenswrapper[4732]: I0131 09:17:41.864816 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"20196d3e-600c-4a25-97ef-86f81bfae43b","Type":"ContainerStarted","Data":"aba53546aab560ce992f9ed9d7c70a191eee00bcb246cc44c3bcf9dc6c6a964d"} Jan 31 09:17:41 crc kubenswrapper[4732]: I0131 09:17:41.864885 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"20196d3e-600c-4a25-97ef-86f81bfae43b","Type":"ContainerStarted","Data":"87b04e76193dc458b5e96db200183f64f7cda5171af7bf59a9b3b47ca7c00582"} Jan 31 09:17:41 crc kubenswrapper[4732]: I0131 09:17:41.869637 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" 
event={"ID":"485e2c17-77f1-4b13-ad2a-1afe1034b82e","Type":"ContainerStarted","Data":"29d6915d3be45a209958c78f32e6cbfa5546c65adc3eb566381972012beea587"} Jan 31 09:17:41 crc kubenswrapper[4732]: I0131 09:17:41.869705 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"485e2c17-77f1-4b13-ad2a-1afe1034b82e","Type":"ContainerStarted","Data":"2038dec3545765c898c51198c887beb686d603d6257cd634255dc773c64b0920"} Jan 31 09:17:41 crc kubenswrapper[4732]: I0131 09:17:41.869724 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"485e2c17-77f1-4b13-ad2a-1afe1034b82e","Type":"ContainerStarted","Data":"171dd93dab917d5e3ad3010e206ac5f2aa6f0312273716ac344a73726703fbda"} Jan 31 09:17:41 crc kubenswrapper[4732]: I0131 09:17:41.869739 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"485e2c17-77f1-4b13-ad2a-1afe1034b82e","Type":"ContainerStarted","Data":"1839639d5b9952c70ff36e7c3b972e786bbfb15314211dc0fe15a6700ac4d580"} Jan 31 09:17:41 crc kubenswrapper[4732]: I0131 09:17:41.869750 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"485e2c17-77f1-4b13-ad2a-1afe1034b82e","Type":"ContainerStarted","Data":"17809bd753b75791dd2fa65451f79147382e0f41d6052d606f342c4037233a9a"} Jan 31 09:17:41 crc kubenswrapper[4732]: I0131 09:17:41.875128 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-pdbq2"] Jan 31 09:17:41 crc kubenswrapper[4732]: I0131 09:17:41.876006 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-pdbq2" Jan 31 09:17:41 crc kubenswrapper[4732]: I0131 09:17:41.877692 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Jan 31 09:17:41 crc kubenswrapper[4732]: I0131 09:17:41.878311 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Jan 31 09:17:41 crc kubenswrapper[4732]: I0131 09:17:41.889363 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-pdbq2"] Jan 31 09:17:41 crc kubenswrapper[4732]: I0131 09:17:41.923472 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-storage-2" podStartSLOduration=5.923452954 podStartE2EDuration="5.923452954s" podCreationTimestamp="2026-01-31 09:17:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:17:41.922530576 +0000 UTC m=+1000.228406790" watchObservedRunningTime="2026-01-31 09:17:41.923452954 +0000 UTC m=+1000.229329158" Jan 31 09:17:41 crc kubenswrapper[4732]: I0131 09:17:41.969032 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-storage-1" podStartSLOduration=5.969011742 podStartE2EDuration="5.969011742s" podCreationTimestamp="2026-01-31 09:17:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:17:41.962245681 +0000 UTC m=+1000.268121895" watchObservedRunningTime="2026-01-31 09:17:41.969011742 +0000 UTC m=+1000.274887946" Jan 31 09:17:42 crc kubenswrapper[4732]: I0131 09:17:42.029563 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/78c16da3-2938-49d9-b36d-3d71fe0d48f3-scripts\") pod \"swift-ring-rebalance-pdbq2\" (UID: \"78c16da3-2938-49d9-b36d-3d71fe0d48f3\") " pod="swift-kuttl-tests/swift-ring-rebalance-pdbq2" Jan 31 09:17:42 crc kubenswrapper[4732]: I0131 09:17:42.029625 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/78c16da3-2938-49d9-b36d-3d71fe0d48f3-ring-data-devices\") pod \"swift-ring-rebalance-pdbq2\" (UID: \"78c16da3-2938-49d9-b36d-3d71fe0d48f3\") " pod="swift-kuttl-tests/swift-ring-rebalance-pdbq2" Jan 31 09:17:42 crc kubenswrapper[4732]: I0131 09:17:42.029947 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4ml7\" (UniqueName: \"kubernetes.io/projected/78c16da3-2938-49d9-b36d-3d71fe0d48f3-kube-api-access-j4ml7\") pod \"swift-ring-rebalance-pdbq2\" (UID: \"78c16da3-2938-49d9-b36d-3d71fe0d48f3\") " pod="swift-kuttl-tests/swift-ring-rebalance-pdbq2" Jan 31 09:17:42 crc kubenswrapper[4732]: I0131 09:17:42.030128 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/78c16da3-2938-49d9-b36d-3d71fe0d48f3-swiftconf\") pod \"swift-ring-rebalance-pdbq2\" (UID: \"78c16da3-2938-49d9-b36d-3d71fe0d48f3\") " pod="swift-kuttl-tests/swift-ring-rebalance-pdbq2" Jan 31 09:17:42 crc kubenswrapper[4732]: I0131 09:17:42.030223 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/78c16da3-2938-49d9-b36d-3d71fe0d48f3-etc-swift\") pod \"swift-ring-rebalance-pdbq2\" (UID: \"78c16da3-2938-49d9-b36d-3d71fe0d48f3\") " pod="swift-kuttl-tests/swift-ring-rebalance-pdbq2" Jan 31 09:17:42 crc kubenswrapper[4732]: I0131 09:17:42.030338 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/78c16da3-2938-49d9-b36d-3d71fe0d48f3-dispersionconf\") pod \"swift-ring-rebalance-pdbq2\" (UID: \"78c16da3-2938-49d9-b36d-3d71fe0d48f3\") " pod="swift-kuttl-tests/swift-ring-rebalance-pdbq2" Jan 31 09:17:42 crc kubenswrapper[4732]: I0131 09:17:42.132229 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/78c16da3-2938-49d9-b36d-3d71fe0d48f3-scripts\") pod \"swift-ring-rebalance-pdbq2\" (UID: \"78c16da3-2938-49d9-b36d-3d71fe0d48f3\") " pod="swift-kuttl-tests/swift-ring-rebalance-pdbq2" Jan 31 09:17:42 crc kubenswrapper[4732]: I0131 09:17:42.132285 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/78c16da3-2938-49d9-b36d-3d71fe0d48f3-ring-data-devices\") pod \"swift-ring-rebalance-pdbq2\" (UID: \"78c16da3-2938-49d9-b36d-3d71fe0d48f3\") " pod="swift-kuttl-tests/swift-ring-rebalance-pdbq2" Jan 31 09:17:42 crc kubenswrapper[4732]: I0131 09:17:42.132326 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4ml7\" (UniqueName: \"kubernetes.io/projected/78c16da3-2938-49d9-b36d-3d71fe0d48f3-kube-api-access-j4ml7\") pod \"swift-ring-rebalance-pdbq2\" (UID: \"78c16da3-2938-49d9-b36d-3d71fe0d48f3\") " pod="swift-kuttl-tests/swift-ring-rebalance-pdbq2" Jan 31 09:17:42 crc kubenswrapper[4732]: I0131 09:17:42.132383 4732 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/78c16da3-2938-49d9-b36d-3d71fe0d48f3-swiftconf\") pod \"swift-ring-rebalance-pdbq2\" (UID: \"78c16da3-2938-49d9-b36d-3d71fe0d48f3\") " pod="swift-kuttl-tests/swift-ring-rebalance-pdbq2" Jan 31 09:17:42 crc kubenswrapper[4732]: I0131 09:17:42.132422 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/78c16da3-2938-49d9-b36d-3d71fe0d48f3-etc-swift\") pod \"swift-ring-rebalance-pdbq2\" (UID: \"78c16da3-2938-49d9-b36d-3d71fe0d48f3\") " pod="swift-kuttl-tests/swift-ring-rebalance-pdbq2" Jan 31 09:17:42 crc kubenswrapper[4732]: I0131 09:17:42.132451 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/78c16da3-2938-49d9-b36d-3d71fe0d48f3-dispersionconf\") pod \"swift-ring-rebalance-pdbq2\" (UID: \"78c16da3-2938-49d9-b36d-3d71fe0d48f3\") " pod="swift-kuttl-tests/swift-ring-rebalance-pdbq2" Jan 31 09:17:42 crc kubenswrapper[4732]: I0131 09:17:42.133362 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/78c16da3-2938-49d9-b36d-3d71fe0d48f3-scripts\") pod \"swift-ring-rebalance-pdbq2\" (UID: \"78c16da3-2938-49d9-b36d-3d71fe0d48f3\") " pod="swift-kuttl-tests/swift-ring-rebalance-pdbq2" Jan 31 09:17:42 crc kubenswrapper[4732]: I0131 09:17:42.133645 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/78c16da3-2938-49d9-b36d-3d71fe0d48f3-ring-data-devices\") pod \"swift-ring-rebalance-pdbq2\" (UID: \"78c16da3-2938-49d9-b36d-3d71fe0d48f3\") " pod="swift-kuttl-tests/swift-ring-rebalance-pdbq2" Jan 31 09:17:42 crc kubenswrapper[4732]: I0131 09:17:42.133912 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/78c16da3-2938-49d9-b36d-3d71fe0d48f3-etc-swift\") pod \"swift-ring-rebalance-pdbq2\" (UID: \"78c16da3-2938-49d9-b36d-3d71fe0d48f3\") " pod="swift-kuttl-tests/swift-ring-rebalance-pdbq2" Jan 31 09:17:42 crc kubenswrapper[4732]: I0131 09:17:42.138742 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/78c16da3-2938-49d9-b36d-3d71fe0d48f3-dispersionconf\") pod \"swift-ring-rebalance-pdbq2\" (UID: \"78c16da3-2938-49d9-b36d-3d71fe0d48f3\") " pod="swift-kuttl-tests/swift-ring-rebalance-pdbq2" Jan 31 09:17:42 crc kubenswrapper[4732]: I0131 09:17:42.138756 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/78c16da3-2938-49d9-b36d-3d71fe0d48f3-swiftconf\") pod \"swift-ring-rebalance-pdbq2\" (UID: \"78c16da3-2938-49d9-b36d-3d71fe0d48f3\") " pod="swift-kuttl-tests/swift-ring-rebalance-pdbq2" Jan 31 09:17:42 crc kubenswrapper[4732]: I0131 09:17:42.161851 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4ml7\" (UniqueName: \"kubernetes.io/projected/78c16da3-2938-49d9-b36d-3d71fe0d48f3-kube-api-access-j4ml7\") pod \"swift-ring-rebalance-pdbq2\" (UID: \"78c16da3-2938-49d9-b36d-3d71fe0d48f3\") " pod="swift-kuttl-tests/swift-ring-rebalance-pdbq2" Jan 31 09:17:42 crc kubenswrapper[4732]: I0131 09:17:42.201299 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-pdbq2" Jan 31 09:17:42 crc kubenswrapper[4732]: I0131 09:17:42.406571 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-pdbq2"] Jan 31 09:17:42 crc kubenswrapper[4732]: W0131 09:17:42.409736 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod78c16da3_2938_49d9_b36d_3d71fe0d48f3.slice/crio-5d9c9f7c690ec50a833bec1f866ba679d131888630ee0174c0ab8f67646dad3c WatchSource:0}: Error finding container 5d9c9f7c690ec50a833bec1f866ba679d131888630ee0174c0ab8f67646dad3c: Status 404 returned error can't find the container with id 5d9c9f7c690ec50a833bec1f866ba679d131888630ee0174c0ab8f67646dad3c Jan 31 09:17:42 crc kubenswrapper[4732]: I0131 09:17:42.561894 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98f46fa4-b924-478d-a6d5-6070a4a75aee" path="/var/lib/kubelet/pods/98f46fa4-b924-478d-a6d5-6070a4a75aee/volumes" Jan 31 09:17:42 crc kubenswrapper[4732]: I0131 09:17:42.886074 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-pdbq2" event={"ID":"78c16da3-2938-49d9-b36d-3d71fe0d48f3","Type":"ContainerStarted","Data":"34e2cf0f4161fbb85eb0df7049a2e4f5c156fd2607d987c6bef858acc9fffa46"} Jan 31 09:17:42 crc kubenswrapper[4732]: I0131 09:17:42.886613 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-pdbq2" event={"ID":"78c16da3-2938-49d9-b36d-3d71fe0d48f3","Type":"ContainerStarted","Data":"5d9c9f7c690ec50a833bec1f866ba679d131888630ee0174c0ab8f67646dad3c"} Jan 31 09:17:42 crc kubenswrapper[4732]: I0131 09:17:42.925863 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-pdbq2" podStartSLOduration=1.9258382059999999 podStartE2EDuration="1.925838206s" podCreationTimestamp="2026-01-31 09:17:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:17:42.911420707 +0000 UTC m=+1001.217296961" watchObservedRunningTime="2026-01-31 09:17:42.925838206 +0000 UTC m=+1001.231714450" Jan 31 09:17:50 crc kubenswrapper[4732]: I0131 09:17:50.955173 4732 generic.go:334] "Generic (PLEG): container finished" podID="78c16da3-2938-49d9-b36d-3d71fe0d48f3" containerID="34e2cf0f4161fbb85eb0df7049a2e4f5c156fd2607d987c6bef858acc9fffa46" exitCode=0 Jan 31 09:17:50 crc kubenswrapper[4732]: I0131 09:17:50.955764 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-pdbq2" event={"ID":"78c16da3-2938-49d9-b36d-3d71fe0d48f3","Type":"ContainerDied","Data":"34e2cf0f4161fbb85eb0df7049a2e4f5c156fd2607d987c6bef858acc9fffa46"} Jan 31 09:17:52 crc kubenswrapper[4732]: I0131 09:17:52.292298 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-pdbq2" Jan 31 09:17:52 crc kubenswrapper[4732]: I0131 09:17:52.388791 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/78c16da3-2938-49d9-b36d-3d71fe0d48f3-scripts\") pod \"78c16da3-2938-49d9-b36d-3d71fe0d48f3\" (UID: \"78c16da3-2938-49d9-b36d-3d71fe0d48f3\") " Jan 31 09:17:52 crc kubenswrapper[4732]: I0131 09:17:52.388864 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/78c16da3-2938-49d9-b36d-3d71fe0d48f3-dispersionconf\") pod \"78c16da3-2938-49d9-b36d-3d71fe0d48f3\" (UID: \"78c16da3-2938-49d9-b36d-3d71fe0d48f3\") " Jan 31 09:17:52 crc kubenswrapper[4732]: I0131 09:17:52.388894 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/78c16da3-2938-49d9-b36d-3d71fe0d48f3-ring-data-devices\") pod \"78c16da3-2938-49d9-b36d-3d71fe0d48f3\" (UID: \"78c16da3-2938-49d9-b36d-3d71fe0d48f3\") " Jan 31 09:17:52 crc kubenswrapper[4732]: I0131 09:17:52.388925 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/78c16da3-2938-49d9-b36d-3d71fe0d48f3-etc-swift\") pod \"78c16da3-2938-49d9-b36d-3d71fe0d48f3\" (UID: \"78c16da3-2938-49d9-b36d-3d71fe0d48f3\") " Jan 31 09:17:52 crc kubenswrapper[4732]: I0131 09:17:52.389253 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/78c16da3-2938-49d9-b36d-3d71fe0d48f3-swiftconf\") pod \"78c16da3-2938-49d9-b36d-3d71fe0d48f3\" (UID: \"78c16da3-2938-49d9-b36d-3d71fe0d48f3\") " Jan 31 09:17:52 crc kubenswrapper[4732]: I0131 09:17:52.389285 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j4ml7\" (UniqueName: \"kubernetes.io/projected/78c16da3-2938-49d9-b36d-3d71fe0d48f3-kube-api-access-j4ml7\") pod \"78c16da3-2938-49d9-b36d-3d71fe0d48f3\" (UID: \"78c16da3-2938-49d9-b36d-3d71fe0d48f3\") " Jan 31 09:17:52 crc kubenswrapper[4732]: I0131 09:17:52.390637 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78c16da3-2938-49d9-b36d-3d71fe0d48f3-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "78c16da3-2938-49d9-b36d-3d71fe0d48f3" (UID: "78c16da3-2938-49d9-b36d-3d71fe0d48f3"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:17:52 crc kubenswrapper[4732]: I0131 09:17:52.391649 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78c16da3-2938-49d9-b36d-3d71fe0d48f3-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "78c16da3-2938-49d9-b36d-3d71fe0d48f3" (UID: "78c16da3-2938-49d9-b36d-3d71fe0d48f3"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:17:52 crc kubenswrapper[4732]: I0131 09:17:52.405103 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78c16da3-2938-49d9-b36d-3d71fe0d48f3-kube-api-access-j4ml7" (OuterVolumeSpecName: "kube-api-access-j4ml7") pod "78c16da3-2938-49d9-b36d-3d71fe0d48f3" (UID: "78c16da3-2938-49d9-b36d-3d71fe0d48f3"). InnerVolumeSpecName "kube-api-access-j4ml7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:17:52 crc kubenswrapper[4732]: I0131 09:17:52.417910 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78c16da3-2938-49d9-b36d-3d71fe0d48f3-scripts" (OuterVolumeSpecName: "scripts") pod "78c16da3-2938-49d9-b36d-3d71fe0d48f3" (UID: "78c16da3-2938-49d9-b36d-3d71fe0d48f3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:17:52 crc kubenswrapper[4732]: I0131 09:17:52.423850 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78c16da3-2938-49d9-b36d-3d71fe0d48f3-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "78c16da3-2938-49d9-b36d-3d71fe0d48f3" (UID: "78c16da3-2938-49d9-b36d-3d71fe0d48f3"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:17:52 crc kubenswrapper[4732]: I0131 09:17:52.426219 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78c16da3-2938-49d9-b36d-3d71fe0d48f3-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "78c16da3-2938-49d9-b36d-3d71fe0d48f3" (UID: "78c16da3-2938-49d9-b36d-3d71fe0d48f3"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:17:52 crc kubenswrapper[4732]: I0131 09:17:52.491408 4732 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/78c16da3-2938-49d9-b36d-3d71fe0d48f3-dispersionconf\") on node \"crc\" DevicePath \"\"" Jan 31 09:17:52 crc kubenswrapper[4732]: I0131 09:17:52.491460 4732 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/78c16da3-2938-49d9-b36d-3d71fe0d48f3-ring-data-devices\") on node \"crc\" DevicePath \"\"" Jan 31 09:17:52 crc kubenswrapper[4732]: I0131 09:17:52.491471 4732 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/78c16da3-2938-49d9-b36d-3d71fe0d48f3-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 31 09:17:52 crc kubenswrapper[4732]: I0131 09:17:52.491482 4732 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/78c16da3-2938-49d9-b36d-3d71fe0d48f3-swiftconf\") on node \"crc\" DevicePath \"\"" Jan 31 09:17:52 crc kubenswrapper[4732]: I0131 09:17:52.491493 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j4ml7\" (UniqueName: \"kubernetes.io/projected/78c16da3-2938-49d9-b36d-3d71fe0d48f3-kube-api-access-j4ml7\") on node \"crc\" DevicePath \"\"" Jan 31 09:17:52 crc kubenswrapper[4732]: I0131 09:17:52.491507 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/78c16da3-2938-49d9-b36d-3d71fe0d48f3-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:17:52 crc kubenswrapper[4732]: I0131 09:17:52.978122 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-pdbq2" event={"ID":"78c16da3-2938-49d9-b36d-3d71fe0d48f3","Type":"ContainerDied","Data":"5d9c9f7c690ec50a833bec1f866ba679d131888630ee0174c0ab8f67646dad3c"} Jan 31 09:17:52 crc kubenswrapper[4732]: I0131 09:17:52.978180 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d9c9f7c690ec50a833bec1f866ba679d131888630ee0174c0ab8f67646dad3c" Jan 31 09:17:52 crc kubenswrapper[4732]: I0131 09:17:52.978502 4732 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-pdbq2" Jan 31 09:17:53 crc kubenswrapper[4732]: I0131 09:17:53.220029 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-kqn2b"] Jan 31 09:17:53 crc kubenswrapper[4732]: E0131 09:17:53.220483 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78c16da3-2938-49d9-b36d-3d71fe0d48f3" containerName="swift-ring-rebalance" Jan 31 09:17:53 crc kubenswrapper[4732]: I0131 09:17:53.220510 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="78c16da3-2938-49d9-b36d-3d71fe0d48f3" containerName="swift-ring-rebalance" Jan 31 09:17:53 crc kubenswrapper[4732]: I0131 09:17:53.220779 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="78c16da3-2938-49d9-b36d-3d71fe0d48f3" containerName="swift-ring-rebalance" Jan 31 09:17:53 crc kubenswrapper[4732]: I0131 09:17:53.221544 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-kqn2b" Jan 31 09:17:53 crc kubenswrapper[4732]: I0131 09:17:53.224154 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Jan 31 09:17:53 crc kubenswrapper[4732]: I0131 09:17:53.224239 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Jan 31 09:17:53 crc kubenswrapper[4732]: I0131 09:17:53.229564 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-kqn2b"] Jan 31 09:17:53 crc kubenswrapper[4732]: I0131 09:17:53.302987 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b18c4acb-f5c3-45ef-b69a-a324e8f46803-dispersionconf\") pod \"swift-ring-rebalance-debug-kqn2b\" (UID: \"b18c4acb-f5c3-45ef-b69a-a324e8f46803\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kqn2b" Jan 31 09:17:53 crc kubenswrapper[4732]: I0131 09:17:53.303076 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b18c4acb-f5c3-45ef-b69a-a324e8f46803-swiftconf\") pod \"swift-ring-rebalance-debug-kqn2b\" (UID: \"b18c4acb-f5c3-45ef-b69a-a324e8f46803\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kqn2b" Jan 31 09:17:53 crc kubenswrapper[4732]: I0131 09:17:53.303343 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b18c4acb-f5c3-45ef-b69a-a324e8f46803-etc-swift\") pod \"swift-ring-rebalance-debug-kqn2b\" (UID: \"b18c4acb-f5c3-45ef-b69a-a324e8f46803\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kqn2b" Jan 31 09:17:53 crc kubenswrapper[4732]: I0131 09:17:53.303491 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b18c4acb-f5c3-45ef-b69a-a324e8f46803-ring-data-devices\") pod \"swift-ring-rebalance-debug-kqn2b\" (UID: \"b18c4acb-f5c3-45ef-b69a-a324e8f46803\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kqn2b" Jan 31 09:17:53 crc kubenswrapper[4732]: I0131 09:17:53.303584 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mgb8\" (UniqueName: 
\"kubernetes.io/projected/b18c4acb-f5c3-45ef-b69a-a324e8f46803-kube-api-access-8mgb8\") pod \"swift-ring-rebalance-debug-kqn2b\" (UID: \"b18c4acb-f5c3-45ef-b69a-a324e8f46803\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kqn2b" Jan 31 09:17:53 crc kubenswrapper[4732]: I0131 09:17:53.303649 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b18c4acb-f5c3-45ef-b69a-a324e8f46803-scripts\") pod \"swift-ring-rebalance-debug-kqn2b\" (UID: \"b18c4acb-f5c3-45ef-b69a-a324e8f46803\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kqn2b" Jan 31 09:17:53 crc kubenswrapper[4732]: I0131 09:17:53.405708 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b18c4acb-f5c3-45ef-b69a-a324e8f46803-ring-data-devices\") pod \"swift-ring-rebalance-debug-kqn2b\" (UID: \"b18c4acb-f5c3-45ef-b69a-a324e8f46803\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kqn2b" Jan 31 09:17:53 crc kubenswrapper[4732]: I0131 09:17:53.405762 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mgb8\" (UniqueName: \"kubernetes.io/projected/b18c4acb-f5c3-45ef-b69a-a324e8f46803-kube-api-access-8mgb8\") pod \"swift-ring-rebalance-debug-kqn2b\" (UID: \"b18c4acb-f5c3-45ef-b69a-a324e8f46803\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kqn2b" Jan 31 09:17:53 crc kubenswrapper[4732]: I0131 09:17:53.405789 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b18c4acb-f5c3-45ef-b69a-a324e8f46803-scripts\") pod \"swift-ring-rebalance-debug-kqn2b\" (UID: \"b18c4acb-f5c3-45ef-b69a-a324e8f46803\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kqn2b" Jan 31 09:17:53 crc kubenswrapper[4732]: I0131 09:17:53.405865 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b18c4acb-f5c3-45ef-b69a-a324e8f46803-dispersionconf\") pod \"swift-ring-rebalance-debug-kqn2b\" (UID: \"b18c4acb-f5c3-45ef-b69a-a324e8f46803\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kqn2b" Jan 31 09:17:53 crc kubenswrapper[4732]: I0131 09:17:53.405893 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b18c4acb-f5c3-45ef-b69a-a324e8f46803-swiftconf\") pod \"swift-ring-rebalance-debug-kqn2b\" (UID: \"b18c4acb-f5c3-45ef-b69a-a324e8f46803\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kqn2b" Jan 31 09:17:53 crc kubenswrapper[4732]: I0131 09:17:53.405927 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b18c4acb-f5c3-45ef-b69a-a324e8f46803-etc-swift\") pod \"swift-ring-rebalance-debug-kqn2b\" (UID: \"b18c4acb-f5c3-45ef-b69a-a324e8f46803\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kqn2b" Jan 31 09:17:53 crc kubenswrapper[4732]: I0131 09:17:53.406500 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b18c4acb-f5c3-45ef-b69a-a324e8f46803-etc-swift\") pod \"swift-ring-rebalance-debug-kqn2b\" (UID: \"b18c4acb-f5c3-45ef-b69a-a324e8f46803\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kqn2b" Jan 31 09:17:53 crc kubenswrapper[4732]: I0131 09:17:53.406501 4732 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b18c4acb-f5c3-45ef-b69a-a324e8f46803-scripts\") pod \"swift-ring-rebalance-debug-kqn2b\" (UID: \"b18c4acb-f5c3-45ef-b69a-a324e8f46803\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kqn2b" Jan 31 09:17:53 crc kubenswrapper[4732]: I0131 09:17:53.406774 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b18c4acb-f5c3-45ef-b69a-a324e8f46803-ring-data-devices\") pod \"swift-ring-rebalance-debug-kqn2b\" (UID: \"b18c4acb-f5c3-45ef-b69a-a324e8f46803\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kqn2b" Jan 31 09:17:53 crc kubenswrapper[4732]: I0131 09:17:53.411131 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b18c4acb-f5c3-45ef-b69a-a324e8f46803-dispersionconf\") pod \"swift-ring-rebalance-debug-kqn2b\" (UID: \"b18c4acb-f5c3-45ef-b69a-a324e8f46803\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kqn2b" Jan 31 09:17:53 crc kubenswrapper[4732]: I0131 09:17:53.411955 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b18c4acb-f5c3-45ef-b69a-a324e8f46803-swiftconf\") pod \"swift-ring-rebalance-debug-kqn2b\" (UID: \"b18c4acb-f5c3-45ef-b69a-a324e8f46803\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kqn2b" Jan 31 09:17:53 crc kubenswrapper[4732]: I0131 09:17:53.423190 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mgb8\" (UniqueName: \"kubernetes.io/projected/b18c4acb-f5c3-45ef-b69a-a324e8f46803-kube-api-access-8mgb8\") pod \"swift-ring-rebalance-debug-kqn2b\" (UID: \"b18c4acb-f5c3-45ef-b69a-a324e8f46803\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kqn2b" Jan 31 09:17:53 crc kubenswrapper[4732]: I0131 09:17:53.581493 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-kqn2b" Jan 31 09:17:53 crc kubenswrapper[4732]: I0131 09:17:53.841721 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-kqn2b"] Jan 31 09:17:53 crc kubenswrapper[4732]: I0131 09:17:53.989235 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-kqn2b" event={"ID":"b18c4acb-f5c3-45ef-b69a-a324e8f46803","Type":"ContainerStarted","Data":"28b08967c440d2c1d91bd3c1c47e2882180961c7b75bf5e786868b7ee1372640"} Jan 31 09:17:54 crc kubenswrapper[4732]: I0131 09:17:54.999695 4732 generic.go:334] "Generic (PLEG): container finished" podID="b18c4acb-f5c3-45ef-b69a-a324e8f46803" containerID="3c637fef25b194b9f5bca1b27d2b6a30e6bc6beb823cf5161054f65dd6ad4520" exitCode=0 Jan 31 09:17:55 crc kubenswrapper[4732]: I0131 09:17:54.999791 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-kqn2b" event={"ID":"b18c4acb-f5c3-45ef-b69a-a324e8f46803","Type":"ContainerDied","Data":"3c637fef25b194b9f5bca1b27d2b6a30e6bc6beb823cf5161054f65dd6ad4520"} Jan 31 09:17:55 crc kubenswrapper[4732]: I0131 09:17:55.044237 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-kqn2b"] Jan 31 09:17:55 crc kubenswrapper[4732]: I0131 09:17:55.056021 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-kqn2b"] Jan 31 09:17:56 crc kubenswrapper[4732]: I0131 09:17:56.264908 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-kqn2b" Jan 31 09:17:56 crc kubenswrapper[4732]: I0131 09:17:56.384562 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b18c4acb-f5c3-45ef-b69a-a324e8f46803-ring-data-devices\") pod \"b18c4acb-f5c3-45ef-b69a-a324e8f46803\" (UID: \"b18c4acb-f5c3-45ef-b69a-a324e8f46803\") " Jan 31 09:17:56 crc kubenswrapper[4732]: I0131 09:17:56.384737 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mgb8\" (UniqueName: \"kubernetes.io/projected/b18c4acb-f5c3-45ef-b69a-a324e8f46803-kube-api-access-8mgb8\") pod \"b18c4acb-f5c3-45ef-b69a-a324e8f46803\" (UID: \"b18c4acb-f5c3-45ef-b69a-a324e8f46803\") " Jan 31 09:17:56 crc kubenswrapper[4732]: I0131 09:17:56.384867 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b18c4acb-f5c3-45ef-b69a-a324e8f46803-swiftconf\") pod \"b18c4acb-f5c3-45ef-b69a-a324e8f46803\" (UID: \"b18c4acb-f5c3-45ef-b69a-a324e8f46803\") " Jan 31 09:17:56 crc kubenswrapper[4732]: I0131 09:17:56.384930 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b18c4acb-f5c3-45ef-b69a-a324e8f46803-etc-swift\") pod \"b18c4acb-f5c3-45ef-b69a-a324e8f46803\" (UID: \"b18c4acb-f5c3-45ef-b69a-a324e8f46803\") " Jan 31 09:17:56 crc kubenswrapper[4732]: I0131 09:17:56.385038 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b18c4acb-f5c3-45ef-b69a-a324e8f46803-scripts\") pod \"b18c4acb-f5c3-45ef-b69a-a324e8f46803\" (UID: \"b18c4acb-f5c3-45ef-b69a-a324e8f46803\") " Jan 31 09:17:56 crc kubenswrapper[4732]: I0131 09:17:56.385136 4732 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b18c4acb-f5c3-45ef-b69a-a324e8f46803-dispersionconf\") pod \"b18c4acb-f5c3-45ef-b69a-a324e8f46803\" (UID: \"b18c4acb-f5c3-45ef-b69a-a324e8f46803\") " Jan 31 09:17:56 crc kubenswrapper[4732]: I0131 09:17:56.385448 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b18c4acb-f5c3-45ef-b69a-a324e8f46803-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "b18c4acb-f5c3-45ef-b69a-a324e8f46803" (UID: "b18c4acb-f5c3-45ef-b69a-a324e8f46803"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:17:56 crc kubenswrapper[4732]: I0131 09:17:56.386625 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b18c4acb-f5c3-45ef-b69a-a324e8f46803-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "b18c4acb-f5c3-45ef-b69a-a324e8f46803" (UID: "b18c4acb-f5c3-45ef-b69a-a324e8f46803"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:17:56 crc kubenswrapper[4732]: I0131 09:17:56.392258 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b18c4acb-f5c3-45ef-b69a-a324e8f46803-kube-api-access-8mgb8" (OuterVolumeSpecName: "kube-api-access-8mgb8") pod "b18c4acb-f5c3-45ef-b69a-a324e8f46803" (UID: "b18c4acb-f5c3-45ef-b69a-a324e8f46803"). InnerVolumeSpecName "kube-api-access-8mgb8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:17:56 crc kubenswrapper[4732]: I0131 09:17:56.411776 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b18c4acb-f5c3-45ef-b69a-a324e8f46803-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "b18c4acb-f5c3-45ef-b69a-a324e8f46803" (UID: "b18c4acb-f5c3-45ef-b69a-a324e8f46803"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:17:56 crc kubenswrapper[4732]: I0131 09:17:56.414600 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b18c4acb-f5c3-45ef-b69a-a324e8f46803-scripts" (OuterVolumeSpecName: "scripts") pod "b18c4acb-f5c3-45ef-b69a-a324e8f46803" (UID: "b18c4acb-f5c3-45ef-b69a-a324e8f46803"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:17:56 crc kubenswrapper[4732]: I0131 09:17:56.422792 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b18c4acb-f5c3-45ef-b69a-a324e8f46803-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "b18c4acb-f5c3-45ef-b69a-a324e8f46803" (UID: "b18c4acb-f5c3-45ef-b69a-a324e8f46803"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:17:56 crc kubenswrapper[4732]: I0131 09:17:56.486760 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-2n2dh"] Jan 31 09:17:56 crc kubenswrapper[4732]: E0131 09:17:56.487115 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b18c4acb-f5c3-45ef-b69a-a324e8f46803" containerName="swift-ring-rebalance" Jan 31 09:17:56 crc kubenswrapper[4732]: I0131 09:17:56.487140 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="b18c4acb-f5c3-45ef-b69a-a324e8f46803" containerName="swift-ring-rebalance" Jan 31 09:17:56 crc kubenswrapper[4732]: I0131 09:17:56.487394 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="b18c4acb-f5c3-45ef-b69a-a324e8f46803" containerName="swift-ring-rebalance" Jan 31 09:17:56 crc kubenswrapper[4732]: I0131 09:17:56.487734 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b18c4acb-f5c3-45ef-b69a-a324e8f46803-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:17:56 crc kubenswrapper[4732]: I0131 09:17:56.487775 4732 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b18c4acb-f5c3-45ef-b69a-a324e8f46803-dispersionconf\") on node \"crc\" DevicePath \"\"" Jan 31 09:17:56 crc kubenswrapper[4732]: I0131 09:17:56.487794 4732 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b18c4acb-f5c3-45ef-b69a-a324e8f46803-ring-data-devices\") on node \"crc\" DevicePath \"\"" Jan 31 09:17:56 crc kubenswrapper[4732]: I0131 09:17:56.487807 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8mgb8\" (UniqueName: \"kubernetes.io/projected/b18c4acb-f5c3-45ef-b69a-a324e8f46803-kube-api-access-8mgb8\") on node \"crc\" DevicePath \"\"" Jan 31 09:17:56 crc kubenswrapper[4732]: I0131 09:17:56.487821 4732 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b18c4acb-f5c3-45ef-b69a-a324e8f46803-swiftconf\") on node \"crc\" DevicePath \"\"" Jan 31 09:17:56 crc kubenswrapper[4732]: I0131 09:17:56.487832 4732 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b18c4acb-f5c3-45ef-b69a-a324e8f46803-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 31 09:17:56 crc kubenswrapper[4732]: I0131 09:17:56.487999 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2n2dh" Jan 31 09:17:56 crc kubenswrapper[4732]: I0131 09:17:56.502949 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-2n2dh"] Jan 31 09:17:56 crc kubenswrapper[4732]: I0131 09:17:56.551450 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b18c4acb-f5c3-45ef-b69a-a324e8f46803" path="/var/lib/kubelet/pods/b18c4acb-f5c3-45ef-b69a-a324e8f46803/volumes" Jan 31 09:17:56 crc kubenswrapper[4732]: I0131 09:17:56.589336 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3323fc69-96ba-4767-aca9-a094ee4511fa-scripts\") pod \"swift-ring-rebalance-debug-2n2dh\" (UID: \"3323fc69-96ba-4767-aca9-a094ee4511fa\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2n2dh" Jan 31 09:17:56 crc kubenswrapper[4732]: I0131 09:17:56.589385 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3323fc69-96ba-4767-aca9-a094ee4511fa-dispersionconf\") pod \"swift-ring-rebalance-debug-2n2dh\" (UID: \"3323fc69-96ba-4767-aca9-a094ee4511fa\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2n2dh" Jan 31 09:17:56 crc kubenswrapper[4732]: I0131 09:17:56.589407 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3323fc69-96ba-4767-aca9-a094ee4511fa-ring-data-devices\") pod \"swift-ring-rebalance-debug-2n2dh\" (UID: \"3323fc69-96ba-4767-aca9-a094ee4511fa\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2n2dh" Jan 31 09:17:56 crc kubenswrapper[4732]: I0131 09:17:56.589547 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3323fc69-96ba-4767-aca9-a094ee4511fa-etc-swift\") pod \"swift-ring-rebalance-debug-2n2dh\" (UID: \"3323fc69-96ba-4767-aca9-a094ee4511fa\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2n2dh" Jan 31 09:17:56 crc kubenswrapper[4732]: I0131 09:17:56.589595 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3323fc69-96ba-4767-aca9-a094ee4511fa-swiftconf\") pod \"swift-ring-rebalance-debug-2n2dh\" (UID: \"3323fc69-96ba-4767-aca9-a094ee4511fa\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2n2dh" Jan 31 09:17:56 crc kubenswrapper[4732]: I0131 09:17:56.589635 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zd8h\" (UniqueName: \"kubernetes.io/projected/3323fc69-96ba-4767-aca9-a094ee4511fa-kube-api-access-4zd8h\") pod \"swift-ring-rebalance-debug-2n2dh\" (UID: \"3323fc69-96ba-4767-aca9-a094ee4511fa\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2n2dh" Jan 31 09:17:56 crc kubenswrapper[4732]: I0131 09:17:56.691229 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3323fc69-96ba-4767-aca9-a094ee4511fa-scripts\") pod \"swift-ring-rebalance-debug-2n2dh\" (UID: \"3323fc69-96ba-4767-aca9-a094ee4511fa\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2n2dh" Jan 31 09:17:56 crc kubenswrapper[4732]: I0131 09:17:56.691320 4732 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3323fc69-96ba-4767-aca9-a094ee4511fa-dispersionconf\") pod \"swift-ring-rebalance-debug-2n2dh\" (UID: \"3323fc69-96ba-4767-aca9-a094ee4511fa\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2n2dh" Jan 31 09:17:56 crc kubenswrapper[4732]: I0131 09:17:56.691363 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3323fc69-96ba-4767-aca9-a094ee4511fa-ring-data-devices\") pod \"swift-ring-rebalance-debug-2n2dh\" (UID: \"3323fc69-96ba-4767-aca9-a094ee4511fa\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2n2dh" Jan 31 09:17:56 crc kubenswrapper[4732]: I0131 09:17:56.691469 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3323fc69-96ba-4767-aca9-a094ee4511fa-etc-swift\") pod \"swift-ring-rebalance-debug-2n2dh\" (UID: \"3323fc69-96ba-4767-aca9-a094ee4511fa\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2n2dh" Jan 31 09:17:56 crc kubenswrapper[4732]: I0131 09:17:56.691518 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3323fc69-96ba-4767-aca9-a094ee4511fa-swiftconf\") pod \"swift-ring-rebalance-debug-2n2dh\" (UID: \"3323fc69-96ba-4767-aca9-a094ee4511fa\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2n2dh" Jan 31 09:17:56 crc kubenswrapper[4732]: I0131 09:17:56.691559 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zd8h\" (UniqueName: \"kubernetes.io/projected/3323fc69-96ba-4767-aca9-a094ee4511fa-kube-api-access-4zd8h\") pod \"swift-ring-rebalance-debug-2n2dh\" (UID: \"3323fc69-96ba-4767-aca9-a094ee4511fa\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2n2dh" Jan 31 09:17:56 crc kubenswrapper[4732]: I0131 09:17:56.693018 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3323fc69-96ba-4767-aca9-a094ee4511fa-scripts\") pod \"swift-ring-rebalance-debug-2n2dh\" (UID: \"3323fc69-96ba-4767-aca9-a094ee4511fa\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2n2dh" Jan 31 09:17:56 crc kubenswrapper[4732]: I0131 09:17:56.693533 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3323fc69-96ba-4767-aca9-a094ee4511fa-etc-swift\") pod \"swift-ring-rebalance-debug-2n2dh\" (UID: \"3323fc69-96ba-4767-aca9-a094ee4511fa\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2n2dh" Jan 31 09:17:56 crc kubenswrapper[4732]: I0131 09:17:56.693719 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3323fc69-96ba-4767-aca9-a094ee4511fa-ring-data-devices\") pod \"swift-ring-rebalance-debug-2n2dh\" (UID: \"3323fc69-96ba-4767-aca9-a094ee4511fa\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2n2dh" Jan 31 09:17:56 crc kubenswrapper[4732]: I0131 09:17:56.696147 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3323fc69-96ba-4767-aca9-a094ee4511fa-swiftconf\") pod \"swift-ring-rebalance-debug-2n2dh\" (UID: \"3323fc69-96ba-4767-aca9-a094ee4511fa\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2n2dh" Jan 31 09:17:56 crc kubenswrapper[4732]: I0131 09:17:56.697128 4732 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3323fc69-96ba-4767-aca9-a094ee4511fa-dispersionconf\") pod \"swift-ring-rebalance-debug-2n2dh\" (UID: \"3323fc69-96ba-4767-aca9-a094ee4511fa\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2n2dh" Jan 31 09:17:56 crc kubenswrapper[4732]: I0131 09:17:56.713540 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zd8h\" (UniqueName: \"kubernetes.io/projected/3323fc69-96ba-4767-aca9-a094ee4511fa-kube-api-access-4zd8h\") pod \"swift-ring-rebalance-debug-2n2dh\" (UID: \"3323fc69-96ba-4767-aca9-a094ee4511fa\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2n2dh" Jan 31 09:17:56 crc kubenswrapper[4732]: I0131 09:17:56.812989 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2n2dh" Jan 31 09:17:57 crc kubenswrapper[4732]: I0131 09:17:57.023731 4732 scope.go:117] "RemoveContainer" containerID="3c637fef25b194b9f5bca1b27d2b6a30e6bc6beb823cf5161054f65dd6ad4520" Jan 31 09:17:57 crc kubenswrapper[4732]: I0131 09:17:57.023764 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-kqn2b" Jan 31 09:17:57 crc kubenswrapper[4732]: I0131 09:17:57.277827 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-2n2dh"] Jan 31 09:17:58 crc kubenswrapper[4732]: I0131 09:17:58.034910 4732 generic.go:334] "Generic (PLEG): container finished" podID="3323fc69-96ba-4767-aca9-a094ee4511fa" containerID="65b84d4ebc25368a071a65e4d20d0e365ba1b7010ab100f5e787fd36a34406cf" exitCode=0 Jan 31 09:17:58 crc kubenswrapper[4732]: I0131 09:17:58.035031 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2n2dh" event={"ID":"3323fc69-96ba-4767-aca9-a094ee4511fa","Type":"ContainerDied","Data":"65b84d4ebc25368a071a65e4d20d0e365ba1b7010ab100f5e787fd36a34406cf"} Jan 31 09:17:58 crc kubenswrapper[4732]: I0131 09:17:58.035474 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2n2dh" event={"ID":"3323fc69-96ba-4767-aca9-a094ee4511fa","Type":"ContainerStarted","Data":"7b29da91d53315893aeacee191e80a9a9ee11a976647a732e826f2cf163e9590"} Jan 31 09:17:58 crc kubenswrapper[4732]: I0131 09:17:58.081381 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-2n2dh"] Jan 31 09:17:58 crc kubenswrapper[4732]: I0131 09:17:58.089506 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-2n2dh"] Jan 31 09:17:58 crc kubenswrapper[4732]: I0131 09:17:58.180432 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-pdbq2"] Jan 31 09:17:58 crc kubenswrapper[4732]: I0131 09:17:58.194390 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-pdbq2"] Jan 31 09:17:58 crc kubenswrapper[4732]: I0131 09:17:58.205817 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-storage-1"] Jan 31 09:17:58 crc kubenswrapper[4732]: I0131 09:17:58.206530 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-1" podUID="485e2c17-77f1-4b13-ad2a-1afe1034b82e" containerName="object-expirer" 
containerID="cri-o://171dd93dab917d5e3ad3010e206ac5f2aa6f0312273716ac344a73726703fbda" gracePeriod=30 Jan 31 09:17:58 crc kubenswrapper[4732]: I0131 09:17:58.206555 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-1" podUID="485e2c17-77f1-4b13-ad2a-1afe1034b82e" containerName="object-server" containerID="cri-o://f59d6e8e79e5f65eaab0514cd4a4f2da1508f44198f23637971d0e31f2b0f2be" gracePeriod=30 Jan 31 09:17:58 crc kubenswrapper[4732]: I0131 09:17:58.206579 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-1" podUID="485e2c17-77f1-4b13-ad2a-1afe1034b82e" containerName="object-auditor" containerID="cri-o://17809bd753b75791dd2fa65451f79147382e0f41d6052d606f342c4037233a9a" gracePeriod=30 Jan 31 09:17:58 crc kubenswrapper[4732]: I0131 09:17:58.206609 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-1" podUID="485e2c17-77f1-4b13-ad2a-1afe1034b82e" containerName="container-server" containerID="cri-o://0630d2ca7e09803aac756e9568f7669b86f328b502c8fbec24c1ab333009da4d" gracePeriod=30 Jan 31 09:17:58 crc kubenswrapper[4732]: I0131 09:17:58.206699 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-1" podUID="485e2c17-77f1-4b13-ad2a-1afe1034b82e" containerName="object-updater" containerID="cri-o://1839639d5b9952c70ff36e7c3b972e786bbfb15314211dc0fe15a6700ac4d580" gracePeriod=30 Jan 31 09:17:58 crc kubenswrapper[4732]: I0131 09:17:58.206723 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-1" podUID="485e2c17-77f1-4b13-ad2a-1afe1034b82e" containerName="object-replicator" containerID="cri-o://0939cdb203f97f24dcd9e0d0769010bdb9148423fa723be6da835b8c3fcd94a3" gracePeriod=30 Jan 31 09:17:58 crc kubenswrapper[4732]: I0131 09:17:58.206759 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-1" podUID="485e2c17-77f1-4b13-ad2a-1afe1034b82e" containerName="container-updater" containerID="cri-o://6c6f43f8261cf2448edf43a43564636ba04ed0b128e0df16f2584d3972a973a9" gracePeriod=30 Jan 31 09:17:58 crc kubenswrapper[4732]: I0131 09:17:58.206802 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-1" podUID="485e2c17-77f1-4b13-ad2a-1afe1034b82e" containerName="swift-recon-cron" containerID="cri-o://29d6915d3be45a209958c78f32e6cbfa5546c65adc3eb566381972012beea587" gracePeriod=30 Jan 31 09:17:58 crc kubenswrapper[4732]: I0131 09:17:58.206814 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-1" podUID="485e2c17-77f1-4b13-ad2a-1afe1034b82e" containerName="container-auditor" containerID="cri-o://b26aaf7029934891de5d0e92f8463ca1f8e106c401d898590351656c24bbc6e8" gracePeriod=30 Jan 31 09:17:58 crc kubenswrapper[4732]: I0131 09:17:58.206827 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-1" podUID="485e2c17-77f1-4b13-ad2a-1afe1034b82e" containerName="container-replicator" containerID="cri-o://ecfb0c165d8f537d03e0e64cf12e24a1575f070da3670ea4769b182ee59dcc20" gracePeriod=30 Jan 31 09:17:58 crc kubenswrapper[4732]: I0131 09:17:58.206859 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-1" 
podUID="485e2c17-77f1-4b13-ad2a-1afe1034b82e" containerName="rsync" containerID="cri-o://2038dec3545765c898c51198c887beb686d603d6257cd634255dc773c64b0920" gracePeriod=30 Jan 31 09:17:58 crc kubenswrapper[4732]: I0131 09:17:58.206869 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-1" podUID="485e2c17-77f1-4b13-ad2a-1afe1034b82e" containerName="account-auditor" containerID="cri-o://069a866bbc2edc4e0416ac1351e656e79b93160d557be633a57aa05933c9d37a" gracePeriod=30 Jan 31 09:17:58 crc kubenswrapper[4732]: I0131 09:17:58.206880 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-1" podUID="485e2c17-77f1-4b13-ad2a-1afe1034b82e" containerName="account-reaper" containerID="cri-o://f72f8dedea2940a6aab6b6081a6ac32f4fe8a817a636c7292b3740e840aacaf8" gracePeriod=30 Jan 31 09:17:58 crc kubenswrapper[4732]: I0131 09:17:58.206891 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-1" podUID="485e2c17-77f1-4b13-ad2a-1afe1034b82e" containerName="account-replicator" containerID="cri-o://0eb6a32653dab2879fd0e39ad42f712d86e6b8054d3507176f2d9dfda652811b" gracePeriod=30 Jan 31 09:17:58 crc kubenswrapper[4732]: I0131 09:17:58.206931 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-1" podUID="485e2c17-77f1-4b13-ad2a-1afe1034b82e" containerName="account-server" containerID="cri-o://ef329846cba5f043851b0fac85462ce13c21ac2e48891f331c0b61745c33b6de" gracePeriod=30 Jan 31 09:17:58 crc kubenswrapper[4732]: I0131 09:17:58.243712 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-storage-2"] Jan 31 09:17:58 crc kubenswrapper[4732]: I0131 09:17:58.244223 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-2" podUID="20196d3e-600c-4a25-97ef-86f81bfae43b" containerName="account-server" containerID="cri-o://f47b66d5b1e0a89ad9b955fbf6dde3f0b21e522e86e33f379a1e2379b7919989" gracePeriod=30 Jan 31 09:17:58 crc kubenswrapper[4732]: I0131 09:17:58.244289 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-2" podUID="20196d3e-600c-4a25-97ef-86f81bfae43b" containerName="object-server" containerID="cri-o://95aedeb15c9b618726f462f7211b9b97027622f3a2b198dc9717394e81868db5" gracePeriod=30 Jan 31 09:17:58 crc kubenswrapper[4732]: I0131 09:17:58.244357 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-2" podUID="20196d3e-600c-4a25-97ef-86f81bfae43b" containerName="container-updater" containerID="cri-o://dcca573b69212cf173983d6090be0a12978f5decb7c01cbd3779f1d4d2a6b0d0" gracePeriod=30 Jan 31 09:17:58 crc kubenswrapper[4732]: I0131 09:17:58.244413 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-2" podUID="20196d3e-600c-4a25-97ef-86f81bfae43b" containerName="container-auditor" containerID="cri-o://de878ffff4963739e745def44cb56f453584bcd3f75e8cb3d9b85f0ca8d4e512" gracePeriod=30 Jan 31 09:17:58 crc kubenswrapper[4732]: I0131 09:17:58.244462 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-2" podUID="20196d3e-600c-4a25-97ef-86f81bfae43b" containerName="container-replicator" containerID="cri-o://750de579c99e37a98c19df2a11fee436a72f515c72747a3e91a0abc97cd07385" 
gracePeriod=30 Jan 31 09:17:58 crc kubenswrapper[4732]: I0131 09:17:58.244502 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-2" podUID="20196d3e-600c-4a25-97ef-86f81bfae43b" containerName="container-server" containerID="cri-o://3ca9c58e45467d768a6b9a7ce40fa6bc0ba4cec3710eae69487dc0106ba42c67" gracePeriod=30 Jan 31 09:17:58 crc kubenswrapper[4732]: I0131 09:17:58.244536 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-2" podUID="20196d3e-600c-4a25-97ef-86f81bfae43b" containerName="account-reaper" containerID="cri-o://f6d112eeaf369a7cae9a04806380ef040d7989006ab740a5b0d945ef5f7f317b" gracePeriod=30 Jan 31 09:17:58 crc kubenswrapper[4732]: I0131 09:17:58.244573 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-2" podUID="20196d3e-600c-4a25-97ef-86f81bfae43b" containerName="account-auditor" containerID="cri-o://5d118436256aaad4b93c230d8fa0456e8cc6f3c5ca6ce4c9d7f30f8db3c450ff" gracePeriod=30 Jan 31 09:17:58 crc kubenswrapper[4732]: I0131 09:17:58.244606 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-2" podUID="20196d3e-600c-4a25-97ef-86f81bfae43b" containerName="account-replicator" containerID="cri-o://bcb9d265c1a8c03fc26ba3b2f05fa7fcceb6cc59b711a2e0dd620cf846db5469" gracePeriod=30 Jan 31 09:17:58 crc kubenswrapper[4732]: I0131 09:17:58.244752 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-2" podUID="20196d3e-600c-4a25-97ef-86f81bfae43b" containerName="object-expirer" containerID="cri-o://bd9e9b782c4457e3bdc1029eec72a89f862eb48d009287030b3d15352e63d945" gracePeriod=30 Jan 31 09:17:58 crc kubenswrapper[4732]: I0131 09:17:58.244797 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-2" podUID="20196d3e-600c-4a25-97ef-86f81bfae43b" containerName="object-updater" containerID="cri-o://aba53546aab560ce992f9ed9d7c70a191eee00bcb246cc44c3bcf9dc6c6a964d" gracePeriod=30 Jan 31 09:17:58 crc kubenswrapper[4732]: I0131 09:17:58.244810 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-2" podUID="20196d3e-600c-4a25-97ef-86f81bfae43b" containerName="swift-recon-cron" containerID="cri-o://07ba04fcdc461974d0c6b09ac9da6b0c6610f36749f50f5eb47e063be68f5291" gracePeriod=30 Jan 31 09:17:58 crc kubenswrapper[4732]: I0131 09:17:58.244852 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-2" podUID="20196d3e-600c-4a25-97ef-86f81bfae43b" containerName="rsync" containerID="cri-o://3791aa9fdd8ee5b8255c509c99988f621034603e232b3c13e0287a092538c4df" gracePeriod=30 Jan 31 09:17:58 crc kubenswrapper[4732]: I0131 09:17:58.244875 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-2" podUID="20196d3e-600c-4a25-97ef-86f81bfae43b" containerName="object-auditor" containerID="cri-o://87b04e76193dc458b5e96db200183f64f7cda5171af7bf59a9b3b47ca7c00582" gracePeriod=30 Jan 31 09:17:58 crc kubenswrapper[4732]: I0131 09:17:58.244932 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-2" podUID="20196d3e-600c-4a25-97ef-86f81bfae43b" containerName="object-replicator" 
containerID="cri-o://23a3dbdcb5e8c0d9b5e2b66e270ff60b28361fccb19ac39e68496d454544c773" gracePeriod=30 Jan 31 09:17:58 crc kubenswrapper[4732]: I0131 09:17:58.264741 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-storage-0"] Jan 31 09:17:58 crc kubenswrapper[4732]: I0131 09:17:58.265298 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="410ee08c-4c6c-4012-aa46-264179923617" containerName="account-server" containerID="cri-o://61cb3b1d16a1d1d363ee3738835e89a0f2f6d3986bf10918c00e484f7f8c9ef7" gracePeriod=30 Jan 31 09:17:58 crc kubenswrapper[4732]: I0131 09:17:58.265751 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="410ee08c-4c6c-4012-aa46-264179923617" containerName="swift-recon-cron" containerID="cri-o://81f782a2d54156b4990bcbb2dab091aeb970934a10ac85e7abc67c813255ffee" gracePeriod=30 Jan 31 09:17:58 crc kubenswrapper[4732]: I0131 09:17:58.265807 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="410ee08c-4c6c-4012-aa46-264179923617" containerName="rsync" containerID="cri-o://db1b9074ef0795f714675af65f50c4b87204369e4a7ad86e330547b5ab8f05fb" gracePeriod=30 Jan 31 09:17:58 crc kubenswrapper[4732]: I0131 09:17:58.265840 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="410ee08c-4c6c-4012-aa46-264179923617" containerName="object-expirer" containerID="cri-o://7f3c24b1866708a685d278257ae506a6fd79ff0289067bb9f41d1ad991336cc6" gracePeriod=30 Jan 31 09:17:58 crc kubenswrapper[4732]: I0131 09:17:58.265870 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="410ee08c-4c6c-4012-aa46-264179923617" containerName="object-updater" containerID="cri-o://5e6d65fa1c26ce7ebf7ff25c277b074ca08845044a94dbdd6b32ff300ac68b7e" gracePeriod=30 Jan 31 09:17:58 crc kubenswrapper[4732]: I0131 09:17:58.265896 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="410ee08c-4c6c-4012-aa46-264179923617" containerName="object-auditor" containerID="cri-o://b045d34930a290c72208028dc6d1d6b3535e4b8818d51998647df06b3f7c2bdc" gracePeriod=30 Jan 31 09:17:58 crc kubenswrapper[4732]: I0131 09:17:58.265929 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="410ee08c-4c6c-4012-aa46-264179923617" containerName="object-replicator" containerID="cri-o://288678dcae1157f1c7ebe15c590533bc3accdd196dbe20808122f33180c2092e" gracePeriod=30 Jan 31 09:17:58 crc kubenswrapper[4732]: I0131 09:17:58.265970 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="410ee08c-4c6c-4012-aa46-264179923617" containerName="object-server" containerID="cri-o://7907616b4b6ec30fa97533949de2bee3dfcee1ddac0b222cb83f6c75835a1755" gracePeriod=30 Jan 31 09:17:58 crc kubenswrapper[4732]: I0131 09:17:58.266008 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="410ee08c-4c6c-4012-aa46-264179923617" containerName="container-updater" containerID="cri-o://20b5017b8e043011a79ad2661ea73a50a94d52bc115728a4ab154b985c7430df" gracePeriod=30 Jan 31 09:17:58 crc kubenswrapper[4732]: I0131 09:17:58.266038 4732 
kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="410ee08c-4c6c-4012-aa46-264179923617" containerName="container-auditor" containerID="cri-o://73d8c40ae242fa0366cc532a47eefd15e47b7f5aeeb79eb1c8113e496beaf6e2" gracePeriod=30 Jan 31 09:17:58 crc kubenswrapper[4732]: I0131 09:17:58.266065 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="410ee08c-4c6c-4012-aa46-264179923617" containerName="container-replicator" containerID="cri-o://2ab9eec16ab6fbc15e432a8ef644dbb519f0a4dd256590fb6f9676ffceba4611" gracePeriod=30 Jan 31 09:17:58 crc kubenswrapper[4732]: I0131 09:17:58.266094 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="410ee08c-4c6c-4012-aa46-264179923617" containerName="container-server" containerID="cri-o://6a55e95bf79b05700a43bb7800d35ec7f862e81feb1caac0721521392f5f8e7f" gracePeriod=30 Jan 31 09:17:58 crc kubenswrapper[4732]: I0131 09:17:58.266121 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="410ee08c-4c6c-4012-aa46-264179923617" containerName="account-reaper" containerID="cri-o://7f58b9ade96eae3a4c9b405f187c769232309e0c1fb241bf5444c4c304ab649f" gracePeriod=30 Jan 31 09:17:58 crc kubenswrapper[4732]: I0131 09:17:58.266150 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="410ee08c-4c6c-4012-aa46-264179923617" containerName="account-auditor" containerID="cri-o://dbfd2c11334695f732524b533be449dd61ebf7d7637f32b3bcfe1d1941b37863" gracePeriod=30 Jan 31 09:17:58 crc kubenswrapper[4732]: I0131 09:17:58.266178 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="410ee08c-4c6c-4012-aa46-264179923617" containerName="account-replicator" containerID="cri-o://b63fa3de54dc547aaba5e3c717d85f707b81c7144e50e21b9e827db634951724" gracePeriod=30 Jan 31 09:17:58 crc kubenswrapper[4732]: I0131 09:17:58.295339 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-proxy-7d8cf99555-f2jx8"] Jan 31 09:17:58 crc kubenswrapper[4732]: I0131 09:17:58.295558 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-proxy-7d8cf99555-f2jx8" podUID="9e00e76d-2f89-454c-be3b-855e8186c78e" containerName="proxy-httpd" containerID="cri-o://5603cb6f58751f09a186156f4f600f7ee6585d6eeac0b6ca57c68aa2fa8a4f68" gracePeriod=30 Jan 31 09:17:58 crc kubenswrapper[4732]: I0131 09:17:58.297160 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-proxy-7d8cf99555-f2jx8" podUID="9e00e76d-2f89-454c-be3b-855e8186c78e" containerName="proxy-server" containerID="cri-o://9fc5b88fc650585c4cf7595095cbcdfc58d2b17c1bb6d6f6621c5213c78ab1f0" gracePeriod=30 Jan 31 09:17:58 crc kubenswrapper[4732]: I0131 09:17:58.553384 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78c16da3-2938-49d9-b36d-3d71fe0d48f3" path="/var/lib/kubelet/pods/78c16da3-2938-49d9-b36d-3d71fe0d48f3/volumes" Jan 31 09:17:59 crc kubenswrapper[4732]: I0131 09:17:59.053373 4732 generic.go:334] "Generic (PLEG): container finished" podID="20196d3e-600c-4a25-97ef-86f81bfae43b" containerID="bd9e9b782c4457e3bdc1029eec72a89f862eb48d009287030b3d15352e63d945" exitCode=0 Jan 31 09:17:59 crc kubenswrapper[4732]: I0131 
09:17:59.053409 4732 generic.go:334] "Generic (PLEG): container finished" podID="20196d3e-600c-4a25-97ef-86f81bfae43b" containerID="aba53546aab560ce992f9ed9d7c70a191eee00bcb246cc44c3bcf9dc6c6a964d" exitCode=0 Jan 31 09:17:59 crc kubenswrapper[4732]: I0131 09:17:59.053421 4732 generic.go:334] "Generic (PLEG): container finished" podID="20196d3e-600c-4a25-97ef-86f81bfae43b" containerID="87b04e76193dc458b5e96db200183f64f7cda5171af7bf59a9b3b47ca7c00582" exitCode=0 Jan 31 09:17:59 crc kubenswrapper[4732]: I0131 09:17:59.053431 4732 generic.go:334] "Generic (PLEG): container finished" podID="20196d3e-600c-4a25-97ef-86f81bfae43b" containerID="23a3dbdcb5e8c0d9b5e2b66e270ff60b28361fccb19ac39e68496d454544c773" exitCode=0 Jan 31 09:17:59 crc kubenswrapper[4732]: I0131 09:17:59.053439 4732 generic.go:334] "Generic (PLEG): container finished" podID="20196d3e-600c-4a25-97ef-86f81bfae43b" containerID="dcca573b69212cf173983d6090be0a12978f5decb7c01cbd3779f1d4d2a6b0d0" exitCode=0 Jan 31 09:17:59 crc kubenswrapper[4732]: I0131 09:17:59.053448 4732 generic.go:334] "Generic (PLEG): container finished" podID="20196d3e-600c-4a25-97ef-86f81bfae43b" containerID="de878ffff4963739e745def44cb56f453584bcd3f75e8cb3d9b85f0ca8d4e512" exitCode=0 Jan 31 09:17:59 crc kubenswrapper[4732]: I0131 09:17:59.053460 4732 generic.go:334] "Generic (PLEG): container finished" podID="20196d3e-600c-4a25-97ef-86f81bfae43b" containerID="750de579c99e37a98c19df2a11fee436a72f515c72747a3e91a0abc97cd07385" exitCode=0 Jan 31 09:17:59 crc kubenswrapper[4732]: I0131 09:17:59.053470 4732 generic.go:334] "Generic (PLEG): container finished" podID="20196d3e-600c-4a25-97ef-86f81bfae43b" containerID="f6d112eeaf369a7cae9a04806380ef040d7989006ab740a5b0d945ef5f7f317b" exitCode=0 Jan 31 09:17:59 crc kubenswrapper[4732]: I0131 09:17:59.053479 4732 generic.go:334] "Generic (PLEG): container finished" podID="20196d3e-600c-4a25-97ef-86f81bfae43b" containerID="5d118436256aaad4b93c230d8fa0456e8cc6f3c5ca6ce4c9d7f30f8db3c450ff" exitCode=0 Jan 31 09:17:59 crc kubenswrapper[4732]: I0131 09:17:59.053487 4732 generic.go:334] "Generic (PLEG): container finished" podID="20196d3e-600c-4a25-97ef-86f81bfae43b" containerID="bcb9d265c1a8c03fc26ba3b2f05fa7fcceb6cc59b711a2e0dd620cf846db5469" exitCode=0 Jan 31 09:17:59 crc kubenswrapper[4732]: I0131 09:17:59.053497 4732 generic.go:334] "Generic (PLEG): container finished" podID="20196d3e-600c-4a25-97ef-86f81bfae43b" containerID="f47b66d5b1e0a89ad9b955fbf6dde3f0b21e522e86e33f379a1e2379b7919989" exitCode=0 Jan 31 09:17:59 crc kubenswrapper[4732]: I0131 09:17:59.053453 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"20196d3e-600c-4a25-97ef-86f81bfae43b","Type":"ContainerDied","Data":"bd9e9b782c4457e3bdc1029eec72a89f862eb48d009287030b3d15352e63d945"} Jan 31 09:17:59 crc kubenswrapper[4732]: I0131 09:17:59.053568 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"20196d3e-600c-4a25-97ef-86f81bfae43b","Type":"ContainerDied","Data":"aba53546aab560ce992f9ed9d7c70a191eee00bcb246cc44c3bcf9dc6c6a964d"} Jan 31 09:17:59 crc kubenswrapper[4732]: I0131 09:17:59.053584 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"20196d3e-600c-4a25-97ef-86f81bfae43b","Type":"ContainerDied","Data":"87b04e76193dc458b5e96db200183f64f7cda5171af7bf59a9b3b47ca7c00582"} Jan 31 09:17:59 crc kubenswrapper[4732]: I0131 09:17:59.053598 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="swift-kuttl-tests/swift-storage-2" event={"ID":"20196d3e-600c-4a25-97ef-86f81bfae43b","Type":"ContainerDied","Data":"23a3dbdcb5e8c0d9b5e2b66e270ff60b28361fccb19ac39e68496d454544c773"} Jan 31 09:17:59 crc kubenswrapper[4732]: I0131 09:17:59.053610 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"20196d3e-600c-4a25-97ef-86f81bfae43b","Type":"ContainerDied","Data":"dcca573b69212cf173983d6090be0a12978f5decb7c01cbd3779f1d4d2a6b0d0"} Jan 31 09:17:59 crc kubenswrapper[4732]: I0131 09:17:59.053620 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"20196d3e-600c-4a25-97ef-86f81bfae43b","Type":"ContainerDied","Data":"de878ffff4963739e745def44cb56f453584bcd3f75e8cb3d9b85f0ca8d4e512"} Jan 31 09:17:59 crc kubenswrapper[4732]: I0131 09:17:59.053630 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"20196d3e-600c-4a25-97ef-86f81bfae43b","Type":"ContainerDied","Data":"750de579c99e37a98c19df2a11fee436a72f515c72747a3e91a0abc97cd07385"} Jan 31 09:17:59 crc kubenswrapper[4732]: I0131 09:17:59.053641 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"20196d3e-600c-4a25-97ef-86f81bfae43b","Type":"ContainerDied","Data":"f6d112eeaf369a7cae9a04806380ef040d7989006ab740a5b0d945ef5f7f317b"} Jan 31 09:17:59 crc kubenswrapper[4732]: I0131 09:17:59.053651 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"20196d3e-600c-4a25-97ef-86f81bfae43b","Type":"ContainerDied","Data":"5d118436256aaad4b93c230d8fa0456e8cc6f3c5ca6ce4c9d7f30f8db3c450ff"} Jan 31 09:17:59 crc kubenswrapper[4732]: I0131 09:17:59.053675 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"20196d3e-600c-4a25-97ef-86f81bfae43b","Type":"ContainerDied","Data":"bcb9d265c1a8c03fc26ba3b2f05fa7fcceb6cc59b711a2e0dd620cf846db5469"} Jan 31 09:17:59 crc kubenswrapper[4732]: I0131 09:17:59.053687 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"20196d3e-600c-4a25-97ef-86f81bfae43b","Type":"ContainerDied","Data":"f47b66d5b1e0a89ad9b955fbf6dde3f0b21e522e86e33f379a1e2379b7919989"} Jan 31 09:17:59 crc kubenswrapper[4732]: I0131 09:17:59.061006 4732 generic.go:334] "Generic (PLEG): container finished" podID="485e2c17-77f1-4b13-ad2a-1afe1034b82e" containerID="171dd93dab917d5e3ad3010e206ac5f2aa6f0312273716ac344a73726703fbda" exitCode=0 Jan 31 09:17:59 crc kubenswrapper[4732]: I0131 09:17:59.061055 4732 generic.go:334] "Generic (PLEG): container finished" podID="485e2c17-77f1-4b13-ad2a-1afe1034b82e" containerID="1839639d5b9952c70ff36e7c3b972e786bbfb15314211dc0fe15a6700ac4d580" exitCode=0 Jan 31 09:17:59 crc kubenswrapper[4732]: I0131 09:17:59.061070 4732 generic.go:334] "Generic (PLEG): container finished" podID="485e2c17-77f1-4b13-ad2a-1afe1034b82e" containerID="17809bd753b75791dd2fa65451f79147382e0f41d6052d606f342c4037233a9a" exitCode=0 Jan 31 09:17:59 crc kubenswrapper[4732]: I0131 09:17:59.061084 4732 generic.go:334] "Generic (PLEG): container finished" podID="485e2c17-77f1-4b13-ad2a-1afe1034b82e" containerID="0939cdb203f97f24dcd9e0d0769010bdb9148423fa723be6da835b8c3fcd94a3" exitCode=0 Jan 31 09:17:59 crc kubenswrapper[4732]: I0131 09:17:59.061096 4732 generic.go:334] "Generic (PLEG): container finished" podID="485e2c17-77f1-4b13-ad2a-1afe1034b82e" 
containerID="6c6f43f8261cf2448edf43a43564636ba04ed0b128e0df16f2584d3972a973a9" exitCode=0 Jan 31 09:17:59 crc kubenswrapper[4732]: I0131 09:17:59.061108 4732 generic.go:334] "Generic (PLEG): container finished" podID="485e2c17-77f1-4b13-ad2a-1afe1034b82e" containerID="b26aaf7029934891de5d0e92f8463ca1f8e106c401d898590351656c24bbc6e8" exitCode=0 Jan 31 09:17:59 crc kubenswrapper[4732]: I0131 09:17:59.061121 4732 generic.go:334] "Generic (PLEG): container finished" podID="485e2c17-77f1-4b13-ad2a-1afe1034b82e" containerID="ecfb0c165d8f537d03e0e64cf12e24a1575f070da3670ea4769b182ee59dcc20" exitCode=0 Jan 31 09:17:59 crc kubenswrapper[4732]: I0131 09:17:59.061134 4732 generic.go:334] "Generic (PLEG): container finished" podID="485e2c17-77f1-4b13-ad2a-1afe1034b82e" containerID="0630d2ca7e09803aac756e9568f7669b86f328b502c8fbec24c1ab333009da4d" exitCode=0 Jan 31 09:17:59 crc kubenswrapper[4732]: I0131 09:17:59.061146 4732 generic.go:334] "Generic (PLEG): container finished" podID="485e2c17-77f1-4b13-ad2a-1afe1034b82e" containerID="f72f8dedea2940a6aab6b6081a6ac32f4fe8a817a636c7292b3740e840aacaf8" exitCode=0 Jan 31 09:17:59 crc kubenswrapper[4732]: I0131 09:17:59.061158 4732 generic.go:334] "Generic (PLEG): container finished" podID="485e2c17-77f1-4b13-ad2a-1afe1034b82e" containerID="069a866bbc2edc4e0416ac1351e656e79b93160d557be633a57aa05933c9d37a" exitCode=0 Jan 31 09:17:59 crc kubenswrapper[4732]: I0131 09:17:59.061170 4732 generic.go:334] "Generic (PLEG): container finished" podID="485e2c17-77f1-4b13-ad2a-1afe1034b82e" containerID="0eb6a32653dab2879fd0e39ad42f712d86e6b8054d3507176f2d9dfda652811b" exitCode=0 Jan 31 09:17:59 crc kubenswrapper[4732]: I0131 09:17:59.061234 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"485e2c17-77f1-4b13-ad2a-1afe1034b82e","Type":"ContainerDied","Data":"171dd93dab917d5e3ad3010e206ac5f2aa6f0312273716ac344a73726703fbda"} Jan 31 09:17:59 crc kubenswrapper[4732]: I0131 09:17:59.061272 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"485e2c17-77f1-4b13-ad2a-1afe1034b82e","Type":"ContainerDied","Data":"1839639d5b9952c70ff36e7c3b972e786bbfb15314211dc0fe15a6700ac4d580"} Jan 31 09:17:59 crc kubenswrapper[4732]: I0131 09:17:59.061292 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"485e2c17-77f1-4b13-ad2a-1afe1034b82e","Type":"ContainerDied","Data":"17809bd753b75791dd2fa65451f79147382e0f41d6052d606f342c4037233a9a"} Jan 31 09:17:59 crc kubenswrapper[4732]: I0131 09:17:59.061309 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"485e2c17-77f1-4b13-ad2a-1afe1034b82e","Type":"ContainerDied","Data":"0939cdb203f97f24dcd9e0d0769010bdb9148423fa723be6da835b8c3fcd94a3"} Jan 31 09:17:59 crc kubenswrapper[4732]: I0131 09:17:59.061327 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"485e2c17-77f1-4b13-ad2a-1afe1034b82e","Type":"ContainerDied","Data":"6c6f43f8261cf2448edf43a43564636ba04ed0b128e0df16f2584d3972a973a9"} Jan 31 09:17:59 crc kubenswrapper[4732]: I0131 09:17:59.061346 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"485e2c17-77f1-4b13-ad2a-1afe1034b82e","Type":"ContainerDied","Data":"b26aaf7029934891de5d0e92f8463ca1f8e106c401d898590351656c24bbc6e8"} Jan 31 09:17:59 crc kubenswrapper[4732]: I0131 09:17:59.061363 4732 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"485e2c17-77f1-4b13-ad2a-1afe1034b82e","Type":"ContainerDied","Data":"ecfb0c165d8f537d03e0e64cf12e24a1575f070da3670ea4769b182ee59dcc20"} Jan 31 09:17:59 crc kubenswrapper[4732]: I0131 09:17:59.061382 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"485e2c17-77f1-4b13-ad2a-1afe1034b82e","Type":"ContainerDied","Data":"0630d2ca7e09803aac756e9568f7669b86f328b502c8fbec24c1ab333009da4d"} Jan 31 09:17:59 crc kubenswrapper[4732]: I0131 09:17:59.061399 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"485e2c17-77f1-4b13-ad2a-1afe1034b82e","Type":"ContainerDied","Data":"f72f8dedea2940a6aab6b6081a6ac32f4fe8a817a636c7292b3740e840aacaf8"} Jan 31 09:17:59 crc kubenswrapper[4732]: I0131 09:17:59.061416 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"485e2c17-77f1-4b13-ad2a-1afe1034b82e","Type":"ContainerDied","Data":"069a866bbc2edc4e0416ac1351e656e79b93160d557be633a57aa05933c9d37a"} Jan 31 09:17:59 crc kubenswrapper[4732]: I0131 09:17:59.061432 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"485e2c17-77f1-4b13-ad2a-1afe1034b82e","Type":"ContainerDied","Data":"0eb6a32653dab2879fd0e39ad42f712d86e6b8054d3507176f2d9dfda652811b"} Jan 31 09:17:59 crc kubenswrapper[4732]: I0131 09:17:59.067185 4732 generic.go:334] "Generic (PLEG): container finished" podID="410ee08c-4c6c-4012-aa46-264179923617" containerID="7f3c24b1866708a685d278257ae506a6fd79ff0289067bb9f41d1ad991336cc6" exitCode=0 Jan 31 09:17:59 crc kubenswrapper[4732]: I0131 09:17:59.067211 4732 generic.go:334] "Generic (PLEG): container finished" podID="410ee08c-4c6c-4012-aa46-264179923617" containerID="5e6d65fa1c26ce7ebf7ff25c277b074ca08845044a94dbdd6b32ff300ac68b7e" exitCode=0 Jan 31 09:17:59 crc kubenswrapper[4732]: I0131 09:17:59.067225 4732 generic.go:334] "Generic (PLEG): container finished" podID="410ee08c-4c6c-4012-aa46-264179923617" containerID="b045d34930a290c72208028dc6d1d6b3535e4b8818d51998647df06b3f7c2bdc" exitCode=0 Jan 31 09:17:59 crc kubenswrapper[4732]: I0131 09:17:59.067241 4732 generic.go:334] "Generic (PLEG): container finished" podID="410ee08c-4c6c-4012-aa46-264179923617" containerID="288678dcae1157f1c7ebe15c590533bc3accdd196dbe20808122f33180c2092e" exitCode=0 Jan 31 09:17:59 crc kubenswrapper[4732]: I0131 09:17:59.067253 4732 generic.go:334] "Generic (PLEG): container finished" podID="410ee08c-4c6c-4012-aa46-264179923617" containerID="20b5017b8e043011a79ad2661ea73a50a94d52bc115728a4ab154b985c7430df" exitCode=0 Jan 31 09:17:59 crc kubenswrapper[4732]: I0131 09:17:59.067265 4732 generic.go:334] "Generic (PLEG): container finished" podID="410ee08c-4c6c-4012-aa46-264179923617" containerID="73d8c40ae242fa0366cc532a47eefd15e47b7f5aeeb79eb1c8113e496beaf6e2" exitCode=0 Jan 31 09:17:59 crc kubenswrapper[4732]: I0131 09:17:59.067280 4732 generic.go:334] "Generic (PLEG): container finished" podID="410ee08c-4c6c-4012-aa46-264179923617" containerID="2ab9eec16ab6fbc15e432a8ef644dbb519f0a4dd256590fb6f9676ffceba4611" exitCode=0 Jan 31 09:17:59 crc kubenswrapper[4732]: I0131 09:17:59.067292 4732 generic.go:334] "Generic (PLEG): container finished" podID="410ee08c-4c6c-4012-aa46-264179923617" containerID="7f58b9ade96eae3a4c9b405f187c769232309e0c1fb241bf5444c4c304ab649f" exitCode=0 Jan 31 09:17:59 crc kubenswrapper[4732]: I0131 
09:17:59.067304 4732 generic.go:334] "Generic (PLEG): container finished" podID="410ee08c-4c6c-4012-aa46-264179923617" containerID="dbfd2c11334695f732524b533be449dd61ebf7d7637f32b3bcfe1d1941b37863" exitCode=0 Jan 31 09:17:59 crc kubenswrapper[4732]: I0131 09:17:59.067315 4732 generic.go:334] "Generic (PLEG): container finished" podID="410ee08c-4c6c-4012-aa46-264179923617" containerID="b63fa3de54dc547aaba5e3c717d85f707b81c7144e50e21b9e827db634951724" exitCode=0 Jan 31 09:17:59 crc kubenswrapper[4732]: I0131 09:17:59.067371 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"410ee08c-4c6c-4012-aa46-264179923617","Type":"ContainerDied","Data":"7f3c24b1866708a685d278257ae506a6fd79ff0289067bb9f41d1ad991336cc6"} Jan 31 09:17:59 crc kubenswrapper[4732]: I0131 09:17:59.067399 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"410ee08c-4c6c-4012-aa46-264179923617","Type":"ContainerDied","Data":"5e6d65fa1c26ce7ebf7ff25c277b074ca08845044a94dbdd6b32ff300ac68b7e"} Jan 31 09:17:59 crc kubenswrapper[4732]: I0131 09:17:59.067418 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"410ee08c-4c6c-4012-aa46-264179923617","Type":"ContainerDied","Data":"b045d34930a290c72208028dc6d1d6b3535e4b8818d51998647df06b3f7c2bdc"} Jan 31 09:17:59 crc kubenswrapper[4732]: I0131 09:17:59.067435 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"410ee08c-4c6c-4012-aa46-264179923617","Type":"ContainerDied","Data":"288678dcae1157f1c7ebe15c590533bc3accdd196dbe20808122f33180c2092e"} Jan 31 09:17:59 crc kubenswrapper[4732]: I0131 09:17:59.067452 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"410ee08c-4c6c-4012-aa46-264179923617","Type":"ContainerDied","Data":"20b5017b8e043011a79ad2661ea73a50a94d52bc115728a4ab154b985c7430df"} Jan 31 09:17:59 crc kubenswrapper[4732]: I0131 09:17:59.067470 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"410ee08c-4c6c-4012-aa46-264179923617","Type":"ContainerDied","Data":"73d8c40ae242fa0366cc532a47eefd15e47b7f5aeeb79eb1c8113e496beaf6e2"} Jan 31 09:17:59 crc kubenswrapper[4732]: I0131 09:17:59.067487 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"410ee08c-4c6c-4012-aa46-264179923617","Type":"ContainerDied","Data":"2ab9eec16ab6fbc15e432a8ef644dbb519f0a4dd256590fb6f9676ffceba4611"} Jan 31 09:17:59 crc kubenswrapper[4732]: I0131 09:17:59.067502 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"410ee08c-4c6c-4012-aa46-264179923617","Type":"ContainerDied","Data":"7f58b9ade96eae3a4c9b405f187c769232309e0c1fb241bf5444c4c304ab649f"} Jan 31 09:17:59 crc kubenswrapper[4732]: I0131 09:17:59.067518 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"410ee08c-4c6c-4012-aa46-264179923617","Type":"ContainerDied","Data":"dbfd2c11334695f732524b533be449dd61ebf7d7637f32b3bcfe1d1941b37863"} Jan 31 09:17:59 crc kubenswrapper[4732]: I0131 09:17:59.067534 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"410ee08c-4c6c-4012-aa46-264179923617","Type":"ContainerDied","Data":"b63fa3de54dc547aaba5e3c717d85f707b81c7144e50e21b9e827db634951724"} Jan 31 09:17:59 crc 
kubenswrapper[4732]: I0131 09:17:59.068995 4732 generic.go:334] "Generic (PLEG): container finished" podID="9e00e76d-2f89-454c-be3b-855e8186c78e" containerID="5603cb6f58751f09a186156f4f600f7ee6585d6eeac0b6ca57c68aa2fa8a4f68" exitCode=0 Jan 31 09:17:59 crc kubenswrapper[4732]: I0131 09:17:59.069262 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-proxy-7d8cf99555-f2jx8" event={"ID":"9e00e76d-2f89-454c-be3b-855e8186c78e","Type":"ContainerDied","Data":"5603cb6f58751f09a186156f4f600f7ee6585d6eeac0b6ca57c68aa2fa8a4f68"} Jan 31 09:17:59 crc kubenswrapper[4732]: I0131 09:17:59.479889 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2n2dh" Jan 31 09:17:59 crc kubenswrapper[4732]: I0131 09:17:59.588610 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="swift-kuttl-tests/swift-proxy-7d8cf99555-f2jx8" podUID="9e00e76d-2f89-454c-be3b-855e8186c78e" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.100:8080/healthcheck\": dial tcp 10.217.0.100:8080: connect: connection refused" Jan 31 09:17:59 crc kubenswrapper[4732]: I0131 09:17:59.588744 4732 prober.go:107] "Probe failed" probeType="Readiness" pod="swift-kuttl-tests/swift-proxy-7d8cf99555-f2jx8" podUID="9e00e76d-2f89-454c-be3b-855e8186c78e" containerName="proxy-server" probeResult="failure" output="Get \"http://10.217.0.100:8080/healthcheck\": dial tcp 10.217.0.100:8080: connect: connection refused" Jan 31 09:17:59 crc kubenswrapper[4732]: I0131 09:17:59.645339 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3323fc69-96ba-4767-aca9-a094ee4511fa-swiftconf\") pod \"3323fc69-96ba-4767-aca9-a094ee4511fa\" (UID: \"3323fc69-96ba-4767-aca9-a094ee4511fa\") " Jan 31 09:17:59 crc kubenswrapper[4732]: I0131 09:17:59.645470 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3323fc69-96ba-4767-aca9-a094ee4511fa-scripts\") pod \"3323fc69-96ba-4767-aca9-a094ee4511fa\" (UID: \"3323fc69-96ba-4767-aca9-a094ee4511fa\") " Jan 31 09:17:59 crc kubenswrapper[4732]: I0131 09:17:59.645524 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3323fc69-96ba-4767-aca9-a094ee4511fa-ring-data-devices\") pod \"3323fc69-96ba-4767-aca9-a094ee4511fa\" (UID: \"3323fc69-96ba-4767-aca9-a094ee4511fa\") " Jan 31 09:17:59 crc kubenswrapper[4732]: I0131 09:17:59.645609 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3323fc69-96ba-4767-aca9-a094ee4511fa-etc-swift\") pod \"3323fc69-96ba-4767-aca9-a094ee4511fa\" (UID: \"3323fc69-96ba-4767-aca9-a094ee4511fa\") " Jan 31 09:17:59 crc kubenswrapper[4732]: I0131 09:17:59.645771 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3323fc69-96ba-4767-aca9-a094ee4511fa-dispersionconf\") pod \"3323fc69-96ba-4767-aca9-a094ee4511fa\" (UID: \"3323fc69-96ba-4767-aca9-a094ee4511fa\") " Jan 31 09:17:59 crc kubenswrapper[4732]: I0131 09:17:59.645814 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4zd8h\" (UniqueName: \"kubernetes.io/projected/3323fc69-96ba-4767-aca9-a094ee4511fa-kube-api-access-4zd8h\") pod 
\"3323fc69-96ba-4767-aca9-a094ee4511fa\" (UID: \"3323fc69-96ba-4767-aca9-a094ee4511fa\") " Jan 31 09:17:59 crc kubenswrapper[4732]: I0131 09:17:59.647392 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3323fc69-96ba-4767-aca9-a094ee4511fa-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "3323fc69-96ba-4767-aca9-a094ee4511fa" (UID: "3323fc69-96ba-4767-aca9-a094ee4511fa"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:17:59 crc kubenswrapper[4732]: I0131 09:17:59.648123 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3323fc69-96ba-4767-aca9-a094ee4511fa-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "3323fc69-96ba-4767-aca9-a094ee4511fa" (UID: "3323fc69-96ba-4767-aca9-a094ee4511fa"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:17:59 crc kubenswrapper[4732]: I0131 09:17:59.661984 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3323fc69-96ba-4767-aca9-a094ee4511fa-kube-api-access-4zd8h" (OuterVolumeSpecName: "kube-api-access-4zd8h") pod "3323fc69-96ba-4767-aca9-a094ee4511fa" (UID: "3323fc69-96ba-4767-aca9-a094ee4511fa"). InnerVolumeSpecName "kube-api-access-4zd8h". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:17:59 crc kubenswrapper[4732]: I0131 09:17:59.669016 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3323fc69-96ba-4767-aca9-a094ee4511fa-scripts" (OuterVolumeSpecName: "scripts") pod "3323fc69-96ba-4767-aca9-a094ee4511fa" (UID: "3323fc69-96ba-4767-aca9-a094ee4511fa"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:17:59 crc kubenswrapper[4732]: I0131 09:17:59.670216 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3323fc69-96ba-4767-aca9-a094ee4511fa-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "3323fc69-96ba-4767-aca9-a094ee4511fa" (UID: "3323fc69-96ba-4767-aca9-a094ee4511fa"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:17:59 crc kubenswrapper[4732]: I0131 09:17:59.670773 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3323fc69-96ba-4767-aca9-a094ee4511fa-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "3323fc69-96ba-4767-aca9-a094ee4511fa" (UID: "3323fc69-96ba-4767-aca9-a094ee4511fa"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:17:59 crc kubenswrapper[4732]: I0131 09:17:59.747494 4732 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3323fc69-96ba-4767-aca9-a094ee4511fa-dispersionconf\") on node \"crc\" DevicePath \"\"" Jan 31 09:17:59 crc kubenswrapper[4732]: I0131 09:17:59.747526 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4zd8h\" (UniqueName: \"kubernetes.io/projected/3323fc69-96ba-4767-aca9-a094ee4511fa-kube-api-access-4zd8h\") on node \"crc\" DevicePath \"\"" Jan 31 09:17:59 crc kubenswrapper[4732]: I0131 09:17:59.747539 4732 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3323fc69-96ba-4767-aca9-a094ee4511fa-swiftconf\") on node \"crc\" DevicePath \"\"" Jan 31 09:17:59 crc kubenswrapper[4732]: I0131 09:17:59.747547 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3323fc69-96ba-4767-aca9-a094ee4511fa-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:17:59 crc kubenswrapper[4732]: I0131 09:17:59.747557 4732 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3323fc69-96ba-4767-aca9-a094ee4511fa-ring-data-devices\") on node \"crc\" DevicePath \"\"" Jan 31 09:17:59 crc kubenswrapper[4732]: I0131 09:17:59.747566 4732 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3323fc69-96ba-4767-aca9-a094ee4511fa-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 31 09:18:00 crc kubenswrapper[4732]: I0131 09:18:00.076940 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2n2dh" Jan 31 09:18:00 crc kubenswrapper[4732]: I0131 09:18:00.076960 4732 scope.go:117] "RemoveContainer" containerID="65b84d4ebc25368a071a65e4d20d0e365ba1b7010ab100f5e787fd36a34406cf" Jan 31 09:18:00 crc kubenswrapper[4732]: I0131 09:18:00.088045 4732 generic.go:334] "Generic (PLEG): container finished" podID="9e00e76d-2f89-454c-be3b-855e8186c78e" containerID="9fc5b88fc650585c4cf7595095cbcdfc58d2b17c1bb6d6f6621c5213c78ab1f0" exitCode=0 Jan 31 09:18:00 crc kubenswrapper[4732]: I0131 09:18:00.088122 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-proxy-7d8cf99555-f2jx8" event={"ID":"9e00e76d-2f89-454c-be3b-855e8186c78e","Type":"ContainerDied","Data":"9fc5b88fc650585c4cf7595095cbcdfc58d2b17c1bb6d6f6621c5213c78ab1f0"} Jan 31 09:18:00 crc kubenswrapper[4732]: I0131 09:18:00.101296 4732 generic.go:334] "Generic (PLEG): container finished" podID="20196d3e-600c-4a25-97ef-86f81bfae43b" containerID="3791aa9fdd8ee5b8255c509c99988f621034603e232b3c13e0287a092538c4df" exitCode=0 Jan 31 09:18:00 crc kubenswrapper[4732]: I0131 09:18:00.101327 4732 generic.go:334] "Generic (PLEG): container finished" podID="20196d3e-600c-4a25-97ef-86f81bfae43b" containerID="95aedeb15c9b618726f462f7211b9b97027622f3a2b198dc9717394e81868db5" exitCode=0 Jan 31 09:18:00 crc kubenswrapper[4732]: I0131 09:18:00.101336 4732 generic.go:334] "Generic (PLEG): container finished" podID="20196d3e-600c-4a25-97ef-86f81bfae43b" containerID="3ca9c58e45467d768a6b9a7ce40fa6bc0ba4cec3710eae69487dc0106ba42c67" exitCode=0 Jan 31 09:18:00 crc kubenswrapper[4732]: I0131 09:18:00.101326 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" 
event={"ID":"20196d3e-600c-4a25-97ef-86f81bfae43b","Type":"ContainerDied","Data":"3791aa9fdd8ee5b8255c509c99988f621034603e232b3c13e0287a092538c4df"} Jan 31 09:18:00 crc kubenswrapper[4732]: I0131 09:18:00.101369 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"20196d3e-600c-4a25-97ef-86f81bfae43b","Type":"ContainerDied","Data":"95aedeb15c9b618726f462f7211b9b97027622f3a2b198dc9717394e81868db5"} Jan 31 09:18:00 crc kubenswrapper[4732]: I0131 09:18:00.101382 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"20196d3e-600c-4a25-97ef-86f81bfae43b","Type":"ContainerDied","Data":"3ca9c58e45467d768a6b9a7ce40fa6bc0ba4cec3710eae69487dc0106ba42c67"} Jan 31 09:18:00 crc kubenswrapper[4732]: I0131 09:18:00.128886 4732 generic.go:334] "Generic (PLEG): container finished" podID="485e2c17-77f1-4b13-ad2a-1afe1034b82e" containerID="2038dec3545765c898c51198c887beb686d603d6257cd634255dc773c64b0920" exitCode=0 Jan 31 09:18:00 crc kubenswrapper[4732]: I0131 09:18:00.128913 4732 generic.go:334] "Generic (PLEG): container finished" podID="485e2c17-77f1-4b13-ad2a-1afe1034b82e" containerID="f59d6e8e79e5f65eaab0514cd4a4f2da1508f44198f23637971d0e31f2b0f2be" exitCode=0 Jan 31 09:18:00 crc kubenswrapper[4732]: I0131 09:18:00.128922 4732 generic.go:334] "Generic (PLEG): container finished" podID="485e2c17-77f1-4b13-ad2a-1afe1034b82e" containerID="ef329846cba5f043851b0fac85462ce13c21ac2e48891f331c0b61745c33b6de" exitCode=0 Jan 31 09:18:00 crc kubenswrapper[4732]: I0131 09:18:00.128939 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"485e2c17-77f1-4b13-ad2a-1afe1034b82e","Type":"ContainerDied","Data":"2038dec3545765c898c51198c887beb686d603d6257cd634255dc773c64b0920"} Jan 31 09:18:00 crc kubenswrapper[4732]: I0131 09:18:00.128981 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"485e2c17-77f1-4b13-ad2a-1afe1034b82e","Type":"ContainerDied","Data":"f59d6e8e79e5f65eaab0514cd4a4f2da1508f44198f23637971d0e31f2b0f2be"} Jan 31 09:18:00 crc kubenswrapper[4732]: I0131 09:18:00.128995 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"485e2c17-77f1-4b13-ad2a-1afe1034b82e","Type":"ContainerDied","Data":"ef329846cba5f043851b0fac85462ce13c21ac2e48891f331c0b61745c33b6de"} Jan 31 09:18:00 crc kubenswrapper[4732]: I0131 09:18:00.140611 4732 generic.go:334] "Generic (PLEG): container finished" podID="410ee08c-4c6c-4012-aa46-264179923617" containerID="db1b9074ef0795f714675af65f50c4b87204369e4a7ad86e330547b5ab8f05fb" exitCode=0 Jan 31 09:18:00 crc kubenswrapper[4732]: I0131 09:18:00.140645 4732 generic.go:334] "Generic (PLEG): container finished" podID="410ee08c-4c6c-4012-aa46-264179923617" containerID="7907616b4b6ec30fa97533949de2bee3dfcee1ddac0b222cb83f6c75835a1755" exitCode=0 Jan 31 09:18:00 crc kubenswrapper[4732]: I0131 09:18:00.140656 4732 generic.go:334] "Generic (PLEG): container finished" podID="410ee08c-4c6c-4012-aa46-264179923617" containerID="6a55e95bf79b05700a43bb7800d35ec7f862e81feb1caac0721521392f5f8e7f" exitCode=0 Jan 31 09:18:00 crc kubenswrapper[4732]: I0131 09:18:00.140682 4732 generic.go:334] "Generic (PLEG): container finished" podID="410ee08c-4c6c-4012-aa46-264179923617" containerID="61cb3b1d16a1d1d363ee3738835e89a0f2f6d3986bf10918c00e484f7f8c9ef7" exitCode=0 Jan 31 09:18:00 crc kubenswrapper[4732]: I0131 09:18:00.140708 4732 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"410ee08c-4c6c-4012-aa46-264179923617","Type":"ContainerDied","Data":"db1b9074ef0795f714675af65f50c4b87204369e4a7ad86e330547b5ab8f05fb"} Jan 31 09:18:00 crc kubenswrapper[4732]: I0131 09:18:00.140738 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"410ee08c-4c6c-4012-aa46-264179923617","Type":"ContainerDied","Data":"7907616b4b6ec30fa97533949de2bee3dfcee1ddac0b222cb83f6c75835a1755"} Jan 31 09:18:00 crc kubenswrapper[4732]: I0131 09:18:00.140751 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"410ee08c-4c6c-4012-aa46-264179923617","Type":"ContainerDied","Data":"6a55e95bf79b05700a43bb7800d35ec7f862e81feb1caac0721521392f5f8e7f"} Jan 31 09:18:00 crc kubenswrapper[4732]: I0131 09:18:00.140762 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"410ee08c-4c6c-4012-aa46-264179923617","Type":"ContainerDied","Data":"61cb3b1d16a1d1d363ee3738835e89a0f2f6d3986bf10918c00e484f7f8c9ef7"} Jan 31 09:18:00 crc kubenswrapper[4732]: I0131 09:18:00.400689 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-proxy-7d8cf99555-f2jx8" Jan 31 09:18:00 crc kubenswrapper[4732]: I0131 09:18:00.556219 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9e00e76d-2f89-454c-be3b-855e8186c78e-etc-swift\") pod \"9e00e76d-2f89-454c-be3b-855e8186c78e\" (UID: \"9e00e76d-2f89-454c-be3b-855e8186c78e\") " Jan 31 09:18:00 crc kubenswrapper[4732]: I0131 09:18:00.556303 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9e00e76d-2f89-454c-be3b-855e8186c78e-log-httpd\") pod \"9e00e76d-2f89-454c-be3b-855e8186c78e\" (UID: \"9e00e76d-2f89-454c-be3b-855e8186c78e\") " Jan 31 09:18:00 crc kubenswrapper[4732]: I0131 09:18:00.556405 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5k5xz\" (UniqueName: \"kubernetes.io/projected/9e00e76d-2f89-454c-be3b-855e8186c78e-kube-api-access-5k5xz\") pod \"9e00e76d-2f89-454c-be3b-855e8186c78e\" (UID: \"9e00e76d-2f89-454c-be3b-855e8186c78e\") " Jan 31 09:18:00 crc kubenswrapper[4732]: I0131 09:18:00.556432 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e00e76d-2f89-454c-be3b-855e8186c78e-config-data\") pod \"9e00e76d-2f89-454c-be3b-855e8186c78e\" (UID: \"9e00e76d-2f89-454c-be3b-855e8186c78e\") " Jan 31 09:18:00 crc kubenswrapper[4732]: I0131 09:18:00.556471 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9e00e76d-2f89-454c-be3b-855e8186c78e-run-httpd\") pod \"9e00e76d-2f89-454c-be3b-855e8186c78e\" (UID: \"9e00e76d-2f89-454c-be3b-855e8186c78e\") " Jan 31 09:18:00 crc kubenswrapper[4732]: I0131 09:18:00.556945 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e00e76d-2f89-454c-be3b-855e8186c78e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "9e00e76d-2f89-454c-be3b-855e8186c78e" (UID: "9e00e76d-2f89-454c-be3b-855e8186c78e"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:18:00 crc kubenswrapper[4732]: I0131 09:18:00.557332 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e00e76d-2f89-454c-be3b-855e8186c78e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "9e00e76d-2f89-454c-be3b-855e8186c78e" (UID: "9e00e76d-2f89-454c-be3b-855e8186c78e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:18:00 crc kubenswrapper[4732]: I0131 09:18:00.557579 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3323fc69-96ba-4767-aca9-a094ee4511fa" path="/var/lib/kubelet/pods/3323fc69-96ba-4767-aca9-a094ee4511fa/volumes" Jan 31 09:18:00 crc kubenswrapper[4732]: I0131 09:18:00.561815 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e00e76d-2f89-454c-be3b-855e8186c78e-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "9e00e76d-2f89-454c-be3b-855e8186c78e" (UID: "9e00e76d-2f89-454c-be3b-855e8186c78e"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:18:00 crc kubenswrapper[4732]: I0131 09:18:00.570032 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e00e76d-2f89-454c-be3b-855e8186c78e-kube-api-access-5k5xz" (OuterVolumeSpecName: "kube-api-access-5k5xz") pod "9e00e76d-2f89-454c-be3b-855e8186c78e" (UID: "9e00e76d-2f89-454c-be3b-855e8186c78e"). InnerVolumeSpecName "kube-api-access-5k5xz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:18:00 crc kubenswrapper[4732]: I0131 09:18:00.592344 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e00e76d-2f89-454c-be3b-855e8186c78e-config-data" (OuterVolumeSpecName: "config-data") pod "9e00e76d-2f89-454c-be3b-855e8186c78e" (UID: "9e00e76d-2f89-454c-be3b-855e8186c78e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:18:00 crc kubenswrapper[4732]: I0131 09:18:00.659033 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5k5xz\" (UniqueName: \"kubernetes.io/projected/9e00e76d-2f89-454c-be3b-855e8186c78e-kube-api-access-5k5xz\") on node \"crc\" DevicePath \"\"" Jan 31 09:18:00 crc kubenswrapper[4732]: I0131 09:18:00.659120 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e00e76d-2f89-454c-be3b-855e8186c78e-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 09:18:00 crc kubenswrapper[4732]: I0131 09:18:00.659144 4732 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9e00e76d-2f89-454c-be3b-855e8186c78e-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 31 09:18:00 crc kubenswrapper[4732]: I0131 09:18:00.659163 4732 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9e00e76d-2f89-454c-be3b-855e8186c78e-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 31 09:18:00 crc kubenswrapper[4732]: I0131 09:18:00.659183 4732 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9e00e76d-2f89-454c-be3b-855e8186c78e-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 31 09:18:01 crc kubenswrapper[4732]: I0131 09:18:01.153302 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-proxy-7d8cf99555-f2jx8" event={"ID":"9e00e76d-2f89-454c-be3b-855e8186c78e","Type":"ContainerDied","Data":"70a5df12fcc55bf7e0349357dc9e7a70341a9c65f4cf5241e077476ae04d6820"} Jan 31 09:18:01 crc kubenswrapper[4732]: I0131 09:18:01.153754 4732 scope.go:117] "RemoveContainer" containerID="9fc5b88fc650585c4cf7595095cbcdfc58d2b17c1bb6d6f6621c5213c78ab1f0" Jan 31 09:18:01 crc kubenswrapper[4732]: I0131 09:18:01.153391 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-proxy-7d8cf99555-f2jx8" Jan 31 09:18:01 crc kubenswrapper[4732]: I0131 09:18:01.177266 4732 scope.go:117] "RemoveContainer" containerID="5603cb6f58751f09a186156f4f600f7ee6585d6eeac0b6ca57c68aa2fa8a4f68" Jan 31 09:18:01 crc kubenswrapper[4732]: I0131 09:18:01.195886 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-proxy-7d8cf99555-f2jx8"] Jan 31 09:18:01 crc kubenswrapper[4732]: I0131 09:18:01.208158 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-proxy-7d8cf99555-f2jx8"] Jan 31 09:18:02 crc kubenswrapper[4732]: I0131 09:18:02.555818 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e00e76d-2f89-454c-be3b-855e8186c78e" path="/var/lib/kubelet/pods/9e00e76d-2f89-454c-be3b-855e8186c78e/volumes" Jan 31 09:18:28 crc kubenswrapper[4732]: I0131 09:18:28.435829 4732 generic.go:334] "Generic (PLEG): container finished" podID="410ee08c-4c6c-4012-aa46-264179923617" containerID="81f782a2d54156b4990bcbb2dab091aeb970934a10ac85e7abc67c813255ffee" exitCode=137 Jan 31 09:18:28 crc kubenswrapper[4732]: I0131 09:18:28.435872 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"410ee08c-4c6c-4012-aa46-264179923617","Type":"ContainerDied","Data":"81f782a2d54156b4990bcbb2dab091aeb970934a10ac85e7abc67c813255ffee"} Jan 31 09:18:28 crc kubenswrapper[4732]: I0131 09:18:28.450193 4732 generic.go:334] "Generic (PLEG): container finished" podID="20196d3e-600c-4a25-97ef-86f81bfae43b" containerID="07ba04fcdc461974d0c6b09ac9da6b0c6610f36749f50f5eb47e063be68f5291" exitCode=137 Jan 31 09:18:28 crc kubenswrapper[4732]: I0131 09:18:28.450247 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"20196d3e-600c-4a25-97ef-86f81bfae43b","Type":"ContainerDied","Data":"07ba04fcdc461974d0c6b09ac9da6b0c6610f36749f50f5eb47e063be68f5291"} Jan 31 09:18:28 crc kubenswrapper[4732]: I0131 09:18:28.468725 4732 generic.go:334] "Generic (PLEG): container finished" podID="485e2c17-77f1-4b13-ad2a-1afe1034b82e" containerID="29d6915d3be45a209958c78f32e6cbfa5546c65adc3eb566381972012beea587" exitCode=137 Jan 31 09:18:28 crc kubenswrapper[4732]: I0131 09:18:28.468778 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"485e2c17-77f1-4b13-ad2a-1afe1034b82e","Type":"ContainerDied","Data":"29d6915d3be45a209958c78f32e6cbfa5546c65adc3eb566381972012beea587"} Jan 31 09:18:28 crc kubenswrapper[4732]: I0131 09:18:28.731301 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-storage-1" Jan 31 09:18:28 crc kubenswrapper[4732]: I0131 09:18:28.736575 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:18:28 crc kubenswrapper[4732]: I0131 09:18:28.739938 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-storage-2" Jan 31 09:18:28 crc kubenswrapper[4732]: I0131 09:18:28.792587 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/20196d3e-600c-4a25-97ef-86f81bfae43b-etc-swift\") pod \"20196d3e-600c-4a25-97ef-86f81bfae43b\" (UID: \"20196d3e-600c-4a25-97ef-86f81bfae43b\") " Jan 31 09:18:28 crc kubenswrapper[4732]: I0131 09:18:28.792687 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/410ee08c-4c6c-4012-aa46-264179923617-cache\") pod \"410ee08c-4c6c-4012-aa46-264179923617\" (UID: \"410ee08c-4c6c-4012-aa46-264179923617\") " Jan 31 09:18:28 crc kubenswrapper[4732]: I0131 09:18:28.792716 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/485e2c17-77f1-4b13-ad2a-1afe1034b82e-lock\") pod \"485e2c17-77f1-4b13-ad2a-1afe1034b82e\" (UID: \"485e2c17-77f1-4b13-ad2a-1afe1034b82e\") " Jan 31 09:18:28 crc kubenswrapper[4732]: I0131 09:18:28.792743 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/410ee08c-4c6c-4012-aa46-264179923617-lock\") pod \"410ee08c-4c6c-4012-aa46-264179923617\" (UID: \"410ee08c-4c6c-4012-aa46-264179923617\") " Jan 31 09:18:28 crc kubenswrapper[4732]: I0131 09:18:28.792778 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/485e2c17-77f1-4b13-ad2a-1afe1034b82e-etc-swift\") pod \"485e2c17-77f1-4b13-ad2a-1afe1034b82e\" (UID: \"485e2c17-77f1-4b13-ad2a-1afe1034b82e\") " Jan 31 09:18:28 crc kubenswrapper[4732]: I0131 09:18:28.792825 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pt7ch\" (UniqueName: \"kubernetes.io/projected/410ee08c-4c6c-4012-aa46-264179923617-kube-api-access-pt7ch\") pod \"410ee08c-4c6c-4012-aa46-264179923617\" (UID: \"410ee08c-4c6c-4012-aa46-264179923617\") " Jan 31 09:18:28 crc kubenswrapper[4732]: I0131 09:18:28.792852 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/20196d3e-600c-4a25-97ef-86f81bfae43b-lock\") pod \"20196d3e-600c-4a25-97ef-86f81bfae43b\" (UID: \"20196d3e-600c-4a25-97ef-86f81bfae43b\") " Jan 31 09:18:28 crc kubenswrapper[4732]: I0131 09:18:28.792877 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/485e2c17-77f1-4b13-ad2a-1afe1034b82e-cache\") pod \"485e2c17-77f1-4b13-ad2a-1afe1034b82e\" (UID: \"485e2c17-77f1-4b13-ad2a-1afe1034b82e\") " Jan 31 09:18:28 crc kubenswrapper[4732]: I0131 09:18:28.792895 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"410ee08c-4c6c-4012-aa46-264179923617\" (UID: \"410ee08c-4c6c-4012-aa46-264179923617\") " Jan 31 09:18:28 crc kubenswrapper[4732]: I0131 09:18:28.792940 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"485e2c17-77f1-4b13-ad2a-1afe1034b82e\" (UID: \"485e2c17-77f1-4b13-ad2a-1afe1034b82e\") " Jan 31 09:18:28 crc kubenswrapper[4732]: I0131 09:18:28.792961 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/410ee08c-4c6c-4012-aa46-264179923617-etc-swift\") pod \"410ee08c-4c6c-4012-aa46-264179923617\" (UID: \"410ee08c-4c6c-4012-aa46-264179923617\") " Jan 31 09:18:28 crc kubenswrapper[4732]: I0131 09:18:28.792981 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/20196d3e-600c-4a25-97ef-86f81bfae43b-cache\") pod \"20196d3e-600c-4a25-97ef-86f81bfae43b\" (UID: \"20196d3e-600c-4a25-97ef-86f81bfae43b\") " Jan 31 09:18:28 crc kubenswrapper[4732]: I0131 09:18:28.793028 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5gtpt\" (UniqueName: \"kubernetes.io/projected/20196d3e-600c-4a25-97ef-86f81bfae43b-kube-api-access-5gtpt\") pod \"20196d3e-600c-4a25-97ef-86f81bfae43b\" (UID: \"20196d3e-600c-4a25-97ef-86f81bfae43b\") " Jan 31 09:18:28 crc kubenswrapper[4732]: I0131 09:18:28.793066 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kqsrn\" (UniqueName: \"kubernetes.io/projected/485e2c17-77f1-4b13-ad2a-1afe1034b82e-kube-api-access-kqsrn\") pod \"485e2c17-77f1-4b13-ad2a-1afe1034b82e\" (UID: \"485e2c17-77f1-4b13-ad2a-1afe1034b82e\") " Jan 31 09:18:28 crc kubenswrapper[4732]: I0131 09:18:28.793102 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"20196d3e-600c-4a25-97ef-86f81bfae43b\" (UID: \"20196d3e-600c-4a25-97ef-86f81bfae43b\") " Jan 31 09:18:28 crc kubenswrapper[4732]: I0131 09:18:28.795778 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/485e2c17-77f1-4b13-ad2a-1afe1034b82e-cache" (OuterVolumeSpecName: "cache") pod "485e2c17-77f1-4b13-ad2a-1afe1034b82e" (UID: "485e2c17-77f1-4b13-ad2a-1afe1034b82e"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:18:28 crc kubenswrapper[4732]: I0131 09:18:28.796777 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/485e2c17-77f1-4b13-ad2a-1afe1034b82e-lock" (OuterVolumeSpecName: "lock") pod "485e2c17-77f1-4b13-ad2a-1afe1034b82e" (UID: "485e2c17-77f1-4b13-ad2a-1afe1034b82e"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:18:28 crc kubenswrapper[4732]: I0131 09:18:28.796884 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20196d3e-600c-4a25-97ef-86f81bfae43b-cache" (OuterVolumeSpecName: "cache") pod "20196d3e-600c-4a25-97ef-86f81bfae43b" (UID: "20196d3e-600c-4a25-97ef-86f81bfae43b"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:18:28 crc kubenswrapper[4732]: I0131 09:18:28.799108 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/410ee08c-4c6c-4012-aa46-264179923617-cache" (OuterVolumeSpecName: "cache") pod "410ee08c-4c6c-4012-aa46-264179923617" (UID: "410ee08c-4c6c-4012-aa46-264179923617"). InnerVolumeSpecName "cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:18:28 crc kubenswrapper[4732]: I0131 09:18:28.799531 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20196d3e-600c-4a25-97ef-86f81bfae43b-lock" (OuterVolumeSpecName: "lock") pod "20196d3e-600c-4a25-97ef-86f81bfae43b" (UID: "20196d3e-600c-4a25-97ef-86f81bfae43b"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:18:28 crc kubenswrapper[4732]: I0131 09:18:28.799905 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/410ee08c-4c6c-4012-aa46-264179923617-lock" (OuterVolumeSpecName: "lock") pod "410ee08c-4c6c-4012-aa46-264179923617" (UID: "410ee08c-4c6c-4012-aa46-264179923617"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:18:28 crc kubenswrapper[4732]: I0131 09:18:28.800456 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/410ee08c-4c6c-4012-aa46-264179923617-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "410ee08c-4c6c-4012-aa46-264179923617" (UID: "410ee08c-4c6c-4012-aa46-264179923617"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:18:28 crc kubenswrapper[4732]: I0131 09:18:28.800555 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20196d3e-600c-4a25-97ef-86f81bfae43b-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "20196d3e-600c-4a25-97ef-86f81bfae43b" (UID: "20196d3e-600c-4a25-97ef-86f81bfae43b"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:18:28 crc kubenswrapper[4732]: I0131 09:18:28.800739 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "swift") pod "20196d3e-600c-4a25-97ef-86f81bfae43b" (UID: "20196d3e-600c-4a25-97ef-86f81bfae43b"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 09:18:28 crc kubenswrapper[4732]: I0131 09:18:28.802624 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "swift") pod "485e2c17-77f1-4b13-ad2a-1afe1034b82e" (UID: "485e2c17-77f1-4b13-ad2a-1afe1034b82e"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 09:18:28 crc kubenswrapper[4732]: I0131 09:18:28.803450 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/485e2c17-77f1-4b13-ad2a-1afe1034b82e-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "485e2c17-77f1-4b13-ad2a-1afe1034b82e" (UID: "485e2c17-77f1-4b13-ad2a-1afe1034b82e"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:18:28 crc kubenswrapper[4732]: I0131 09:18:28.806107 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "swift") pod "410ee08c-4c6c-4012-aa46-264179923617" (UID: "410ee08c-4c6c-4012-aa46-264179923617"). InnerVolumeSpecName "local-storage06-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 09:18:28 crc kubenswrapper[4732]: I0131 09:18:28.807149 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/410ee08c-4c6c-4012-aa46-264179923617-kube-api-access-pt7ch" (OuterVolumeSpecName: "kube-api-access-pt7ch") pod "410ee08c-4c6c-4012-aa46-264179923617" (UID: "410ee08c-4c6c-4012-aa46-264179923617"). InnerVolumeSpecName "kube-api-access-pt7ch". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:18:28 crc kubenswrapper[4732]: I0131 09:18:28.808824 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/485e2c17-77f1-4b13-ad2a-1afe1034b82e-kube-api-access-kqsrn" (OuterVolumeSpecName: "kube-api-access-kqsrn") pod "485e2c17-77f1-4b13-ad2a-1afe1034b82e" (UID: "485e2c17-77f1-4b13-ad2a-1afe1034b82e"). InnerVolumeSpecName "kube-api-access-kqsrn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:18:28 crc kubenswrapper[4732]: I0131 09:18:28.812866 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20196d3e-600c-4a25-97ef-86f81bfae43b-kube-api-access-5gtpt" (OuterVolumeSpecName: "kube-api-access-5gtpt") pod "20196d3e-600c-4a25-97ef-86f81bfae43b" (UID: "20196d3e-600c-4a25-97ef-86f81bfae43b"). InnerVolumeSpecName "kube-api-access-5gtpt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:18:28 crc kubenswrapper[4732]: I0131 09:18:28.894320 4732 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/20196d3e-600c-4a25-97ef-86f81bfae43b-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 31 09:18:28 crc kubenswrapper[4732]: I0131 09:18:28.894363 4732 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/410ee08c-4c6c-4012-aa46-264179923617-cache\") on node \"crc\" DevicePath \"\"" Jan 31 09:18:28 crc kubenswrapper[4732]: I0131 09:18:28.894373 4732 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/485e2c17-77f1-4b13-ad2a-1afe1034b82e-lock\") on node \"crc\" DevicePath \"\"" Jan 31 09:18:28 crc kubenswrapper[4732]: I0131 09:18:28.894381 4732 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/410ee08c-4c6c-4012-aa46-264179923617-lock\") on node \"crc\" DevicePath \"\"" Jan 31 09:18:28 crc kubenswrapper[4732]: I0131 09:18:28.894390 4732 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/485e2c17-77f1-4b13-ad2a-1afe1034b82e-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 31 09:18:28 crc kubenswrapper[4732]: I0131 09:18:28.894399 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pt7ch\" (UniqueName: \"kubernetes.io/projected/410ee08c-4c6c-4012-aa46-264179923617-kube-api-access-pt7ch\") on node \"crc\" DevicePath \"\"" Jan 31 09:18:28 crc kubenswrapper[4732]: I0131 09:18:28.894411 4732 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/20196d3e-600c-4a25-97ef-86f81bfae43b-lock\") on node \"crc\" DevicePath \"\"" Jan 31 09:18:28 crc kubenswrapper[4732]: I0131 09:18:28.894419 4732 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/485e2c17-77f1-4b13-ad2a-1afe1034b82e-cache\") on node \"crc\" DevicePath \"\"" Jan 31 09:18:28 crc kubenswrapper[4732]: I0131 09:18:28.894451 
4732 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Jan 31 09:18:28 crc kubenswrapper[4732]: I0131 09:18:28.894464 4732 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Jan 31 09:18:28 crc kubenswrapper[4732]: I0131 09:18:28.894472 4732 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/410ee08c-4c6c-4012-aa46-264179923617-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 31 09:18:28 crc kubenswrapper[4732]: I0131 09:18:28.894481 4732 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/20196d3e-600c-4a25-97ef-86f81bfae43b-cache\") on node \"crc\" DevicePath \"\"" Jan 31 09:18:28 crc kubenswrapper[4732]: I0131 09:18:28.894489 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5gtpt\" (UniqueName: \"kubernetes.io/projected/20196d3e-600c-4a25-97ef-86f81bfae43b-kube-api-access-5gtpt\") on node \"crc\" DevicePath \"\"" Jan 31 09:18:28 crc kubenswrapper[4732]: I0131 09:18:28.894498 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kqsrn\" (UniqueName: \"kubernetes.io/projected/485e2c17-77f1-4b13-ad2a-1afe1034b82e-kube-api-access-kqsrn\") on node \"crc\" DevicePath \"\"" Jan 31 09:18:28 crc kubenswrapper[4732]: I0131 09:18:28.894511 4732 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Jan 31 09:18:28 crc kubenswrapper[4732]: I0131 09:18:28.906973 4732 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Jan 31 09:18:28 crc kubenswrapper[4732]: I0131 09:18:28.908712 4732 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Jan 31 09:18:28 crc kubenswrapper[4732]: I0131 09:18:28.909710 4732 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Jan 31 09:18:28 crc kubenswrapper[4732]: I0131 09:18:28.996500 4732 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Jan 31 09:18:28 crc kubenswrapper[4732]: I0131 09:18:28.996540 4732 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Jan 31 09:18:28 crc kubenswrapper[4732]: I0131 09:18:28.996549 4732 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Jan 31 09:18:29 crc kubenswrapper[4732]: I0131 09:18:29.492838 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"20196d3e-600c-4a25-97ef-86f81bfae43b","Type":"ContainerDied","Data":"87f5d0b315407f4d9f600594a48f694f95c4330a184853c9d3591930eb2df4a1"} Jan 31 09:18:29 crc kubenswrapper[4732]: I0131 
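
Note the ordering for the kubernetes.io/local-volume volumes here: "UnmountDevice started", then "UnmountDevice succeeded", and only then "Volume detached". For a local volume that means the global device mount must be released before the reconciler clears its state (the device mount path, /mnt/openstack/pv06, reappears when the volume is remounted further down). A simplified sketch of that guarantee using k8s.io/mount-utils; the real kubelet routes this through the volume plugin, so the helper below is illustrative:

    package localvol

    import (
        "fmt"

        mount "k8s.io/mount-utils"
    )

    // unmountIfMounted mirrors what UnmountDevice has to establish for a
    // local volume: the device mount path (e.g. /mnt/openstack/pv06) must
    // no longer be a mount point before "Volume detached" can be recorded.
    func unmountIfMounted(path string) error {
        m := mount.New("") // default system mounter
        notMnt, err := m.IsLikelyNotMountPoint(path)
        if err != nil {
            return err
        }
        if notMnt {
            fmt.Println(path, "already unmounted")
            return nil
        }
        return m.Unmount(path)
    }
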
09:18:29.492901 4732 scope.go:117] "RemoveContainer" containerID="07ba04fcdc461974d0c6b09ac9da6b0c6610f36749f50f5eb47e063be68f5291" Jan 31 09:18:29 crc kubenswrapper[4732]: I0131 09:18:29.493085 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-storage-2" Jan 31 09:18:29 crc kubenswrapper[4732]: I0131 09:18:29.507626 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-storage-1" Jan 31 09:18:29 crc kubenswrapper[4732]: I0131 09:18:29.507895 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"485e2c17-77f1-4b13-ad2a-1afe1034b82e","Type":"ContainerDied","Data":"74e38e927b7f9f5222a587afffa38799ef87d240853e804c1d4e908457454209"} Jan 31 09:18:29 crc kubenswrapper[4732]: I0131 09:18:29.518540 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"410ee08c-4c6c-4012-aa46-264179923617","Type":"ContainerDied","Data":"bd56bd5520be61343a090a3833bf69ab0722a78470f84b63a8f6a8b06d85cd3e"} Jan 31 09:18:29 crc kubenswrapper[4732]: I0131 09:18:29.518722 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:18:29 crc kubenswrapper[4732]: I0131 09:18:29.519740 4732 scope.go:117] "RemoveContainer" containerID="3791aa9fdd8ee5b8255c509c99988f621034603e232b3c13e0287a092538c4df" Jan 31 09:18:29 crc kubenswrapper[4732]: I0131 09:18:29.537320 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-storage-2"] Jan 31 09:18:29 crc kubenswrapper[4732]: I0131 09:18:29.544615 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-storage-2"] Jan 31 09:18:29 crc kubenswrapper[4732]: I0131 09:18:29.547947 4732 scope.go:117] "RemoveContainer" containerID="bd9e9b782c4457e3bdc1029eec72a89f862eb48d009287030b3d15352e63d945" Jan 31 09:18:29 crc kubenswrapper[4732]: I0131 09:18:29.573081 4732 scope.go:117] "RemoveContainer" containerID="aba53546aab560ce992f9ed9d7c70a191eee00bcb246cc44c3bcf9dc6c6a964d" Jan 31 09:18:29 crc kubenswrapper[4732]: I0131 09:18:29.586366 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-storage-1"] Jan 31 09:18:29 crc kubenswrapper[4732]: I0131 09:18:29.593219 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-storage-1"] Jan 31 09:18:29 crc kubenswrapper[4732]: I0131 09:18:29.593985 4732 scope.go:117] "RemoveContainer" containerID="87b04e76193dc458b5e96db200183f64f7cda5171af7bf59a9b3b47ca7c00582" Jan 31 09:18:29 crc kubenswrapper[4732]: I0131 09:18:29.600008 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-storage-0"] Jan 31 09:18:29 crc kubenswrapper[4732]: I0131 09:18:29.606633 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-storage-0"] Jan 31 09:18:29 crc kubenswrapper[4732]: I0131 09:18:29.610735 4732 scope.go:117] "RemoveContainer" containerID="23a3dbdcb5e8c0d9b5e2b66e270ff60b28361fccb19ac39e68496d454544c773" Jan 31 09:18:29 crc kubenswrapper[4732]: I0131 09:18:29.624819 4732 scope.go:117] "RemoveContainer" containerID="95aedeb15c9b618726f462f7211b9b97027622f3a2b198dc9717394e81868db5" Jan 31 09:18:29 crc kubenswrapper[4732]: I0131 09:18:29.638750 4732 scope.go:117] "RemoveContainer" containerID="dcca573b69212cf173983d6090be0a12978f5decb7c01cbd3779f1d4d2a6b0d0" Jan 31 09:18:29 crc kubenswrapper[4732]: I0131 
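
The entries here are the normal deletion handshake for the three swift-storage pods: "SyncLoop DELETE" when the API object is marked for deletion, PLEG "ContainerDied" as the sandboxes exit, then "SyncLoop REMOVE" once the object is gone from the API server, after which the "RemoveContainer" calls garbage-collect the dead container records. A client-go sketch that watches for the final transition; the function name is made up for illustration:

    package podwatch

    import (
        "context"
        "fmt"

        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/apimachinery/pkg/watch"
        "k8s.io/client-go/kubernetes"
    )

    // waitForPodGone blocks until the API server reports the pod deleted,
    // the same transition the kubelet logs as "SyncLoop REMOVE".
    func waitForPodGone(ctx context.Context, cs kubernetes.Interface, ns, name string) error {
        w, err := cs.CoreV1().Pods(ns).Watch(ctx, metav1.ListOptions{
            FieldSelector: "metadata.name=" + name,
        })
        if err != nil {
            return err
        }
        defer w.Stop()
        for ev := range w.ResultChan() {
            fmt.Println("event:", ev.Type)
            if ev.Type == watch.Deleted {
                return nil // REMOVE: object is gone; kubelet may now GC
            }
        }
        return ctx.Err()
    }
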
09:18:29.659281 4732 scope.go:117] "RemoveContainer" containerID="de878ffff4963739e745def44cb56f453584bcd3f75e8cb3d9b85f0ca8d4e512" Jan 31 09:18:29 crc kubenswrapper[4732]: I0131 09:18:29.674277 4732 scope.go:117] "RemoveContainer" containerID="750de579c99e37a98c19df2a11fee436a72f515c72747a3e91a0abc97cd07385" Jan 31 09:18:29 crc kubenswrapper[4732]: I0131 09:18:29.687316 4732 scope.go:117] "RemoveContainer" containerID="3ca9c58e45467d768a6b9a7ce40fa6bc0ba4cec3710eae69487dc0106ba42c67" Jan 31 09:18:29 crc kubenswrapper[4732]: I0131 09:18:29.699768 4732 scope.go:117] "RemoveContainer" containerID="f6d112eeaf369a7cae9a04806380ef040d7989006ab740a5b0d945ef5f7f317b" Jan 31 09:18:29 crc kubenswrapper[4732]: I0131 09:18:29.720945 4732 scope.go:117] "RemoveContainer" containerID="5d118436256aaad4b93c230d8fa0456e8cc6f3c5ca6ce4c9d7f30f8db3c450ff" Jan 31 09:18:29 crc kubenswrapper[4732]: I0131 09:18:29.736337 4732 scope.go:117] "RemoveContainer" containerID="bcb9d265c1a8c03fc26ba3b2f05fa7fcceb6cc59b711a2e0dd620cf846db5469" Jan 31 09:18:29 crc kubenswrapper[4732]: I0131 09:18:29.752118 4732 scope.go:117] "RemoveContainer" containerID="f47b66d5b1e0a89ad9b955fbf6dde3f0b21e522e86e33f379a1e2379b7919989" Jan 31 09:18:29 crc kubenswrapper[4732]: I0131 09:18:29.767507 4732 scope.go:117] "RemoveContainer" containerID="29d6915d3be45a209958c78f32e6cbfa5546c65adc3eb566381972012beea587" Jan 31 09:18:29 crc kubenswrapper[4732]: I0131 09:18:29.783984 4732 scope.go:117] "RemoveContainer" containerID="2038dec3545765c898c51198c887beb686d603d6257cd634255dc773c64b0920" Jan 31 09:18:29 crc kubenswrapper[4732]: I0131 09:18:29.799400 4732 scope.go:117] "RemoveContainer" containerID="171dd93dab917d5e3ad3010e206ac5f2aa6f0312273716ac344a73726703fbda" Jan 31 09:18:29 crc kubenswrapper[4732]: I0131 09:18:29.812897 4732 scope.go:117] "RemoveContainer" containerID="1839639d5b9952c70ff36e7c3b972e786bbfb15314211dc0fe15a6700ac4d580" Jan 31 09:18:29 crc kubenswrapper[4732]: I0131 09:18:29.826205 4732 scope.go:117] "RemoveContainer" containerID="17809bd753b75791dd2fa65451f79147382e0f41d6052d606f342c4037233a9a" Jan 31 09:18:29 crc kubenswrapper[4732]: I0131 09:18:29.841728 4732 scope.go:117] "RemoveContainer" containerID="0939cdb203f97f24dcd9e0d0769010bdb9148423fa723be6da835b8c3fcd94a3" Jan 31 09:18:29 crc kubenswrapper[4732]: I0131 09:18:29.868084 4732 scope.go:117] "RemoveContainer" containerID="f59d6e8e79e5f65eaab0514cd4a4f2da1508f44198f23637971d0e31f2b0f2be" Jan 31 09:18:29 crc kubenswrapper[4732]: I0131 09:18:29.891853 4732 scope.go:117] "RemoveContainer" containerID="6c6f43f8261cf2448edf43a43564636ba04ed0b128e0df16f2584d3972a973a9" Jan 31 09:18:29 crc kubenswrapper[4732]: I0131 09:18:29.907053 4732 scope.go:117] "RemoveContainer" containerID="b26aaf7029934891de5d0e92f8463ca1f8e106c401d898590351656c24bbc6e8" Jan 31 09:18:29 crc kubenswrapper[4732]: I0131 09:18:29.919874 4732 scope.go:117] "RemoveContainer" containerID="ecfb0c165d8f537d03e0e64cf12e24a1575f070da3670ea4769b182ee59dcc20" Jan 31 09:18:29 crc kubenswrapper[4732]: I0131 09:18:29.938827 4732 scope.go:117] "RemoveContainer" containerID="0630d2ca7e09803aac756e9568f7669b86f328b502c8fbec24c1ab333009da4d" Jan 31 09:18:29 crc kubenswrapper[4732]: I0131 09:18:29.955439 4732 scope.go:117] "RemoveContainer" containerID="f72f8dedea2940a6aab6b6081a6ac32f4fe8a817a636c7292b3740e840aacaf8" Jan 31 09:18:29 crc kubenswrapper[4732]: I0131 09:18:29.974172 4732 scope.go:117] "RemoveContainer" containerID="069a866bbc2edc4e0416ac1351e656e79b93160d557be633a57aa05933c9d37a" Jan 31 09:18:29 crc 
kubenswrapper[4732]: I0131 09:18:29.990202 4732 scope.go:117] "RemoveContainer" containerID="0eb6a32653dab2879fd0e39ad42f712d86e6b8054d3507176f2d9dfda652811b" Jan 31 09:18:30 crc kubenswrapper[4732]: I0131 09:18:30.008298 4732 scope.go:117] "RemoveContainer" containerID="ef329846cba5f043851b0fac85462ce13c21ac2e48891f331c0b61745c33b6de" Jan 31 09:18:30 crc kubenswrapper[4732]: I0131 09:18:30.025153 4732 scope.go:117] "RemoveContainer" containerID="81f782a2d54156b4990bcbb2dab091aeb970934a10ac85e7abc67c813255ffee" Jan 31 09:18:30 crc kubenswrapper[4732]: I0131 09:18:30.041368 4732 scope.go:117] "RemoveContainer" containerID="db1b9074ef0795f714675af65f50c4b87204369e4a7ad86e330547b5ab8f05fb" Jan 31 09:18:30 crc kubenswrapper[4732]: I0131 09:18:30.099018 4732 scope.go:117] "RemoveContainer" containerID="7f3c24b1866708a685d278257ae506a6fd79ff0289067bb9f41d1ad991336cc6" Jan 31 09:18:30 crc kubenswrapper[4732]: I0131 09:18:30.156849 4732 scope.go:117] "RemoveContainer" containerID="5e6d65fa1c26ce7ebf7ff25c277b074ca08845044a94dbdd6b32ff300ac68b7e" Jan 31 09:18:30 crc kubenswrapper[4732]: I0131 09:18:30.189856 4732 scope.go:117] "RemoveContainer" containerID="b045d34930a290c72208028dc6d1d6b3535e4b8818d51998647df06b3f7c2bdc" Jan 31 09:18:30 crc kubenswrapper[4732]: I0131 09:18:30.207793 4732 scope.go:117] "RemoveContainer" containerID="288678dcae1157f1c7ebe15c590533bc3accdd196dbe20808122f33180c2092e" Jan 31 09:18:30 crc kubenswrapper[4732]: I0131 09:18:30.222323 4732 scope.go:117] "RemoveContainer" containerID="7907616b4b6ec30fa97533949de2bee3dfcee1ddac0b222cb83f6c75835a1755" Jan 31 09:18:30 crc kubenswrapper[4732]: I0131 09:18:30.237209 4732 scope.go:117] "RemoveContainer" containerID="20b5017b8e043011a79ad2661ea73a50a94d52bc115728a4ab154b985c7430df" Jan 31 09:18:30 crc kubenswrapper[4732]: I0131 09:18:30.254535 4732 scope.go:117] "RemoveContainer" containerID="73d8c40ae242fa0366cc532a47eefd15e47b7f5aeeb79eb1c8113e496beaf6e2" Jan 31 09:18:30 crc kubenswrapper[4732]: I0131 09:18:30.270147 4732 scope.go:117] "RemoveContainer" containerID="2ab9eec16ab6fbc15e432a8ef644dbb519f0a4dd256590fb6f9676ffceba4611" Jan 31 09:18:30 crc kubenswrapper[4732]: I0131 09:18:30.283625 4732 scope.go:117] "RemoveContainer" containerID="6a55e95bf79b05700a43bb7800d35ec7f862e81feb1caac0721521392f5f8e7f" Jan 31 09:18:30 crc kubenswrapper[4732]: I0131 09:18:30.300628 4732 scope.go:117] "RemoveContainer" containerID="7f58b9ade96eae3a4c9b405f187c769232309e0c1fb241bf5444c4c304ab649f" Jan 31 09:18:30 crc kubenswrapper[4732]: I0131 09:18:30.316891 4732 scope.go:117] "RemoveContainer" containerID="dbfd2c11334695f732524b533be449dd61ebf7d7637f32b3bcfe1d1941b37863" Jan 31 09:18:30 crc kubenswrapper[4732]: I0131 09:18:30.346271 4732 scope.go:117] "RemoveContainer" containerID="b63fa3de54dc547aaba5e3c717d85f707b81c7144e50e21b9e827db634951724" Jan 31 09:18:30 crc kubenswrapper[4732]: I0131 09:18:30.362798 4732 scope.go:117] "RemoveContainer" containerID="61cb3b1d16a1d1d363ee3738835e89a0f2f6d3986bf10918c00e484f7f8c9ef7" Jan 31 09:18:30 crc kubenswrapper[4732]: I0131 09:18:30.555605 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20196d3e-600c-4a25-97ef-86f81bfae43b" path="/var/lib/kubelet/pods/20196d3e-600c-4a25-97ef-86f81bfae43b/volumes" Jan 31 09:18:30 crc kubenswrapper[4732]: I0131 09:18:30.558201 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="410ee08c-4c6c-4012-aa46-264179923617" path="/var/lib/kubelet/pods/410ee08c-4c6c-4012-aa46-264179923617/volumes" Jan 31 09:18:30 crc 
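
After the last container record is removed, kubelet_volumes.go deletes the now-empty /var/lib/kubelet/pods/<uid>/volumes directories, which is the "Cleaned up orphaned pod volumes dir" line for each of the three pod UIDs. A rough diagnostic sketch for spotting such leftovers, assuming it runs on the node itself and has list permission; the kubelet's real check also accounts for mirror pods and still-mounted volumes:

    package orphancheck

    import (
        "context"
        "os"
        "path/filepath"

        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/client-go/kubernetes"
    )

    // leftoverPodDirs returns entries under the kubelet pods root whose
    // UID no longer corresponds to any pod the API server knows about,
    // i.e. the dirs kubelet_volumes.go would eventually call orphaned.
    func leftoverPodDirs(ctx context.Context, cs kubernetes.Interface, root string) ([]string, error) {
        pods, err := cs.CoreV1().Pods("").List(ctx, metav1.ListOptions{})
        if err != nil {
            return nil, err
        }
        live := make(map[string]bool, len(pods.Items))
        for _, p := range pods.Items {
            live[string(p.UID)] = true
        }
        entries, err := os.ReadDir(root) // typically /var/lib/kubelet/pods
        if err != nil {
            return nil, err
        }
        var orphans []string
        for _, e := range entries {
            if e.IsDir() && !live[e.Name()] {
                orphans = append(orphans, filepath.Join(root, e.Name()))
            }
        }
        return orphans, nil
    }
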
kubenswrapper[4732]: I0131 09:18:30.560909 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="485e2c17-77f1-4b13-ad2a-1afe1034b82e" path="/var/lib/kubelet/pods/485e2c17-77f1-4b13-ad2a-1afe1034b82e/volumes" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.456018 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-storage-0"] Jan 31 09:18:32 crc kubenswrapper[4732]: E0131 09:18:32.456632 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="485e2c17-77f1-4b13-ad2a-1afe1034b82e" containerName="account-replicator" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.456644 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="485e2c17-77f1-4b13-ad2a-1afe1034b82e" containerName="account-replicator" Jan 31 09:18:32 crc kubenswrapper[4732]: E0131 09:18:32.456653 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="485e2c17-77f1-4b13-ad2a-1afe1034b82e" containerName="object-updater" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.456659 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="485e2c17-77f1-4b13-ad2a-1afe1034b82e" containerName="object-updater" Jan 31 09:18:32 crc kubenswrapper[4732]: E0131 09:18:32.456684 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="410ee08c-4c6c-4012-aa46-264179923617" containerName="object-auditor" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.456691 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="410ee08c-4c6c-4012-aa46-264179923617" containerName="object-auditor" Jan 31 09:18:32 crc kubenswrapper[4732]: E0131 09:18:32.456699 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="485e2c17-77f1-4b13-ad2a-1afe1034b82e" containerName="account-auditor" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.456704 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="485e2c17-77f1-4b13-ad2a-1afe1034b82e" containerName="account-auditor" Jan 31 09:18:32 crc kubenswrapper[4732]: E0131 09:18:32.456712 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20196d3e-600c-4a25-97ef-86f81bfae43b" containerName="container-auditor" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.456717 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="20196d3e-600c-4a25-97ef-86f81bfae43b" containerName="container-auditor" Jan 31 09:18:32 crc kubenswrapper[4732]: E0131 09:18:32.456725 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="485e2c17-77f1-4b13-ad2a-1afe1034b82e" containerName="swift-recon-cron" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.456731 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="485e2c17-77f1-4b13-ad2a-1afe1034b82e" containerName="swift-recon-cron" Jan 31 09:18:32 crc kubenswrapper[4732]: E0131 09:18:32.456745 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="410ee08c-4c6c-4012-aa46-264179923617" containerName="swift-recon-cron" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.456751 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="410ee08c-4c6c-4012-aa46-264179923617" containerName="swift-recon-cron" Jan 31 09:18:32 crc kubenswrapper[4732]: E0131 09:18:32.456762 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20196d3e-600c-4a25-97ef-86f81bfae43b" containerName="container-replicator" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.456768 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="20196d3e-600c-4a25-97ef-86f81bfae43b" 
containerName="container-replicator" Jan 31 09:18:32 crc kubenswrapper[4732]: E0131 09:18:32.456778 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="410ee08c-4c6c-4012-aa46-264179923617" containerName="container-replicator" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.456784 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="410ee08c-4c6c-4012-aa46-264179923617" containerName="container-replicator" Jan 31 09:18:32 crc kubenswrapper[4732]: E0131 09:18:32.456795 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="410ee08c-4c6c-4012-aa46-264179923617" containerName="account-server" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.456800 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="410ee08c-4c6c-4012-aa46-264179923617" containerName="account-server" Jan 31 09:18:32 crc kubenswrapper[4732]: E0131 09:18:32.456820 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="485e2c17-77f1-4b13-ad2a-1afe1034b82e" containerName="object-replicator" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.456825 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="485e2c17-77f1-4b13-ad2a-1afe1034b82e" containerName="object-replicator" Jan 31 09:18:32 crc kubenswrapper[4732]: E0131 09:18:32.456835 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="485e2c17-77f1-4b13-ad2a-1afe1034b82e" containerName="account-reaper" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.456841 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="485e2c17-77f1-4b13-ad2a-1afe1034b82e" containerName="account-reaper" Jan 31 09:18:32 crc kubenswrapper[4732]: E0131 09:18:32.456848 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20196d3e-600c-4a25-97ef-86f81bfae43b" containerName="account-server" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.456853 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="20196d3e-600c-4a25-97ef-86f81bfae43b" containerName="account-server" Jan 31 09:18:32 crc kubenswrapper[4732]: E0131 09:18:32.456868 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="485e2c17-77f1-4b13-ad2a-1afe1034b82e" containerName="rsync" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.456874 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="485e2c17-77f1-4b13-ad2a-1afe1034b82e" containerName="rsync" Jan 31 09:18:32 crc kubenswrapper[4732]: E0131 09:18:32.456880 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="485e2c17-77f1-4b13-ad2a-1afe1034b82e" containerName="container-server" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.456885 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="485e2c17-77f1-4b13-ad2a-1afe1034b82e" containerName="container-server" Jan 31 09:18:32 crc kubenswrapper[4732]: E0131 09:18:32.456892 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20196d3e-600c-4a25-97ef-86f81bfae43b" containerName="container-updater" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.456897 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="20196d3e-600c-4a25-97ef-86f81bfae43b" containerName="container-updater" Jan 31 09:18:32 crc kubenswrapper[4732]: E0131 09:18:32.456908 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20196d3e-600c-4a25-97ef-86f81bfae43b" containerName="object-updater" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.456914 4732 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="20196d3e-600c-4a25-97ef-86f81bfae43b" containerName="object-updater" Jan 31 09:18:32 crc kubenswrapper[4732]: E0131 09:18:32.456925 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e00e76d-2f89-454c-be3b-855e8186c78e" containerName="proxy-server" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.456931 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e00e76d-2f89-454c-be3b-855e8186c78e" containerName="proxy-server" Jan 31 09:18:32 crc kubenswrapper[4732]: E0131 09:18:32.456939 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="410ee08c-4c6c-4012-aa46-264179923617" containerName="container-auditor" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.456945 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="410ee08c-4c6c-4012-aa46-264179923617" containerName="container-auditor" Jan 31 09:18:32 crc kubenswrapper[4732]: E0131 09:18:32.456953 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="410ee08c-4c6c-4012-aa46-264179923617" containerName="object-expirer" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.456958 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="410ee08c-4c6c-4012-aa46-264179923617" containerName="object-expirer" Jan 31 09:18:32 crc kubenswrapper[4732]: E0131 09:18:32.456968 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="485e2c17-77f1-4b13-ad2a-1afe1034b82e" containerName="object-expirer" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.456973 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="485e2c17-77f1-4b13-ad2a-1afe1034b82e" containerName="object-expirer" Jan 31 09:18:32 crc kubenswrapper[4732]: E0131 09:18:32.456983 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20196d3e-600c-4a25-97ef-86f81bfae43b" containerName="object-server" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.456989 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="20196d3e-600c-4a25-97ef-86f81bfae43b" containerName="object-server" Jan 31 09:18:32 crc kubenswrapper[4732]: E0131 09:18:32.456997 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="410ee08c-4c6c-4012-aa46-264179923617" containerName="container-updater" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.457002 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="410ee08c-4c6c-4012-aa46-264179923617" containerName="container-updater" Jan 31 09:18:32 crc kubenswrapper[4732]: E0131 09:18:32.457011 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3323fc69-96ba-4767-aca9-a094ee4511fa" containerName="swift-ring-rebalance" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.457017 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="3323fc69-96ba-4767-aca9-a094ee4511fa" containerName="swift-ring-rebalance" Jan 31 09:18:32 crc kubenswrapper[4732]: E0131 09:18:32.457027 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="410ee08c-4c6c-4012-aa46-264179923617" containerName="account-auditor" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.457032 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="410ee08c-4c6c-4012-aa46-264179923617" containerName="account-auditor" Jan 31 09:18:32 crc kubenswrapper[4732]: E0131 09:18:32.457040 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20196d3e-600c-4a25-97ef-86f81bfae43b" containerName="account-reaper" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.457047 4732 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="20196d3e-600c-4a25-97ef-86f81bfae43b" containerName="account-reaper" Jan 31 09:18:32 crc kubenswrapper[4732]: E0131 09:18:32.457055 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="485e2c17-77f1-4b13-ad2a-1afe1034b82e" containerName="object-server" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.457060 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="485e2c17-77f1-4b13-ad2a-1afe1034b82e" containerName="object-server" Jan 31 09:18:32 crc kubenswrapper[4732]: E0131 09:18:32.457070 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="410ee08c-4c6c-4012-aa46-264179923617" containerName="account-replicator" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.457075 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="410ee08c-4c6c-4012-aa46-264179923617" containerName="account-replicator" Jan 31 09:18:32 crc kubenswrapper[4732]: E0131 09:18:32.457084 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20196d3e-600c-4a25-97ef-86f81bfae43b" containerName="account-replicator" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.457090 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="20196d3e-600c-4a25-97ef-86f81bfae43b" containerName="account-replicator" Jan 31 09:18:32 crc kubenswrapper[4732]: E0131 09:18:32.457100 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="485e2c17-77f1-4b13-ad2a-1afe1034b82e" containerName="container-auditor" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.457106 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="485e2c17-77f1-4b13-ad2a-1afe1034b82e" containerName="container-auditor" Jan 31 09:18:32 crc kubenswrapper[4732]: E0131 09:18:32.457113 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20196d3e-600c-4a25-97ef-86f81bfae43b" containerName="object-expirer" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.457118 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="20196d3e-600c-4a25-97ef-86f81bfae43b" containerName="object-expirer" Jan 31 09:18:32 crc kubenswrapper[4732]: E0131 09:18:32.457124 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20196d3e-600c-4a25-97ef-86f81bfae43b" containerName="account-auditor" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.457130 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="20196d3e-600c-4a25-97ef-86f81bfae43b" containerName="account-auditor" Jan 31 09:18:32 crc kubenswrapper[4732]: E0131 09:18:32.457137 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e00e76d-2f89-454c-be3b-855e8186c78e" containerName="proxy-httpd" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.457143 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e00e76d-2f89-454c-be3b-855e8186c78e" containerName="proxy-httpd" Jan 31 09:18:32 crc kubenswrapper[4732]: E0131 09:18:32.457152 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="410ee08c-4c6c-4012-aa46-264179923617" containerName="container-server" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.457158 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="410ee08c-4c6c-4012-aa46-264179923617" containerName="container-server" Jan 31 09:18:32 crc kubenswrapper[4732]: E0131 09:18:32.457165 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="410ee08c-4c6c-4012-aa46-264179923617" containerName="object-replicator" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.457171 
4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="410ee08c-4c6c-4012-aa46-264179923617" containerName="object-replicator" Jan 31 09:18:32 crc kubenswrapper[4732]: E0131 09:18:32.457179 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20196d3e-600c-4a25-97ef-86f81bfae43b" containerName="object-auditor" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.457185 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="20196d3e-600c-4a25-97ef-86f81bfae43b" containerName="object-auditor" Jan 31 09:18:32 crc kubenswrapper[4732]: E0131 09:18:32.457191 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="485e2c17-77f1-4b13-ad2a-1afe1034b82e" containerName="container-replicator" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.457197 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="485e2c17-77f1-4b13-ad2a-1afe1034b82e" containerName="container-replicator" Jan 31 09:18:32 crc kubenswrapper[4732]: E0131 09:18:32.457205 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="485e2c17-77f1-4b13-ad2a-1afe1034b82e" containerName="object-auditor" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.457211 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="485e2c17-77f1-4b13-ad2a-1afe1034b82e" containerName="object-auditor" Jan 31 09:18:32 crc kubenswrapper[4732]: E0131 09:18:32.457218 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="410ee08c-4c6c-4012-aa46-264179923617" containerName="object-updater" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.457224 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="410ee08c-4c6c-4012-aa46-264179923617" containerName="object-updater" Jan 31 09:18:32 crc kubenswrapper[4732]: E0131 09:18:32.457232 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="485e2c17-77f1-4b13-ad2a-1afe1034b82e" containerName="container-updater" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.457238 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="485e2c17-77f1-4b13-ad2a-1afe1034b82e" containerName="container-updater" Jan 31 09:18:32 crc kubenswrapper[4732]: E0131 09:18:32.457247 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20196d3e-600c-4a25-97ef-86f81bfae43b" containerName="rsync" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.457253 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="20196d3e-600c-4a25-97ef-86f81bfae43b" containerName="rsync" Jan 31 09:18:32 crc kubenswrapper[4732]: E0131 09:18:32.457263 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="410ee08c-4c6c-4012-aa46-264179923617" containerName="account-reaper" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.457269 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="410ee08c-4c6c-4012-aa46-264179923617" containerName="account-reaper" Jan 31 09:18:32 crc kubenswrapper[4732]: E0131 09:18:32.457275 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20196d3e-600c-4a25-97ef-86f81bfae43b" containerName="swift-recon-cron" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.457281 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="20196d3e-600c-4a25-97ef-86f81bfae43b" containerName="swift-recon-cron" Jan 31 09:18:32 crc kubenswrapper[4732]: E0131 09:18:32.457290 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="410ee08c-4c6c-4012-aa46-264179923617" containerName="object-server" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 
09:18:32.457295 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="410ee08c-4c6c-4012-aa46-264179923617" containerName="object-server" Jan 31 09:18:32 crc kubenswrapper[4732]: E0131 09:18:32.457305 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20196d3e-600c-4a25-97ef-86f81bfae43b" containerName="container-server" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.457310 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="20196d3e-600c-4a25-97ef-86f81bfae43b" containerName="container-server" Jan 31 09:18:32 crc kubenswrapper[4732]: E0131 09:18:32.457320 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="410ee08c-4c6c-4012-aa46-264179923617" containerName="rsync" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.457325 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="410ee08c-4c6c-4012-aa46-264179923617" containerName="rsync" Jan 31 09:18:32 crc kubenswrapper[4732]: E0131 09:18:32.457334 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="485e2c17-77f1-4b13-ad2a-1afe1034b82e" containerName="account-server" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.457340 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="485e2c17-77f1-4b13-ad2a-1afe1034b82e" containerName="account-server" Jan 31 09:18:32 crc kubenswrapper[4732]: E0131 09:18:32.457348 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20196d3e-600c-4a25-97ef-86f81bfae43b" containerName="object-replicator" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.457353 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="20196d3e-600c-4a25-97ef-86f81bfae43b" containerName="object-replicator" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.457470 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="410ee08c-4c6c-4012-aa46-264179923617" containerName="object-replicator" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.457482 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="20196d3e-600c-4a25-97ef-86f81bfae43b" containerName="object-expirer" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.457493 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="410ee08c-4c6c-4012-aa46-264179923617" containerName="object-server" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.457502 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="485e2c17-77f1-4b13-ad2a-1afe1034b82e" containerName="container-updater" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.457509 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="485e2c17-77f1-4b13-ad2a-1afe1034b82e" containerName="container-replicator" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.457517 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e00e76d-2f89-454c-be3b-855e8186c78e" containerName="proxy-httpd" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.457523 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="410ee08c-4c6c-4012-aa46-264179923617" containerName="container-updater" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.457529 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="410ee08c-4c6c-4012-aa46-264179923617" containerName="object-expirer" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.457539 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="485e2c17-77f1-4b13-ad2a-1afe1034b82e" containerName="object-auditor" Jan 31 
09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.457550 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="20196d3e-600c-4a25-97ef-86f81bfae43b" containerName="account-server" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.457560 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="485e2c17-77f1-4b13-ad2a-1afe1034b82e" containerName="object-server" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.457567 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="485e2c17-77f1-4b13-ad2a-1afe1034b82e" containerName="object-updater" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.457573 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="20196d3e-600c-4a25-97ef-86f81bfae43b" containerName="object-updater" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.457581 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e00e76d-2f89-454c-be3b-855e8186c78e" containerName="proxy-server" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.457587 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="410ee08c-4c6c-4012-aa46-264179923617" containerName="account-auditor" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.457593 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="410ee08c-4c6c-4012-aa46-264179923617" containerName="account-reaper" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.457600 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="410ee08c-4c6c-4012-aa46-264179923617" containerName="object-updater" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.457607 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="20196d3e-600c-4a25-97ef-86f81bfae43b" containerName="container-replicator" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.457615 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="20196d3e-600c-4a25-97ef-86f81bfae43b" containerName="object-auditor" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.457621 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="20196d3e-600c-4a25-97ef-86f81bfae43b" containerName="swift-recon-cron" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.457629 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="485e2c17-77f1-4b13-ad2a-1afe1034b82e" containerName="rsync" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.457636 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="20196d3e-600c-4a25-97ef-86f81bfae43b" containerName="object-replicator" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.457645 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="410ee08c-4c6c-4012-aa46-264179923617" containerName="object-auditor" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.457653 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="20196d3e-600c-4a25-97ef-86f81bfae43b" containerName="account-replicator" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.457680 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="20196d3e-600c-4a25-97ef-86f81bfae43b" containerName="object-server" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.457689 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="485e2c17-77f1-4b13-ad2a-1afe1034b82e" containerName="object-replicator" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.457696 4732 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="485e2c17-77f1-4b13-ad2a-1afe1034b82e" containerName="swift-recon-cron" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.457702 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="410ee08c-4c6c-4012-aa46-264179923617" containerName="swift-recon-cron" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.457709 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="485e2c17-77f1-4b13-ad2a-1afe1034b82e" containerName="account-reaper" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.457714 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="20196d3e-600c-4a25-97ef-86f81bfae43b" containerName="container-auditor" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.457721 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="485e2c17-77f1-4b13-ad2a-1afe1034b82e" containerName="object-expirer" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.457729 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="410ee08c-4c6c-4012-aa46-264179923617" containerName="account-server" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.457737 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="410ee08c-4c6c-4012-aa46-264179923617" containerName="rsync" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.457743 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="485e2c17-77f1-4b13-ad2a-1afe1034b82e" containerName="container-auditor" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.457752 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="485e2c17-77f1-4b13-ad2a-1afe1034b82e" containerName="account-replicator" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.457758 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="20196d3e-600c-4a25-97ef-86f81bfae43b" containerName="account-reaper" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.457767 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="20196d3e-600c-4a25-97ef-86f81bfae43b" containerName="container-updater" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.457774 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="485e2c17-77f1-4b13-ad2a-1afe1034b82e" containerName="account-auditor" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.457781 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="485e2c17-77f1-4b13-ad2a-1afe1034b82e" containerName="account-server" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.457789 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="410ee08c-4c6c-4012-aa46-264179923617" containerName="container-replicator" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.457796 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="20196d3e-600c-4a25-97ef-86f81bfae43b" containerName="container-server" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.457804 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="410ee08c-4c6c-4012-aa46-264179923617" containerName="container-auditor" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.457813 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="410ee08c-4c6c-4012-aa46-264179923617" containerName="account-replicator" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.457821 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="20196d3e-600c-4a25-97ef-86f81bfae43b" containerName="account-auditor" Jan 31 
09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.457826 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="485e2c17-77f1-4b13-ad2a-1afe1034b82e" containerName="container-server" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.457833 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="20196d3e-600c-4a25-97ef-86f81bfae43b" containerName="rsync" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.457839 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="3323fc69-96ba-4767-aca9-a094ee4511fa" containerName="swift-ring-rebalance" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.457846 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="410ee08c-4c6c-4012-aa46-264179923617" containerName="container-server" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.461542 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.463355 4732 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"swift-conf" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.463516 4732 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"swift-swift-dockercfg-tc45w" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.463619 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-storage-config-data" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.463785 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-files" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.482939 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-storage-0"] Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.557961 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"swift-storage-0\" (UID: \"1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.558034 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65-lock\") pod \"swift-storage-0\" (UID: \"1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.558073 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5fb9\" (UniqueName: \"kubernetes.io/projected/1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65-kube-api-access-b5fb9\") pod \"swift-storage-0\" (UID: \"1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.558096 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65-etc-swift\") pod \"swift-storage-0\" (UID: \"1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.558116 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: 
\"kubernetes.io/empty-dir/1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65-cache\") pod \"swift-storage-0\" (UID: \"1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.658977 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"swift-storage-0\" (UID: \"1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.659089 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65-lock\") pod \"swift-storage-0\" (UID: \"1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.659123 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5fb9\" (UniqueName: \"kubernetes.io/projected/1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65-kube-api-access-b5fb9\") pod \"swift-storage-0\" (UID: \"1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.659155 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65-etc-swift\") pod \"swift-storage-0\" (UID: \"1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.659169 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65-cache\") pod \"swift-storage-0\" (UID: \"1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.660648 4732 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"swift-storage-0\" (UID: \"1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65\") device mount path \"/mnt/openstack/pv06\"" pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:18:32 crc kubenswrapper[4732]: E0131 09:18:32.661967 4732 projected.go:288] Couldn't get configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 31 09:18:32 crc kubenswrapper[4732]: E0131 09:18:32.661991 4732 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 31 09:18:32 crc kubenswrapper[4732]: E0131 09:18:32.662040 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65-etc-swift podName:1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65 nodeName:}" failed. No retries permitted until 2026-01-31 09:18:33.162021602 +0000 UTC m=+1051.467897826 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65-etc-swift") pod "swift-storage-0" (UID: "1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65") : configmap "swift-ring-files" not found Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.662237 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65-lock\") pod \"swift-storage-0\" (UID: \"1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.662326 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65-cache\") pod \"swift-storage-0\" (UID: \"1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.679932 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5fb9\" (UniqueName: \"kubernetes.io/projected/1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65-kube-api-access-b5fb9\") pod \"swift-storage-0\" (UID: \"1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:18:32 crc kubenswrapper[4732]: I0131 09:18:32.692081 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"swift-storage-0\" (UID: \"1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:18:33 crc kubenswrapper[4732]: I0131 09:18:33.167802 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65-etc-swift\") pod \"swift-storage-0\" (UID: \"1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:18:33 crc kubenswrapper[4732]: E0131 09:18:33.168033 4732 projected.go:288] Couldn't get configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 31 09:18:33 crc kubenswrapper[4732]: E0131 09:18:33.168071 4732 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 31 09:18:33 crc kubenswrapper[4732]: E0131 09:18:33.168129 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65-etc-swift podName:1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65 nodeName:}" failed. No retries permitted until 2026-01-31 09:18:34.16811082 +0000 UTC m=+1052.473987024 (durationBeforeRetry 1s). 
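
The failure mode here is clear from projected.go: the new swift-storage-0's etc-swift is a projected volume that includes the swift-ring-files ConfigMap, which does not exist yet, so SetUp cannot assemble the volume and the pod stays in ContainerCreating. A client-go sketch of the same lookup, with a hypothetical function name and message wording:

    package ringcheck

    import (
        "context"
        "fmt"

        apierrors "k8s.io/apimachinery/pkg/api/errors"
        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/client-go/kubernetes"
    )

    // checkRingFiles reproduces the projected-volume lookup that is
    // failing above: etc-swift cannot be built until the
    // swift-ring-files ConfigMap exists in swift-kuttl-tests.
    func checkRingFiles(ctx context.Context, cs kubernetes.Interface) error {
        cm, err := cs.CoreV1().ConfigMaps("swift-kuttl-tests").Get(ctx, "swift-ring-files", metav1.GetOptions{})
        if apierrors.IsNotFound(err) {
            return fmt.Errorf("ring files not published yet, pod will stay pending: %w", err)
        }
        if err != nil {
            return err
        }
        fmt.Println("ring files present, keys:", len(cm.Data))
        return nil
    }
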
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65-etc-swift") pod "swift-storage-0" (UID: "1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65") : configmap "swift-ring-files" not found Jan 31 09:18:34 crc kubenswrapper[4732]: I0131 09:18:34.181455 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65-etc-swift\") pod \"swift-storage-0\" (UID: \"1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:18:34 crc kubenswrapper[4732]: E0131 09:18:34.181576 4732 projected.go:288] Couldn't get configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 31 09:18:34 crc kubenswrapper[4732]: E0131 09:18:34.181598 4732 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 31 09:18:34 crc kubenswrapper[4732]: E0131 09:18:34.181643 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65-etc-swift podName:1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65 nodeName:}" failed. No retries permitted until 2026-01-31 09:18:36.181627808 +0000 UTC m=+1054.487504022 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65-etc-swift") pod "swift-storage-0" (UID: "1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65") : configmap "swift-ring-files" not found Jan 31 09:18:36 crc kubenswrapper[4732]: I0131 09:18:36.210590 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65-etc-swift\") pod \"swift-storage-0\" (UID: \"1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:18:36 crc kubenswrapper[4732]: E0131 09:18:36.210895 4732 projected.go:288] Couldn't get configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 31 09:18:36 crc kubenswrapper[4732]: E0131 09:18:36.211195 4732 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 31 09:18:36 crc kubenswrapper[4732]: E0131 09:18:36.211266 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65-etc-swift podName:1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65 nodeName:}" failed. No retries permitted until 2026-01-31 09:18:40.211242953 +0000 UTC m=+1058.517119197 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65-etc-swift") pod "swift-storage-0" (UID: "1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65") : configmap "swift-ring-files" not found Jan 31 09:18:36 crc kubenswrapper[4732]: I0131 09:18:36.346561 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-nc962"] Jan 31 09:18:36 crc kubenswrapper[4732]: I0131 09:18:36.347563 4732 util.go:30] "No sandbox for pod can be found. 
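
Note the durationBeforeRetry progression across these four attempts: 500ms, 1s, 2s, 4s, doubling on every failure (the kubelet caps this backoff eventually; the cap is not reached in this excerpt). The swift-ring-rebalance-nc962 job ADDed at 09:18:36 is what should publish the missing ring files and let a later retry succeed. A sketch of the schedule:

    package backoff

    import "time"

    // schedule reproduces the durationBeforeRetry progression visible
    // above: the delay starts at initial and doubles per failed attempt,
    // clamped at max.
    func schedule(attempts int, initial, max time.Duration) []time.Duration {
        out := make([]time.Duration, 0, attempts)
        d := initial
        for i := 0; i < attempts; i++ {
            out = append(out, d)
            d *= 2
            if d > max {
                d = max
            }
        }
        return out
    }

    // schedule(4, 500*time.Millisecond, 2*time.Minute)
    // -> [500ms 1s 2s 4s], matching the four SetUp failures above.
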
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-nc962" Jan 31 09:18:36 crc kubenswrapper[4732]: I0131 09:18:36.350914 4732 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"swift-proxy-config-data" Jan 31 09:18:36 crc kubenswrapper[4732]: I0131 09:18:36.351183 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Jan 31 09:18:36 crc kubenswrapper[4732]: I0131 09:18:36.353537 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Jan 31 09:18:36 crc kubenswrapper[4732]: I0131 09:18:36.357940 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-nc962"] Jan 31 09:18:36 crc kubenswrapper[4732]: I0131 09:18:36.413657 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/69e75a6a-a8c3-4ea1-b9a9-418752e2fc72-etc-swift\") pod \"swift-ring-rebalance-nc962\" (UID: \"69e75a6a-a8c3-4ea1-b9a9-418752e2fc72\") " pod="swift-kuttl-tests/swift-ring-rebalance-nc962" Jan 31 09:18:36 crc kubenswrapper[4732]: I0131 09:18:36.413740 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/69e75a6a-a8c3-4ea1-b9a9-418752e2fc72-ring-data-devices\") pod \"swift-ring-rebalance-nc962\" (UID: \"69e75a6a-a8c3-4ea1-b9a9-418752e2fc72\") " pod="swift-kuttl-tests/swift-ring-rebalance-nc962" Jan 31 09:18:36 crc kubenswrapper[4732]: I0131 09:18:36.413779 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/69e75a6a-a8c3-4ea1-b9a9-418752e2fc72-swiftconf\") pod \"swift-ring-rebalance-nc962\" (UID: \"69e75a6a-a8c3-4ea1-b9a9-418752e2fc72\") " pod="swift-kuttl-tests/swift-ring-rebalance-nc962" Jan 31 09:18:36 crc kubenswrapper[4732]: I0131 09:18:36.413830 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9fnr\" (UniqueName: \"kubernetes.io/projected/69e75a6a-a8c3-4ea1-b9a9-418752e2fc72-kube-api-access-w9fnr\") pod \"swift-ring-rebalance-nc962\" (UID: \"69e75a6a-a8c3-4ea1-b9a9-418752e2fc72\") " pod="swift-kuttl-tests/swift-ring-rebalance-nc962" Jan 31 09:18:36 crc kubenswrapper[4732]: I0131 09:18:36.413876 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/69e75a6a-a8c3-4ea1-b9a9-418752e2fc72-dispersionconf\") pod \"swift-ring-rebalance-nc962\" (UID: \"69e75a6a-a8c3-4ea1-b9a9-418752e2fc72\") " pod="swift-kuttl-tests/swift-ring-rebalance-nc962" Jan 31 09:18:36 crc kubenswrapper[4732]: I0131 09:18:36.413923 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/69e75a6a-a8c3-4ea1-b9a9-418752e2fc72-scripts\") pod \"swift-ring-rebalance-nc962\" (UID: \"69e75a6a-a8c3-4ea1-b9a9-418752e2fc72\") " pod="swift-kuttl-tests/swift-ring-rebalance-nc962" Jan 31 09:18:36 crc kubenswrapper[4732]: I0131 09:18:36.516009 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/69e75a6a-a8c3-4ea1-b9a9-418752e2fc72-dispersionconf\") pod \"swift-ring-rebalance-nc962\" (UID: 
\"69e75a6a-a8c3-4ea1-b9a9-418752e2fc72\") " pod="swift-kuttl-tests/swift-ring-rebalance-nc962" Jan 31 09:18:36 crc kubenswrapper[4732]: I0131 09:18:36.516119 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/69e75a6a-a8c3-4ea1-b9a9-418752e2fc72-scripts\") pod \"swift-ring-rebalance-nc962\" (UID: \"69e75a6a-a8c3-4ea1-b9a9-418752e2fc72\") " pod="swift-kuttl-tests/swift-ring-rebalance-nc962" Jan 31 09:18:36 crc kubenswrapper[4732]: I0131 09:18:36.516202 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/69e75a6a-a8c3-4ea1-b9a9-418752e2fc72-etc-swift\") pod \"swift-ring-rebalance-nc962\" (UID: \"69e75a6a-a8c3-4ea1-b9a9-418752e2fc72\") " pod="swift-kuttl-tests/swift-ring-rebalance-nc962" Jan 31 09:18:36 crc kubenswrapper[4732]: I0131 09:18:36.516260 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/69e75a6a-a8c3-4ea1-b9a9-418752e2fc72-ring-data-devices\") pod \"swift-ring-rebalance-nc962\" (UID: \"69e75a6a-a8c3-4ea1-b9a9-418752e2fc72\") " pod="swift-kuttl-tests/swift-ring-rebalance-nc962" Jan 31 09:18:36 crc kubenswrapper[4732]: I0131 09:18:36.516328 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/69e75a6a-a8c3-4ea1-b9a9-418752e2fc72-swiftconf\") pod \"swift-ring-rebalance-nc962\" (UID: \"69e75a6a-a8c3-4ea1-b9a9-418752e2fc72\") " pod="swift-kuttl-tests/swift-ring-rebalance-nc962" Jan 31 09:18:36 crc kubenswrapper[4732]: I0131 09:18:36.516411 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9fnr\" (UniqueName: \"kubernetes.io/projected/69e75a6a-a8c3-4ea1-b9a9-418752e2fc72-kube-api-access-w9fnr\") pod \"swift-ring-rebalance-nc962\" (UID: \"69e75a6a-a8c3-4ea1-b9a9-418752e2fc72\") " pod="swift-kuttl-tests/swift-ring-rebalance-nc962" Jan 31 09:18:36 crc kubenswrapper[4732]: I0131 09:18:36.517443 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/69e75a6a-a8c3-4ea1-b9a9-418752e2fc72-etc-swift\") pod \"swift-ring-rebalance-nc962\" (UID: \"69e75a6a-a8c3-4ea1-b9a9-418752e2fc72\") " pod="swift-kuttl-tests/swift-ring-rebalance-nc962" Jan 31 09:18:36 crc kubenswrapper[4732]: I0131 09:18:36.518442 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/69e75a6a-a8c3-4ea1-b9a9-418752e2fc72-scripts\") pod \"swift-ring-rebalance-nc962\" (UID: \"69e75a6a-a8c3-4ea1-b9a9-418752e2fc72\") " pod="swift-kuttl-tests/swift-ring-rebalance-nc962" Jan 31 09:18:36 crc kubenswrapper[4732]: I0131 09:18:36.518708 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/69e75a6a-a8c3-4ea1-b9a9-418752e2fc72-ring-data-devices\") pod \"swift-ring-rebalance-nc962\" (UID: \"69e75a6a-a8c3-4ea1-b9a9-418752e2fc72\") " pod="swift-kuttl-tests/swift-ring-rebalance-nc962" Jan 31 09:18:36 crc kubenswrapper[4732]: I0131 09:18:36.526337 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/69e75a6a-a8c3-4ea1-b9a9-418752e2fc72-dispersionconf\") pod \"swift-ring-rebalance-nc962\" (UID: \"69e75a6a-a8c3-4ea1-b9a9-418752e2fc72\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-nc962" Jan 31 09:18:36 crc kubenswrapper[4732]: I0131 09:18:36.528348 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/69e75a6a-a8c3-4ea1-b9a9-418752e2fc72-swiftconf\") pod \"swift-ring-rebalance-nc962\" (UID: \"69e75a6a-a8c3-4ea1-b9a9-418752e2fc72\") " pod="swift-kuttl-tests/swift-ring-rebalance-nc962" Jan 31 09:18:36 crc kubenswrapper[4732]: I0131 09:18:36.537743 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9fnr\" (UniqueName: \"kubernetes.io/projected/69e75a6a-a8c3-4ea1-b9a9-418752e2fc72-kube-api-access-w9fnr\") pod \"swift-ring-rebalance-nc962\" (UID: \"69e75a6a-a8c3-4ea1-b9a9-418752e2fc72\") " pod="swift-kuttl-tests/swift-ring-rebalance-nc962" Jan 31 09:18:36 crc kubenswrapper[4732]: I0131 09:18:36.684983 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-nc962" Jan 31 09:18:36 crc kubenswrapper[4732]: I0131 09:18:36.934023 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-nc962"] Jan 31 09:18:37 crc kubenswrapper[4732]: I0131 09:18:37.585595 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-nc962" event={"ID":"69e75a6a-a8c3-4ea1-b9a9-418752e2fc72","Type":"ContainerStarted","Data":"4202ea6b721143639a1a8ee464d8d295428596e358fa14506e850632ab62de19"} Jan 31 09:18:37 crc kubenswrapper[4732]: I0131 09:18:37.586037 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-nc962" event={"ID":"69e75a6a-a8c3-4ea1-b9a9-418752e2fc72","Type":"ContainerStarted","Data":"dd29dd54690710629e25eeaca76d8853c1180c1e25413daf08017e46075f113c"} Jan 31 09:18:37 crc kubenswrapper[4732]: I0131 09:18:37.621411 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-nc962" podStartSLOduration=1.6213897720000001 podStartE2EDuration="1.621389772s" podCreationTimestamp="2026-01-31 09:18:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:18:37.610430711 +0000 UTC m=+1055.916306945" watchObservedRunningTime="2026-01-31 09:18:37.621389772 +0000 UTC m=+1055.927265986" Jan 31 09:18:40 crc kubenswrapper[4732]: I0131 09:18:40.273266 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65-etc-swift\") pod \"swift-storage-0\" (UID: \"1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:18:40 crc kubenswrapper[4732]: E0131 09:18:40.273550 4732 projected.go:288] Couldn't get configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 31 09:18:40 crc kubenswrapper[4732]: E0131 09:18:40.273621 4732 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 31 09:18:40 crc kubenswrapper[4732]: E0131 09:18:40.273817 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65-etc-swift podName:1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65 nodeName:}" failed. No retries permitted until 2026-01-31 09:18:48.273788917 +0000 UTC m=+1066.579665121 (durationBeforeRetry 8s). 
Jan 31 09:18:44 crc kubenswrapper[4732]: I0131 09:18:44.636596 4732 generic.go:334] "Generic (PLEG): container finished" podID="69e75a6a-a8c3-4ea1-b9a9-418752e2fc72" containerID="4202ea6b721143639a1a8ee464d8d295428596e358fa14506e850632ab62de19" exitCode=0
Jan 31 09:18:44 crc kubenswrapper[4732]: I0131 09:18:44.636698 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-nc962" event={"ID":"69e75a6a-a8c3-4ea1-b9a9-418752e2fc72","Type":"ContainerDied","Data":"4202ea6b721143639a1a8ee464d8d295428596e358fa14506e850632ab62de19"}
Jan 31 09:18:45 crc kubenswrapper[4732]: I0131 09:18:45.917061 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-nc962"
Jan 31 09:18:45 crc kubenswrapper[4732]: I0131 09:18:45.979220 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9fnr\" (UniqueName: \"kubernetes.io/projected/69e75a6a-a8c3-4ea1-b9a9-418752e2fc72-kube-api-access-w9fnr\") pod \"69e75a6a-a8c3-4ea1-b9a9-418752e2fc72\" (UID: \"69e75a6a-a8c3-4ea1-b9a9-418752e2fc72\") "
Jan 31 09:18:45 crc kubenswrapper[4732]: I0131 09:18:45.979337 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/69e75a6a-a8c3-4ea1-b9a9-418752e2fc72-scripts\") pod \"69e75a6a-a8c3-4ea1-b9a9-418752e2fc72\" (UID: \"69e75a6a-a8c3-4ea1-b9a9-418752e2fc72\") "
Jan 31 09:18:45 crc kubenswrapper[4732]: I0131 09:18:45.979390 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/69e75a6a-a8c3-4ea1-b9a9-418752e2fc72-ring-data-devices\") pod \"69e75a6a-a8c3-4ea1-b9a9-418752e2fc72\" (UID: \"69e75a6a-a8c3-4ea1-b9a9-418752e2fc72\") "
Jan 31 09:18:45 crc kubenswrapper[4732]: I0131 09:18:45.979454 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/69e75a6a-a8c3-4ea1-b9a9-418752e2fc72-dispersionconf\") pod \"69e75a6a-a8c3-4ea1-b9a9-418752e2fc72\" (UID: \"69e75a6a-a8c3-4ea1-b9a9-418752e2fc72\") "
Jan 31 09:18:45 crc kubenswrapper[4732]: I0131 09:18:45.979479 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/69e75a6a-a8c3-4ea1-b9a9-418752e2fc72-swiftconf\") pod \"69e75a6a-a8c3-4ea1-b9a9-418752e2fc72\" (UID: \"69e75a6a-a8c3-4ea1-b9a9-418752e2fc72\") "
Jan 31 09:18:45 crc kubenswrapper[4732]: I0131 09:18:45.979560 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/69e75a6a-a8c3-4ea1-b9a9-418752e2fc72-etc-swift\") pod \"69e75a6a-a8c3-4ea1-b9a9-418752e2fc72\" (UID: \"69e75a6a-a8c3-4ea1-b9a9-418752e2fc72\") "
Jan 31 09:18:45 crc kubenswrapper[4732]: I0131 09:18:45.980261 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69e75a6a-a8c3-4ea1-b9a9-418752e2fc72-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "69e75a6a-a8c3-4ea1-b9a9-418752e2fc72" (UID: "69e75a6a-a8c3-4ea1-b9a9-418752e2fc72"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 31 09:18:45 crc kubenswrapper[4732]: I0131 09:18:45.980957 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69e75a6a-a8c3-4ea1-b9a9-418752e2fc72-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "69e75a6a-a8c3-4ea1-b9a9-418752e2fc72" (UID: "69e75a6a-a8c3-4ea1-b9a9-418752e2fc72"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 31 09:18:45 crc kubenswrapper[4732]: I0131 09:18:45.987869 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69e75a6a-a8c3-4ea1-b9a9-418752e2fc72-kube-api-access-w9fnr" (OuterVolumeSpecName: "kube-api-access-w9fnr") pod "69e75a6a-a8c3-4ea1-b9a9-418752e2fc72" (UID: "69e75a6a-a8c3-4ea1-b9a9-418752e2fc72"). InnerVolumeSpecName "kube-api-access-w9fnr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 09:18:45 crc kubenswrapper[4732]: I0131 09:18:45.991816 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69e75a6a-a8c3-4ea1-b9a9-418752e2fc72-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "69e75a6a-a8c3-4ea1-b9a9-418752e2fc72" (UID: "69e75a6a-a8c3-4ea1-b9a9-418752e2fc72"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 09:18:45 crc kubenswrapper[4732]: I0131 09:18:45.999596 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69e75a6a-a8c3-4ea1-b9a9-418752e2fc72-scripts" (OuterVolumeSpecName: "scripts") pod "69e75a6a-a8c3-4ea1-b9a9-418752e2fc72" (UID: "69e75a6a-a8c3-4ea1-b9a9-418752e2fc72"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 31 09:18:46 crc kubenswrapper[4732]: I0131 09:18:46.000751 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69e75a6a-a8c3-4ea1-b9a9-418752e2fc72-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "69e75a6a-a8c3-4ea1-b9a9-418752e2fc72" (UID: "69e75a6a-a8c3-4ea1-b9a9-418752e2fc72"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 09:18:46 crc kubenswrapper[4732]: I0131 09:18:46.081002 4732 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/69e75a6a-a8c3-4ea1-b9a9-418752e2fc72-swiftconf\") on node \"crc\" DevicePath \"\""
Jan 31 09:18:46 crc kubenswrapper[4732]: I0131 09:18:46.081036 4732 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/69e75a6a-a8c3-4ea1-b9a9-418752e2fc72-etc-swift\") on node \"crc\" DevicePath \"\""
Jan 31 09:18:46 crc kubenswrapper[4732]: I0131 09:18:46.081046 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9fnr\" (UniqueName: \"kubernetes.io/projected/69e75a6a-a8c3-4ea1-b9a9-418752e2fc72-kube-api-access-w9fnr\") on node \"crc\" DevicePath \"\""
Jan 31 09:18:46 crc kubenswrapper[4732]: I0131 09:18:46.081055 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/69e75a6a-a8c3-4ea1-b9a9-418752e2fc72-scripts\") on node \"crc\" DevicePath \"\""
Jan 31 09:18:46 crc kubenswrapper[4732]: I0131 09:18:46.081064 4732 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/69e75a6a-a8c3-4ea1-b9a9-418752e2fc72-ring-data-devices\") on node \"crc\" DevicePath \"\""
Jan 31 09:18:46 crc kubenswrapper[4732]: I0131 09:18:46.081072 4732 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/69e75a6a-a8c3-4ea1-b9a9-418752e2fc72-dispersionconf\") on node \"crc\" DevicePath \"\""
Jan 31 09:18:46 crc kubenswrapper[4732]: I0131 09:18:46.653857 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-nc962" event={"ID":"69e75a6a-a8c3-4ea1-b9a9-418752e2fc72","Type":"ContainerDied","Data":"dd29dd54690710629e25eeaca76d8853c1180c1e25413daf08017e46075f113c"}
Jan 31 09:18:46 crc kubenswrapper[4732]: I0131 09:18:46.653906 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd29dd54690710629e25eeaca76d8853c1180c1e25413daf08017e46075f113c"
Jan 31 09:18:46 crc kubenswrapper[4732]: I0131 09:18:46.653972 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-nc962"
Jan 31 09:18:48 crc kubenswrapper[4732]: I0131 09:18:48.316938 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65-etc-swift\") pod \"swift-storage-0\" (UID: \"1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65\") " pod="swift-kuttl-tests/swift-storage-0"
Jan 31 09:18:48 crc kubenswrapper[4732]: I0131 09:18:48.326537 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65-etc-swift\") pod \"swift-storage-0\" (UID: \"1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65\") " pod="swift-kuttl-tests/swift-storage-0"
Jan 31 09:18:48 crc kubenswrapper[4732]: I0131 09:18:48.380495 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-storage-0"
Need to start a new one" pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:18:48 crc kubenswrapper[4732]: I0131 09:18:48.886803 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-storage-0"] Jan 31 09:18:49 crc kubenswrapper[4732]: I0131 09:18:49.684938 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65","Type":"ContainerStarted","Data":"a768c10034a64a5a20c9c6d0006c491e9b64bcedd25898cbbfa5a7d80612ad9a"} Jan 31 09:18:49 crc kubenswrapper[4732]: I0131 09:18:49.685186 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65","Type":"ContainerStarted","Data":"3c0e4beb7194a2ba16fa5a8867712d4fc5df7cc2c8e674527846af12764e0327"} Jan 31 09:18:49 crc kubenswrapper[4732]: I0131 09:18:49.685199 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65","Type":"ContainerStarted","Data":"14b965bc1ec24ff789b68d1102389631b9549153560571ce9d36806cdfc47521"} Jan 31 09:18:49 crc kubenswrapper[4732]: I0131 09:18:49.685209 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65","Type":"ContainerStarted","Data":"59ef299c22aaa5865a8f99313b8748ea14863e08f6ff3878d52b70ef3516ec1f"} Jan 31 09:18:49 crc kubenswrapper[4732]: I0131 09:18:49.685227 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65","Type":"ContainerStarted","Data":"76755a52198a018f0a2d0f0ab371558b237a7993168d70acf02d525f01693f21"} Jan 31 09:18:49 crc kubenswrapper[4732]: I0131 09:18:49.685235 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65","Type":"ContainerStarted","Data":"34458a4f4a5d5e0c1ee442a308b559938e3621f6dae5595249f026b6e797962d"} Jan 31 09:18:50 crc kubenswrapper[4732]: I0131 09:18:50.703675 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65","Type":"ContainerStarted","Data":"dd273c25d7dc91e5954ff8d5fbc44b6eeefa5af49ff09461c024f0d29819d844"} Jan 31 09:18:50 crc kubenswrapper[4732]: I0131 09:18:50.704029 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65","Type":"ContainerStarted","Data":"adf3519dd1a3dc4b8b31ed35e741ea5a2ae39875837269f96a619bf6446696ad"} Jan 31 09:18:50 crc kubenswrapper[4732]: I0131 09:18:50.704914 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65","Type":"ContainerStarted","Data":"9c260d5b071de9f983a884926c74e3c7965a5566fdf129a8231af8aff9341849"} Jan 31 09:18:50 crc kubenswrapper[4732]: I0131 09:18:50.704945 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65","Type":"ContainerStarted","Data":"c4fa65606e0bdcb6f86bd4340763a51bfe2fa0b29c86e336932e2d7b4d5a7cc7"} Jan 31 09:18:50 crc kubenswrapper[4732]: I0131 09:18:50.704959 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" 
event={"ID":"1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65","Type":"ContainerStarted","Data":"31889d60dae084e50ec608b663e4d80dd770e7914146b1993679075fbcb5c050"} Jan 31 09:18:50 crc kubenswrapper[4732]: I0131 09:18:50.705217 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65","Type":"ContainerStarted","Data":"5de79c9073a05f38da4352247b5e904a357945349a71ed92906b206964dbd2ef"} Jan 31 09:18:51 crc kubenswrapper[4732]: I0131 09:18:51.716898 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65","Type":"ContainerStarted","Data":"38cdf7c7cb89df8d1c3787fb632d47d09b331d20b08aa93cb17f6a88341fb337"} Jan 31 09:18:51 crc kubenswrapper[4732]: I0131 09:18:51.717214 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65","Type":"ContainerStarted","Data":"1183d753033ef14dd779291a910acae735ec0ef3a1e7104272c1b8d040389479"} Jan 31 09:18:51 crc kubenswrapper[4732]: I0131 09:18:51.717229 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65","Type":"ContainerStarted","Data":"5521d09393a37a16e0269ff010dbf8c3432cdb69f1bca2e5b6f057535e6adaff"} Jan 31 09:18:51 crc kubenswrapper[4732]: I0131 09:18:51.717239 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65","Type":"ContainerStarted","Data":"4c4fd17a487c3ad507bed8e66df924a39a8c2f99f7e806c1497f9605883141b8"} Jan 31 09:18:51 crc kubenswrapper[4732]: I0131 09:18:51.717248 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65","Type":"ContainerStarted","Data":"4ff9ae6192a8f62a8feaec1a934822bc986fef6a2a789c88bf3fe4a983cc5f91"} Jan 31 09:18:51 crc kubenswrapper[4732]: I0131 09:18:51.755815 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-storage-0" podStartSLOduration=20.755796021 podStartE2EDuration="20.755796021s" podCreationTimestamp="2026-01-31 09:18:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:18:51.750915939 +0000 UTC m=+1070.056792143" watchObservedRunningTime="2026-01-31 09:18:51.755796021 +0000 UTC m=+1070.061672215" Jan 31 09:18:57 crc kubenswrapper[4732]: I0131 09:18:57.827057 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-proxy-59dd5dbb7c-lcxsq"] Jan 31 09:18:57 crc kubenswrapper[4732]: E0131 09:18:57.827781 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69e75a6a-a8c3-4ea1-b9a9-418752e2fc72" containerName="swift-ring-rebalance" Jan 31 09:18:57 crc kubenswrapper[4732]: I0131 09:18:57.827795 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="69e75a6a-a8c3-4ea1-b9a9-418752e2fc72" containerName="swift-ring-rebalance" Jan 31 09:18:57 crc kubenswrapper[4732]: I0131 09:18:57.827918 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="69e75a6a-a8c3-4ea1-b9a9-418752e2fc72" containerName="swift-ring-rebalance" Jan 31 09:18:57 crc kubenswrapper[4732]: I0131 09:18:57.828558 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-proxy-59dd5dbb7c-lcxsq" Jan 31 09:18:57 crc kubenswrapper[4732]: I0131 09:18:57.831099 4732 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"swift-proxy-config-data" Jan 31 09:18:57 crc kubenswrapper[4732]: I0131 09:18:57.856925 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-proxy-59dd5dbb7c-lcxsq"] Jan 31 09:18:57 crc kubenswrapper[4732]: I0131 09:18:57.971112 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4pgj\" (UniqueName: \"kubernetes.io/projected/cea16bcf-b0b0-4d2a-ad37-ee127b9a1fd9-kube-api-access-c4pgj\") pod \"swift-proxy-59dd5dbb7c-lcxsq\" (UID: \"cea16bcf-b0b0-4d2a-ad37-ee127b9a1fd9\") " pod="swift-kuttl-tests/swift-proxy-59dd5dbb7c-lcxsq" Jan 31 09:18:57 crc kubenswrapper[4732]: I0131 09:18:57.971211 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cea16bcf-b0b0-4d2a-ad37-ee127b9a1fd9-log-httpd\") pod \"swift-proxy-59dd5dbb7c-lcxsq\" (UID: \"cea16bcf-b0b0-4d2a-ad37-ee127b9a1fd9\") " pod="swift-kuttl-tests/swift-proxy-59dd5dbb7c-lcxsq" Jan 31 09:18:57 crc kubenswrapper[4732]: I0131 09:18:57.971398 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cea16bcf-b0b0-4d2a-ad37-ee127b9a1fd9-run-httpd\") pod \"swift-proxy-59dd5dbb7c-lcxsq\" (UID: \"cea16bcf-b0b0-4d2a-ad37-ee127b9a1fd9\") " pod="swift-kuttl-tests/swift-proxy-59dd5dbb7c-lcxsq" Jan 31 09:18:57 crc kubenswrapper[4732]: I0131 09:18:57.971463 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cea16bcf-b0b0-4d2a-ad37-ee127b9a1fd9-config-data\") pod \"swift-proxy-59dd5dbb7c-lcxsq\" (UID: \"cea16bcf-b0b0-4d2a-ad37-ee127b9a1fd9\") " pod="swift-kuttl-tests/swift-proxy-59dd5dbb7c-lcxsq" Jan 31 09:18:57 crc kubenswrapper[4732]: I0131 09:18:57.971519 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/cea16bcf-b0b0-4d2a-ad37-ee127b9a1fd9-etc-swift\") pod \"swift-proxy-59dd5dbb7c-lcxsq\" (UID: \"cea16bcf-b0b0-4d2a-ad37-ee127b9a1fd9\") " pod="swift-kuttl-tests/swift-proxy-59dd5dbb7c-lcxsq" Jan 31 09:18:58 crc kubenswrapper[4732]: I0131 09:18:58.072828 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cea16bcf-b0b0-4d2a-ad37-ee127b9a1fd9-config-data\") pod \"swift-proxy-59dd5dbb7c-lcxsq\" (UID: \"cea16bcf-b0b0-4d2a-ad37-ee127b9a1fd9\") " pod="swift-kuttl-tests/swift-proxy-59dd5dbb7c-lcxsq" Jan 31 09:18:58 crc kubenswrapper[4732]: I0131 09:18:58.072917 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/cea16bcf-b0b0-4d2a-ad37-ee127b9a1fd9-etc-swift\") pod \"swift-proxy-59dd5dbb7c-lcxsq\" (UID: \"cea16bcf-b0b0-4d2a-ad37-ee127b9a1fd9\") " pod="swift-kuttl-tests/swift-proxy-59dd5dbb7c-lcxsq" Jan 31 09:18:58 crc kubenswrapper[4732]: I0131 09:18:58.072978 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4pgj\" (UniqueName: \"kubernetes.io/projected/cea16bcf-b0b0-4d2a-ad37-ee127b9a1fd9-kube-api-access-c4pgj\") pod 
\"swift-proxy-59dd5dbb7c-lcxsq\" (UID: \"cea16bcf-b0b0-4d2a-ad37-ee127b9a1fd9\") " pod="swift-kuttl-tests/swift-proxy-59dd5dbb7c-lcxsq" Jan 31 09:18:58 crc kubenswrapper[4732]: I0131 09:18:58.073008 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cea16bcf-b0b0-4d2a-ad37-ee127b9a1fd9-log-httpd\") pod \"swift-proxy-59dd5dbb7c-lcxsq\" (UID: \"cea16bcf-b0b0-4d2a-ad37-ee127b9a1fd9\") " pod="swift-kuttl-tests/swift-proxy-59dd5dbb7c-lcxsq" Jan 31 09:18:58 crc kubenswrapper[4732]: I0131 09:18:58.073089 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cea16bcf-b0b0-4d2a-ad37-ee127b9a1fd9-run-httpd\") pod \"swift-proxy-59dd5dbb7c-lcxsq\" (UID: \"cea16bcf-b0b0-4d2a-ad37-ee127b9a1fd9\") " pod="swift-kuttl-tests/swift-proxy-59dd5dbb7c-lcxsq" Jan 31 09:18:58 crc kubenswrapper[4732]: I0131 09:18:58.073742 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cea16bcf-b0b0-4d2a-ad37-ee127b9a1fd9-run-httpd\") pod \"swift-proxy-59dd5dbb7c-lcxsq\" (UID: \"cea16bcf-b0b0-4d2a-ad37-ee127b9a1fd9\") " pod="swift-kuttl-tests/swift-proxy-59dd5dbb7c-lcxsq" Jan 31 09:18:58 crc kubenswrapper[4732]: I0131 09:18:58.073832 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cea16bcf-b0b0-4d2a-ad37-ee127b9a1fd9-log-httpd\") pod \"swift-proxy-59dd5dbb7c-lcxsq\" (UID: \"cea16bcf-b0b0-4d2a-ad37-ee127b9a1fd9\") " pod="swift-kuttl-tests/swift-proxy-59dd5dbb7c-lcxsq" Jan 31 09:18:58 crc kubenswrapper[4732]: I0131 09:18:58.078615 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cea16bcf-b0b0-4d2a-ad37-ee127b9a1fd9-config-data\") pod \"swift-proxy-59dd5dbb7c-lcxsq\" (UID: \"cea16bcf-b0b0-4d2a-ad37-ee127b9a1fd9\") " pod="swift-kuttl-tests/swift-proxy-59dd5dbb7c-lcxsq" Jan 31 09:18:58 crc kubenswrapper[4732]: I0131 09:18:58.078682 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/cea16bcf-b0b0-4d2a-ad37-ee127b9a1fd9-etc-swift\") pod \"swift-proxy-59dd5dbb7c-lcxsq\" (UID: \"cea16bcf-b0b0-4d2a-ad37-ee127b9a1fd9\") " pod="swift-kuttl-tests/swift-proxy-59dd5dbb7c-lcxsq" Jan 31 09:18:58 crc kubenswrapper[4732]: I0131 09:18:58.099069 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4pgj\" (UniqueName: \"kubernetes.io/projected/cea16bcf-b0b0-4d2a-ad37-ee127b9a1fd9-kube-api-access-c4pgj\") pod \"swift-proxy-59dd5dbb7c-lcxsq\" (UID: \"cea16bcf-b0b0-4d2a-ad37-ee127b9a1fd9\") " pod="swift-kuttl-tests/swift-proxy-59dd5dbb7c-lcxsq" Jan 31 09:18:58 crc kubenswrapper[4732]: I0131 09:18:58.155277 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-proxy-59dd5dbb7c-lcxsq" Jan 31 09:18:58 crc kubenswrapper[4732]: I0131 09:18:58.567724 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-proxy-59dd5dbb7c-lcxsq"] Jan 31 09:18:58 crc kubenswrapper[4732]: I0131 09:18:58.790726 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-proxy-59dd5dbb7c-lcxsq" event={"ID":"cea16bcf-b0b0-4d2a-ad37-ee127b9a1fd9","Type":"ContainerStarted","Data":"84936fd9e60dfd8bff06c859b11fa4812f19d8ba83da60cc1b9233b2901d5e51"} Jan 31 09:18:59 crc kubenswrapper[4732]: I0131 09:18:59.802547 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-proxy-59dd5dbb7c-lcxsq" event={"ID":"cea16bcf-b0b0-4d2a-ad37-ee127b9a1fd9","Type":"ContainerStarted","Data":"14e28a091b47ebba4b423f5066db9e9d08dac2ef7f479ee58c476aa4be9c8628"} Jan 31 09:18:59 crc kubenswrapper[4732]: I0131 09:18:59.802904 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="swift-kuttl-tests/swift-proxy-59dd5dbb7c-lcxsq" Jan 31 09:18:59 crc kubenswrapper[4732]: I0131 09:18:59.802917 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-proxy-59dd5dbb7c-lcxsq" event={"ID":"cea16bcf-b0b0-4d2a-ad37-ee127b9a1fd9","Type":"ContainerStarted","Data":"7645a81d1554d39a7245fbb85e3b78b90ce8cbb1ff6a3161695b88d390dd384c"} Jan 31 09:18:59 crc kubenswrapper[4732]: I0131 09:18:59.842178 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-proxy-59dd5dbb7c-lcxsq" podStartSLOduration=2.842152673 podStartE2EDuration="2.842152673s" podCreationTimestamp="2026-01-31 09:18:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:18:59.833150183 +0000 UTC m=+1078.139026387" watchObservedRunningTime="2026-01-31 09:18:59.842152673 +0000 UTC m=+1078.148028877" Jan 31 09:19:00 crc kubenswrapper[4732]: I0131 09:19:00.809119 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="swift-kuttl-tests/swift-proxy-59dd5dbb7c-lcxsq" Jan 31 09:19:03 crc kubenswrapper[4732]: I0131 09:19:03.161890 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="swift-kuttl-tests/swift-proxy-59dd5dbb7c-lcxsq" Jan 31 09:19:08 crc kubenswrapper[4732]: I0131 09:19:08.158413 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="swift-kuttl-tests/swift-proxy-59dd5dbb7c-lcxsq" Jan 31 09:19:09 crc kubenswrapper[4732]: I0131 09:19:09.808930 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-p42tv"] Jan 31 09:19:09 crc kubenswrapper[4732]: I0131 09:19:09.830657 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-p42tv" Jan 31 09:19:09 crc kubenswrapper[4732]: I0131 09:19:09.833773 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Jan 31 09:19:09 crc kubenswrapper[4732]: I0131 09:19:09.833828 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Jan 31 09:19:09 crc kubenswrapper[4732]: I0131 09:19:09.838519 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-p42tv"] Jan 31 09:19:09 crc kubenswrapper[4732]: I0131 09:19:09.948541 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmz5c\" (UniqueName: \"kubernetes.io/projected/71a2159f-3ae8-40c1-8404-a58106192d87-kube-api-access-mmz5c\") pod \"swift-ring-rebalance-debug-p42tv\" (UID: \"71a2159f-3ae8-40c1-8404-a58106192d87\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-p42tv" Jan 31 09:19:09 crc kubenswrapper[4732]: I0131 09:19:09.948617 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/71a2159f-3ae8-40c1-8404-a58106192d87-ring-data-devices\") pod \"swift-ring-rebalance-debug-p42tv\" (UID: \"71a2159f-3ae8-40c1-8404-a58106192d87\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-p42tv" Jan 31 09:19:09 crc kubenswrapper[4732]: I0131 09:19:09.948648 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/71a2159f-3ae8-40c1-8404-a58106192d87-etc-swift\") pod \"swift-ring-rebalance-debug-p42tv\" (UID: \"71a2159f-3ae8-40c1-8404-a58106192d87\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-p42tv" Jan 31 09:19:09 crc kubenswrapper[4732]: I0131 09:19:09.948686 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/71a2159f-3ae8-40c1-8404-a58106192d87-scripts\") pod \"swift-ring-rebalance-debug-p42tv\" (UID: \"71a2159f-3ae8-40c1-8404-a58106192d87\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-p42tv" Jan 31 09:19:09 crc kubenswrapper[4732]: I0131 09:19:09.948705 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/71a2159f-3ae8-40c1-8404-a58106192d87-dispersionconf\") pod \"swift-ring-rebalance-debug-p42tv\" (UID: \"71a2159f-3ae8-40c1-8404-a58106192d87\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-p42tv" Jan 31 09:19:09 crc kubenswrapper[4732]: I0131 09:19:09.948732 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/71a2159f-3ae8-40c1-8404-a58106192d87-swiftconf\") pod \"swift-ring-rebalance-debug-p42tv\" (UID: \"71a2159f-3ae8-40c1-8404-a58106192d87\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-p42tv" Jan 31 09:19:10 crc kubenswrapper[4732]: I0131 09:19:10.050122 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmz5c\" (UniqueName: \"kubernetes.io/projected/71a2159f-3ae8-40c1-8404-a58106192d87-kube-api-access-mmz5c\") pod \"swift-ring-rebalance-debug-p42tv\" (UID: \"71a2159f-3ae8-40c1-8404-a58106192d87\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-p42tv" 
Jan 31 09:19:10 crc kubenswrapper[4732]: I0131 09:19:10.050217 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/71a2159f-3ae8-40c1-8404-a58106192d87-ring-data-devices\") pod \"swift-ring-rebalance-debug-p42tv\" (UID: \"71a2159f-3ae8-40c1-8404-a58106192d87\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-p42tv" Jan 31 09:19:10 crc kubenswrapper[4732]: I0131 09:19:10.050285 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/71a2159f-3ae8-40c1-8404-a58106192d87-etc-swift\") pod \"swift-ring-rebalance-debug-p42tv\" (UID: \"71a2159f-3ae8-40c1-8404-a58106192d87\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-p42tv" Jan 31 09:19:10 crc kubenswrapper[4732]: I0131 09:19:10.050318 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/71a2159f-3ae8-40c1-8404-a58106192d87-scripts\") pod \"swift-ring-rebalance-debug-p42tv\" (UID: \"71a2159f-3ae8-40c1-8404-a58106192d87\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-p42tv" Jan 31 09:19:10 crc kubenswrapper[4732]: I0131 09:19:10.050341 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/71a2159f-3ae8-40c1-8404-a58106192d87-dispersionconf\") pod \"swift-ring-rebalance-debug-p42tv\" (UID: \"71a2159f-3ae8-40c1-8404-a58106192d87\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-p42tv" Jan 31 09:19:10 crc kubenswrapper[4732]: I0131 09:19:10.050381 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/71a2159f-3ae8-40c1-8404-a58106192d87-swiftconf\") pod \"swift-ring-rebalance-debug-p42tv\" (UID: \"71a2159f-3ae8-40c1-8404-a58106192d87\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-p42tv" Jan 31 09:19:10 crc kubenswrapper[4732]: I0131 09:19:10.050812 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/71a2159f-3ae8-40c1-8404-a58106192d87-etc-swift\") pod \"swift-ring-rebalance-debug-p42tv\" (UID: \"71a2159f-3ae8-40c1-8404-a58106192d87\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-p42tv" Jan 31 09:19:10 crc kubenswrapper[4732]: I0131 09:19:10.051120 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/71a2159f-3ae8-40c1-8404-a58106192d87-ring-data-devices\") pod \"swift-ring-rebalance-debug-p42tv\" (UID: \"71a2159f-3ae8-40c1-8404-a58106192d87\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-p42tv" Jan 31 09:19:10 crc kubenswrapper[4732]: I0131 09:19:10.051241 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/71a2159f-3ae8-40c1-8404-a58106192d87-scripts\") pod \"swift-ring-rebalance-debug-p42tv\" (UID: \"71a2159f-3ae8-40c1-8404-a58106192d87\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-p42tv" Jan 31 09:19:10 crc kubenswrapper[4732]: I0131 09:19:10.056478 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/71a2159f-3ae8-40c1-8404-a58106192d87-dispersionconf\") pod \"swift-ring-rebalance-debug-p42tv\" (UID: \"71a2159f-3ae8-40c1-8404-a58106192d87\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-p42tv" Jan 31 09:19:10 crc kubenswrapper[4732]: I0131 09:19:10.059232 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/71a2159f-3ae8-40c1-8404-a58106192d87-swiftconf\") pod \"swift-ring-rebalance-debug-p42tv\" (UID: \"71a2159f-3ae8-40c1-8404-a58106192d87\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-p42tv" Jan 31 09:19:10 crc kubenswrapper[4732]: I0131 09:19:10.072636 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmz5c\" (UniqueName: \"kubernetes.io/projected/71a2159f-3ae8-40c1-8404-a58106192d87-kube-api-access-mmz5c\") pod \"swift-ring-rebalance-debug-p42tv\" (UID: \"71a2159f-3ae8-40c1-8404-a58106192d87\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-p42tv" Jan 31 09:19:10 crc kubenswrapper[4732]: I0131 09:19:10.198517 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-p42tv" Jan 31 09:19:10 crc kubenswrapper[4732]: I0131 09:19:10.606801 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-p42tv"] Jan 31 09:19:10 crc kubenswrapper[4732]: I0131 09:19:10.885648 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-p42tv" event={"ID":"71a2159f-3ae8-40c1-8404-a58106192d87","Type":"ContainerStarted","Data":"1fdd9ffe15b68f781c0060a8dc3d52561429e6211de1a33c677c7ae5bc16192d"} Jan 31 09:19:10 crc kubenswrapper[4732]: I0131 09:19:10.885710 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-p42tv" event={"ID":"71a2159f-3ae8-40c1-8404-a58106192d87","Type":"ContainerStarted","Data":"1a981eb19d5e56073ba342ee44e1dc63200aae3ea6637c97ff5969c7277df29e"} Jan 31 09:19:10 crc kubenswrapper[4732]: I0131 09:19:10.905602 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-p42tv" podStartSLOduration=1.905584253 podStartE2EDuration="1.905584253s" podCreationTimestamp="2026-01-31 09:19:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:19:10.897843552 +0000 UTC m=+1089.203719776" watchObservedRunningTime="2026-01-31 09:19:10.905584253 +0000 UTC m=+1089.211460477" Jan 31 09:19:12 crc kubenswrapper[4732]: I0131 09:19:12.904897 4732 generic.go:334] "Generic (PLEG): container finished" podID="71a2159f-3ae8-40c1-8404-a58106192d87" containerID="1fdd9ffe15b68f781c0060a8dc3d52561429e6211de1a33c677c7ae5bc16192d" exitCode=0 Jan 31 09:19:12 crc kubenswrapper[4732]: I0131 09:19:12.905290 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-p42tv" event={"ID":"71a2159f-3ae8-40c1-8404-a58106192d87","Type":"ContainerDied","Data":"1fdd9ffe15b68f781c0060a8dc3d52561429e6211de1a33c677c7ae5bc16192d"} Jan 31 09:19:14 crc kubenswrapper[4732]: I0131 09:19:14.208131 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-p42tv" Jan 31 09:19:14 crc kubenswrapper[4732]: I0131 09:19:14.260878 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-p42tv"] Jan 31 09:19:14 crc kubenswrapper[4732]: I0131 09:19:14.269164 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-p42tv"] Jan 31 09:19:14 crc kubenswrapper[4732]: I0131 09:19:14.311603 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/71a2159f-3ae8-40c1-8404-a58106192d87-etc-swift\") pod \"71a2159f-3ae8-40c1-8404-a58106192d87\" (UID: \"71a2159f-3ae8-40c1-8404-a58106192d87\") " Jan 31 09:19:14 crc kubenswrapper[4732]: I0131 09:19:14.311678 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mmz5c\" (UniqueName: \"kubernetes.io/projected/71a2159f-3ae8-40c1-8404-a58106192d87-kube-api-access-mmz5c\") pod \"71a2159f-3ae8-40c1-8404-a58106192d87\" (UID: \"71a2159f-3ae8-40c1-8404-a58106192d87\") " Jan 31 09:19:14 crc kubenswrapper[4732]: I0131 09:19:14.311702 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/71a2159f-3ae8-40c1-8404-a58106192d87-dispersionconf\") pod \"71a2159f-3ae8-40c1-8404-a58106192d87\" (UID: \"71a2159f-3ae8-40c1-8404-a58106192d87\") " Jan 31 09:19:14 crc kubenswrapper[4732]: I0131 09:19:14.311730 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/71a2159f-3ae8-40c1-8404-a58106192d87-scripts\") pod \"71a2159f-3ae8-40c1-8404-a58106192d87\" (UID: \"71a2159f-3ae8-40c1-8404-a58106192d87\") " Jan 31 09:19:14 crc kubenswrapper[4732]: I0131 09:19:14.311763 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/71a2159f-3ae8-40c1-8404-a58106192d87-ring-data-devices\") pod \"71a2159f-3ae8-40c1-8404-a58106192d87\" (UID: \"71a2159f-3ae8-40c1-8404-a58106192d87\") " Jan 31 09:19:14 crc kubenswrapper[4732]: I0131 09:19:14.311878 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/71a2159f-3ae8-40c1-8404-a58106192d87-swiftconf\") pod \"71a2159f-3ae8-40c1-8404-a58106192d87\" (UID: \"71a2159f-3ae8-40c1-8404-a58106192d87\") " Jan 31 09:19:14 crc kubenswrapper[4732]: I0131 09:19:14.315155 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71a2159f-3ae8-40c1-8404-a58106192d87-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "71a2159f-3ae8-40c1-8404-a58106192d87" (UID: "71a2159f-3ae8-40c1-8404-a58106192d87"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:19:14 crc kubenswrapper[4732]: I0131 09:19:14.315966 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71a2159f-3ae8-40c1-8404-a58106192d87-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "71a2159f-3ae8-40c1-8404-a58106192d87" (UID: "71a2159f-3ae8-40c1-8404-a58106192d87"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:19:14 crc kubenswrapper[4732]: I0131 09:19:14.320585 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71a2159f-3ae8-40c1-8404-a58106192d87-kube-api-access-mmz5c" (OuterVolumeSpecName: "kube-api-access-mmz5c") pod "71a2159f-3ae8-40c1-8404-a58106192d87" (UID: "71a2159f-3ae8-40c1-8404-a58106192d87"). InnerVolumeSpecName "kube-api-access-mmz5c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:19:14 crc kubenswrapper[4732]: I0131 09:19:14.334347 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71a2159f-3ae8-40c1-8404-a58106192d87-scripts" (OuterVolumeSpecName: "scripts") pod "71a2159f-3ae8-40c1-8404-a58106192d87" (UID: "71a2159f-3ae8-40c1-8404-a58106192d87"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:19:14 crc kubenswrapper[4732]: I0131 09:19:14.335343 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71a2159f-3ae8-40c1-8404-a58106192d87-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "71a2159f-3ae8-40c1-8404-a58106192d87" (UID: "71a2159f-3ae8-40c1-8404-a58106192d87"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:19:14 crc kubenswrapper[4732]: I0131 09:19:14.335429 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71a2159f-3ae8-40c1-8404-a58106192d87-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "71a2159f-3ae8-40c1-8404-a58106192d87" (UID: "71a2159f-3ae8-40c1-8404-a58106192d87"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:19:14 crc kubenswrapper[4732]: I0131 09:19:14.414216 4732 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/71a2159f-3ae8-40c1-8404-a58106192d87-swiftconf\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:14 crc kubenswrapper[4732]: I0131 09:19:14.414263 4732 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/71a2159f-3ae8-40c1-8404-a58106192d87-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:14 crc kubenswrapper[4732]: I0131 09:19:14.414282 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mmz5c\" (UniqueName: \"kubernetes.io/projected/71a2159f-3ae8-40c1-8404-a58106192d87-kube-api-access-mmz5c\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:14 crc kubenswrapper[4732]: I0131 09:19:14.414303 4732 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/71a2159f-3ae8-40c1-8404-a58106192d87-dispersionconf\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:14 crc kubenswrapper[4732]: I0131 09:19:14.414320 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/71a2159f-3ae8-40c1-8404-a58106192d87-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:14 crc kubenswrapper[4732]: I0131 09:19:14.414335 4732 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/71a2159f-3ae8-40c1-8404-a58106192d87-ring-data-devices\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:14 crc kubenswrapper[4732]: I0131 09:19:14.432849 4732 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["swift-kuttl-tests/swift-ring-rebalance-debug-xbvzq"] Jan 31 09:19:14 crc kubenswrapper[4732]: E0131 09:19:14.433099 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71a2159f-3ae8-40c1-8404-a58106192d87" containerName="swift-ring-rebalance" Jan 31 09:19:14 crc kubenswrapper[4732]: I0131 09:19:14.433110 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="71a2159f-3ae8-40c1-8404-a58106192d87" containerName="swift-ring-rebalance" Jan 31 09:19:14 crc kubenswrapper[4732]: I0131 09:19:14.433254 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="71a2159f-3ae8-40c1-8404-a58106192d87" containerName="swift-ring-rebalance" Jan 31 09:19:14 crc kubenswrapper[4732]: I0131 09:19:14.433779 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xbvzq" Jan 31 09:19:14 crc kubenswrapper[4732]: I0131 09:19:14.444607 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-xbvzq"] Jan 31 09:19:14 crc kubenswrapper[4732]: I0131 09:19:14.515153 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7d0d843f-2091-43b9-a56c-a6e894f34c6a-swiftconf\") pod \"swift-ring-rebalance-debug-xbvzq\" (UID: \"7d0d843f-2091-43b9-a56c-a6e894f34c6a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xbvzq" Jan 31 09:19:14 crc kubenswrapper[4732]: I0131 09:19:14.515214 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7d0d843f-2091-43b9-a56c-a6e894f34c6a-dispersionconf\") pod \"swift-ring-rebalance-debug-xbvzq\" (UID: \"7d0d843f-2091-43b9-a56c-a6e894f34c6a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xbvzq" Jan 31 09:19:14 crc kubenswrapper[4732]: I0131 09:19:14.515287 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfx8r\" (UniqueName: \"kubernetes.io/projected/7d0d843f-2091-43b9-a56c-a6e894f34c6a-kube-api-access-rfx8r\") pod \"swift-ring-rebalance-debug-xbvzq\" (UID: \"7d0d843f-2091-43b9-a56c-a6e894f34c6a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xbvzq" Jan 31 09:19:14 crc kubenswrapper[4732]: I0131 09:19:14.515317 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7d0d843f-2091-43b9-a56c-a6e894f34c6a-scripts\") pod \"swift-ring-rebalance-debug-xbvzq\" (UID: \"7d0d843f-2091-43b9-a56c-a6e894f34c6a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xbvzq" Jan 31 09:19:14 crc kubenswrapper[4732]: I0131 09:19:14.515339 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7d0d843f-2091-43b9-a56c-a6e894f34c6a-etc-swift\") pod \"swift-ring-rebalance-debug-xbvzq\" (UID: \"7d0d843f-2091-43b9-a56c-a6e894f34c6a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xbvzq" Jan 31 09:19:14 crc kubenswrapper[4732]: I0131 09:19:14.515364 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7d0d843f-2091-43b9-a56c-a6e894f34c6a-ring-data-devices\") pod \"swift-ring-rebalance-debug-xbvzq\" (UID: \"7d0d843f-2091-43b9-a56c-a6e894f34c6a\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-xbvzq" Jan 31 09:19:14 crc kubenswrapper[4732]: I0131 09:19:14.560401 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71a2159f-3ae8-40c1-8404-a58106192d87" path="/var/lib/kubelet/pods/71a2159f-3ae8-40c1-8404-a58106192d87/volumes" Jan 31 09:19:14 crc kubenswrapper[4732]: I0131 09:19:14.616740 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfx8r\" (UniqueName: \"kubernetes.io/projected/7d0d843f-2091-43b9-a56c-a6e894f34c6a-kube-api-access-rfx8r\") pod \"swift-ring-rebalance-debug-xbvzq\" (UID: \"7d0d843f-2091-43b9-a56c-a6e894f34c6a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xbvzq" Jan 31 09:19:14 crc kubenswrapper[4732]: I0131 09:19:14.616800 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7d0d843f-2091-43b9-a56c-a6e894f34c6a-scripts\") pod \"swift-ring-rebalance-debug-xbvzq\" (UID: \"7d0d843f-2091-43b9-a56c-a6e894f34c6a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xbvzq" Jan 31 09:19:14 crc kubenswrapper[4732]: I0131 09:19:14.616839 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7d0d843f-2091-43b9-a56c-a6e894f34c6a-etc-swift\") pod \"swift-ring-rebalance-debug-xbvzq\" (UID: \"7d0d843f-2091-43b9-a56c-a6e894f34c6a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xbvzq" Jan 31 09:19:14 crc kubenswrapper[4732]: I0131 09:19:14.616871 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7d0d843f-2091-43b9-a56c-a6e894f34c6a-ring-data-devices\") pod \"swift-ring-rebalance-debug-xbvzq\" (UID: \"7d0d843f-2091-43b9-a56c-a6e894f34c6a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xbvzq" Jan 31 09:19:14 crc kubenswrapper[4732]: I0131 09:19:14.616999 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7d0d843f-2091-43b9-a56c-a6e894f34c6a-swiftconf\") pod \"swift-ring-rebalance-debug-xbvzq\" (UID: \"7d0d843f-2091-43b9-a56c-a6e894f34c6a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xbvzq" Jan 31 09:19:14 crc kubenswrapper[4732]: I0131 09:19:14.617047 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7d0d843f-2091-43b9-a56c-a6e894f34c6a-dispersionconf\") pod \"swift-ring-rebalance-debug-xbvzq\" (UID: \"7d0d843f-2091-43b9-a56c-a6e894f34c6a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xbvzq" Jan 31 09:19:14 crc kubenswrapper[4732]: I0131 09:19:14.617633 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7d0d843f-2091-43b9-a56c-a6e894f34c6a-scripts\") pod \"swift-ring-rebalance-debug-xbvzq\" (UID: \"7d0d843f-2091-43b9-a56c-a6e894f34c6a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xbvzq" Jan 31 09:19:14 crc kubenswrapper[4732]: I0131 09:19:14.618473 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7d0d843f-2091-43b9-a56c-a6e894f34c6a-etc-swift\") pod \"swift-ring-rebalance-debug-xbvzq\" (UID: \"7d0d843f-2091-43b9-a56c-a6e894f34c6a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xbvzq" Jan 31 09:19:14 crc kubenswrapper[4732]: I0131 09:19:14.618476 4732 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7d0d843f-2091-43b9-a56c-a6e894f34c6a-ring-data-devices\") pod \"swift-ring-rebalance-debug-xbvzq\" (UID: \"7d0d843f-2091-43b9-a56c-a6e894f34c6a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xbvzq" Jan 31 09:19:14 crc kubenswrapper[4732]: I0131 09:19:14.626424 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7d0d843f-2091-43b9-a56c-a6e894f34c6a-dispersionconf\") pod \"swift-ring-rebalance-debug-xbvzq\" (UID: \"7d0d843f-2091-43b9-a56c-a6e894f34c6a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xbvzq" Jan 31 09:19:14 crc kubenswrapper[4732]: I0131 09:19:14.633115 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7d0d843f-2091-43b9-a56c-a6e894f34c6a-swiftconf\") pod \"swift-ring-rebalance-debug-xbvzq\" (UID: \"7d0d843f-2091-43b9-a56c-a6e894f34c6a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xbvzq" Jan 31 09:19:14 crc kubenswrapper[4732]: I0131 09:19:14.634367 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfx8r\" (UniqueName: \"kubernetes.io/projected/7d0d843f-2091-43b9-a56c-a6e894f34c6a-kube-api-access-rfx8r\") pod \"swift-ring-rebalance-debug-xbvzq\" (UID: \"7d0d843f-2091-43b9-a56c-a6e894f34c6a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xbvzq" Jan 31 09:19:14 crc kubenswrapper[4732]: I0131 09:19:14.763639 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xbvzq" Jan 31 09:19:14 crc kubenswrapper[4732]: I0131 09:19:14.936486 4732 scope.go:117] "RemoveContainer" containerID="1fdd9ffe15b68f781c0060a8dc3d52561429e6211de1a33c677c7ae5bc16192d" Jan 31 09:19:14 crc kubenswrapper[4732]: I0131 09:19:14.936796 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-p42tv" Jan 31 09:19:15 crc kubenswrapper[4732]: I0131 09:19:15.062235 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-xbvzq"] Jan 31 09:19:15 crc kubenswrapper[4732]: W0131 09:19:15.073582 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d0d843f_2091_43b9_a56c_a6e894f34c6a.slice/crio-0874ee788cc7a0201916119853527e285b142fd7b705ec112d80d8db9d6f6a97 WatchSource:0}: Error finding container 0874ee788cc7a0201916119853527e285b142fd7b705ec112d80d8db9d6f6a97: Status 404 returned error can't find the container with id 0874ee788cc7a0201916119853527e285b142fd7b705ec112d80d8db9d6f6a97 Jan 31 09:19:15 crc kubenswrapper[4732]: I0131 09:19:15.948651 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xbvzq" event={"ID":"7d0d843f-2091-43b9-a56c-a6e894f34c6a","Type":"ContainerStarted","Data":"8f7b4935507d9f1d9a228ddbccd77963bc16505fe67a6b62a43d38a59973a92f"} Jan 31 09:19:15 crc kubenswrapper[4732]: I0131 09:19:15.949139 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xbvzq" event={"ID":"7d0d843f-2091-43b9-a56c-a6e894f34c6a","Type":"ContainerStarted","Data":"0874ee788cc7a0201916119853527e285b142fd7b705ec112d80d8db9d6f6a97"} Jan 31 09:19:15 crc kubenswrapper[4732]: I0131 09:19:15.987088 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xbvzq" podStartSLOduration=1.987062153 podStartE2EDuration="1.987062153s" podCreationTimestamp="2026-01-31 09:19:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:19:15.975009188 +0000 UTC m=+1094.280885432" watchObservedRunningTime="2026-01-31 09:19:15.987062153 +0000 UTC m=+1094.292938397" Jan 31 09:19:16 crc kubenswrapper[4732]: I0131 09:19:16.961886 4732 generic.go:334] "Generic (PLEG): container finished" podID="7d0d843f-2091-43b9-a56c-a6e894f34c6a" containerID="8f7b4935507d9f1d9a228ddbccd77963bc16505fe67a6b62a43d38a59973a92f" exitCode=0 Jan 31 09:19:16 crc kubenswrapper[4732]: I0131 09:19:16.962027 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xbvzq" event={"ID":"7d0d843f-2091-43b9-a56c-a6e894f34c6a","Type":"ContainerDied","Data":"8f7b4935507d9f1d9a228ddbccd77963bc16505fe67a6b62a43d38a59973a92f"} Jan 31 09:19:17 crc kubenswrapper[4732]: I0131 09:19:17.498165 4732 patch_prober.go:28] interesting pod/machine-config-daemon-jnbt8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 09:19:17 crc kubenswrapper[4732]: I0131 09:19:17.498228 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" podUID="7d790207-d357-4b47-87bf-5b505e061820" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 09:19:18 crc kubenswrapper[4732]: I0131 09:19:18.294979 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xbvzq" Jan 31 09:19:18 crc kubenswrapper[4732]: I0131 09:19:18.342933 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-xbvzq"] Jan 31 09:19:18 crc kubenswrapper[4732]: I0131 09:19:18.356010 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-xbvzq"] Jan 31 09:19:18 crc kubenswrapper[4732]: I0131 09:19:18.370394 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7d0d843f-2091-43b9-a56c-a6e894f34c6a-ring-data-devices\") pod \"7d0d843f-2091-43b9-a56c-a6e894f34c6a\" (UID: \"7d0d843f-2091-43b9-a56c-a6e894f34c6a\") " Jan 31 09:19:18 crc kubenswrapper[4732]: I0131 09:19:18.370452 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7d0d843f-2091-43b9-a56c-a6e894f34c6a-dispersionconf\") pod \"7d0d843f-2091-43b9-a56c-a6e894f34c6a\" (UID: \"7d0d843f-2091-43b9-a56c-a6e894f34c6a\") " Jan 31 09:19:18 crc kubenswrapper[4732]: I0131 09:19:18.370577 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7d0d843f-2091-43b9-a56c-a6e894f34c6a-swiftconf\") pod \"7d0d843f-2091-43b9-a56c-a6e894f34c6a\" (UID: \"7d0d843f-2091-43b9-a56c-a6e894f34c6a\") " Jan 31 09:19:18 crc kubenswrapper[4732]: I0131 09:19:18.370623 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7d0d843f-2091-43b9-a56c-a6e894f34c6a-etc-swift\") pod \"7d0d843f-2091-43b9-a56c-a6e894f34c6a\" (UID: \"7d0d843f-2091-43b9-a56c-a6e894f34c6a\") " Jan 31 09:19:18 crc kubenswrapper[4732]: I0131 09:19:18.370689 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7d0d843f-2091-43b9-a56c-a6e894f34c6a-scripts\") pod \"7d0d843f-2091-43b9-a56c-a6e894f34c6a\" (UID: \"7d0d843f-2091-43b9-a56c-a6e894f34c6a\") " Jan 31 09:19:18 crc kubenswrapper[4732]: I0131 09:19:18.370791 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rfx8r\" (UniqueName: \"kubernetes.io/projected/7d0d843f-2091-43b9-a56c-a6e894f34c6a-kube-api-access-rfx8r\") pod \"7d0d843f-2091-43b9-a56c-a6e894f34c6a\" (UID: \"7d0d843f-2091-43b9-a56c-a6e894f34c6a\") " Jan 31 09:19:18 crc kubenswrapper[4732]: I0131 09:19:18.371189 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d0d843f-2091-43b9-a56c-a6e894f34c6a-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "7d0d843f-2091-43b9-a56c-a6e894f34c6a" (UID: "7d0d843f-2091-43b9-a56c-a6e894f34c6a"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:19:18 crc kubenswrapper[4732]: I0131 09:19:18.371970 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d0d843f-2091-43b9-a56c-a6e894f34c6a-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "7d0d843f-2091-43b9-a56c-a6e894f34c6a" (UID: "7d0d843f-2091-43b9-a56c-a6e894f34c6a"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:19:18 crc kubenswrapper[4732]: I0131 09:19:18.375646 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d0d843f-2091-43b9-a56c-a6e894f34c6a-kube-api-access-rfx8r" (OuterVolumeSpecName: "kube-api-access-rfx8r") pod "7d0d843f-2091-43b9-a56c-a6e894f34c6a" (UID: "7d0d843f-2091-43b9-a56c-a6e894f34c6a"). InnerVolumeSpecName "kube-api-access-rfx8r". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:19:18 crc kubenswrapper[4732]: I0131 09:19:18.389022 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d0d843f-2091-43b9-a56c-a6e894f34c6a-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "7d0d843f-2091-43b9-a56c-a6e894f34c6a" (UID: "7d0d843f-2091-43b9-a56c-a6e894f34c6a"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:19:18 crc kubenswrapper[4732]: I0131 09:19:18.401022 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d0d843f-2091-43b9-a56c-a6e894f34c6a-scripts" (OuterVolumeSpecName: "scripts") pod "7d0d843f-2091-43b9-a56c-a6e894f34c6a" (UID: "7d0d843f-2091-43b9-a56c-a6e894f34c6a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:19:18 crc kubenswrapper[4732]: I0131 09:19:18.405849 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d0d843f-2091-43b9-a56c-a6e894f34c6a-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "7d0d843f-2091-43b9-a56c-a6e894f34c6a" (UID: "7d0d843f-2091-43b9-a56c-a6e894f34c6a"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:19:18 crc kubenswrapper[4732]: I0131 09:19:18.473171 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rfx8r\" (UniqueName: \"kubernetes.io/projected/7d0d843f-2091-43b9-a56c-a6e894f34c6a-kube-api-access-rfx8r\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:18 crc kubenswrapper[4732]: I0131 09:19:18.473327 4732 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7d0d843f-2091-43b9-a56c-a6e894f34c6a-ring-data-devices\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:18 crc kubenswrapper[4732]: I0131 09:19:18.473357 4732 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7d0d843f-2091-43b9-a56c-a6e894f34c6a-dispersionconf\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:18 crc kubenswrapper[4732]: I0131 09:19:18.473382 4732 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7d0d843f-2091-43b9-a56c-a6e894f34c6a-swiftconf\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:18 crc kubenswrapper[4732]: I0131 09:19:18.473405 4732 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7d0d843f-2091-43b9-a56c-a6e894f34c6a-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:18 crc kubenswrapper[4732]: I0131 09:19:18.473423 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7d0d843f-2091-43b9-a56c-a6e894f34c6a-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:18 crc kubenswrapper[4732]: I0131 09:19:18.555829 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="7d0d843f-2091-43b9-a56c-a6e894f34c6a" path="/var/lib/kubelet/pods/7d0d843f-2091-43b9-a56c-a6e894f34c6a/volumes" Jan 31 09:19:18 crc kubenswrapper[4732]: I0131 09:19:18.982925 4732 scope.go:117] "RemoveContainer" containerID="8f7b4935507d9f1d9a228ddbccd77963bc16505fe67a6b62a43d38a59973a92f" Jan 31 09:19:18 crc kubenswrapper[4732]: I0131 09:19:18.982988 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xbvzq" Jan 31 09:19:20 crc kubenswrapper[4732]: I0131 09:19:20.955400 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-gblbk"] Jan 31 09:19:20 crc kubenswrapper[4732]: E0131 09:19:20.955797 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d0d843f-2091-43b9-a56c-a6e894f34c6a" containerName="swift-ring-rebalance" Jan 31 09:19:20 crc kubenswrapper[4732]: I0131 09:19:20.955816 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d0d843f-2091-43b9-a56c-a6e894f34c6a" containerName="swift-ring-rebalance" Jan 31 09:19:20 crc kubenswrapper[4732]: I0131 09:19:20.955974 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d0d843f-2091-43b9-a56c-a6e894f34c6a" containerName="swift-ring-rebalance" Jan 31 09:19:20 crc kubenswrapper[4732]: I0131 09:19:20.956544 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-gblbk" Jan 31 09:19:20 crc kubenswrapper[4732]: I0131 09:19:20.959548 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Jan 31 09:19:20 crc kubenswrapper[4732]: I0131 09:19:20.959723 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Jan 31 09:19:20 crc kubenswrapper[4732]: I0131 09:19:20.965906 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-gblbk"] Jan 31 09:19:21 crc kubenswrapper[4732]: I0131 09:19:21.111729 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/38fc93ea-e490-4d30-a742-97b668a286c5-swiftconf\") pod \"swift-ring-rebalance-debug-gblbk\" (UID: \"38fc93ea-e490-4d30-a742-97b668a286c5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gblbk" Jan 31 09:19:21 crc kubenswrapper[4732]: I0131 09:19:21.111821 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/38fc93ea-e490-4d30-a742-97b668a286c5-scripts\") pod \"swift-ring-rebalance-debug-gblbk\" (UID: \"38fc93ea-e490-4d30-a742-97b668a286c5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gblbk" Jan 31 09:19:21 crc kubenswrapper[4732]: I0131 09:19:21.111853 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2s7fl\" (UniqueName: \"kubernetes.io/projected/38fc93ea-e490-4d30-a742-97b668a286c5-kube-api-access-2s7fl\") pod \"swift-ring-rebalance-debug-gblbk\" (UID: \"38fc93ea-e490-4d30-a742-97b668a286c5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gblbk" Jan 31 09:19:21 crc kubenswrapper[4732]: I0131 09:19:21.111893 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/38fc93ea-e490-4d30-a742-97b668a286c5-etc-swift\") pod 
\"swift-ring-rebalance-debug-gblbk\" (UID: \"38fc93ea-e490-4d30-a742-97b668a286c5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gblbk" Jan 31 09:19:21 crc kubenswrapper[4732]: I0131 09:19:21.111954 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/38fc93ea-e490-4d30-a742-97b668a286c5-ring-data-devices\") pod \"swift-ring-rebalance-debug-gblbk\" (UID: \"38fc93ea-e490-4d30-a742-97b668a286c5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gblbk" Jan 31 09:19:21 crc kubenswrapper[4732]: I0131 09:19:21.112405 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/38fc93ea-e490-4d30-a742-97b668a286c5-dispersionconf\") pod \"swift-ring-rebalance-debug-gblbk\" (UID: \"38fc93ea-e490-4d30-a742-97b668a286c5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gblbk" Jan 31 09:19:21 crc kubenswrapper[4732]: I0131 09:19:21.213527 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/38fc93ea-e490-4d30-a742-97b668a286c5-ring-data-devices\") pod \"swift-ring-rebalance-debug-gblbk\" (UID: \"38fc93ea-e490-4d30-a742-97b668a286c5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gblbk" Jan 31 09:19:21 crc kubenswrapper[4732]: I0131 09:19:21.213621 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/38fc93ea-e490-4d30-a742-97b668a286c5-dispersionconf\") pod \"swift-ring-rebalance-debug-gblbk\" (UID: \"38fc93ea-e490-4d30-a742-97b668a286c5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gblbk" Jan 31 09:19:21 crc kubenswrapper[4732]: I0131 09:19:21.213700 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/38fc93ea-e490-4d30-a742-97b668a286c5-swiftconf\") pod \"swift-ring-rebalance-debug-gblbk\" (UID: \"38fc93ea-e490-4d30-a742-97b668a286c5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gblbk" Jan 31 09:19:21 crc kubenswrapper[4732]: I0131 09:19:21.213737 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/38fc93ea-e490-4d30-a742-97b668a286c5-scripts\") pod \"swift-ring-rebalance-debug-gblbk\" (UID: \"38fc93ea-e490-4d30-a742-97b668a286c5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gblbk" Jan 31 09:19:21 crc kubenswrapper[4732]: I0131 09:19:21.213761 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2s7fl\" (UniqueName: \"kubernetes.io/projected/38fc93ea-e490-4d30-a742-97b668a286c5-kube-api-access-2s7fl\") pod \"swift-ring-rebalance-debug-gblbk\" (UID: \"38fc93ea-e490-4d30-a742-97b668a286c5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gblbk" Jan 31 09:19:21 crc kubenswrapper[4732]: I0131 09:19:21.213796 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/38fc93ea-e490-4d30-a742-97b668a286c5-etc-swift\") pod \"swift-ring-rebalance-debug-gblbk\" (UID: \"38fc93ea-e490-4d30-a742-97b668a286c5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gblbk" Jan 31 09:19:21 crc kubenswrapper[4732]: I0131 09:19:21.214588 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" 
(UniqueName: \"kubernetes.io/empty-dir/38fc93ea-e490-4d30-a742-97b668a286c5-etc-swift\") pod \"swift-ring-rebalance-debug-gblbk\" (UID: \"38fc93ea-e490-4d30-a742-97b668a286c5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gblbk" Jan 31 09:19:21 crc kubenswrapper[4732]: I0131 09:19:21.214859 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/38fc93ea-e490-4d30-a742-97b668a286c5-scripts\") pod \"swift-ring-rebalance-debug-gblbk\" (UID: \"38fc93ea-e490-4d30-a742-97b668a286c5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gblbk" Jan 31 09:19:21 crc kubenswrapper[4732]: I0131 09:19:21.215295 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/38fc93ea-e490-4d30-a742-97b668a286c5-ring-data-devices\") pod \"swift-ring-rebalance-debug-gblbk\" (UID: \"38fc93ea-e490-4d30-a742-97b668a286c5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gblbk" Jan 31 09:19:21 crc kubenswrapper[4732]: I0131 09:19:21.221461 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/38fc93ea-e490-4d30-a742-97b668a286c5-dispersionconf\") pod \"swift-ring-rebalance-debug-gblbk\" (UID: \"38fc93ea-e490-4d30-a742-97b668a286c5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gblbk" Jan 31 09:19:21 crc kubenswrapper[4732]: I0131 09:19:21.224121 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/38fc93ea-e490-4d30-a742-97b668a286c5-swiftconf\") pod \"swift-ring-rebalance-debug-gblbk\" (UID: \"38fc93ea-e490-4d30-a742-97b668a286c5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gblbk" Jan 31 09:19:21 crc kubenswrapper[4732]: I0131 09:19:21.232098 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2s7fl\" (UniqueName: \"kubernetes.io/projected/38fc93ea-e490-4d30-a742-97b668a286c5-kube-api-access-2s7fl\") pod \"swift-ring-rebalance-debug-gblbk\" (UID: \"38fc93ea-e490-4d30-a742-97b668a286c5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gblbk" Jan 31 09:19:21 crc kubenswrapper[4732]: I0131 09:19:21.287847 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-gblbk" Jan 31 09:19:21 crc kubenswrapper[4732]: I0131 09:19:21.730270 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-gblbk"] Jan 31 09:19:22 crc kubenswrapper[4732]: I0131 09:19:22.024105 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-gblbk" event={"ID":"38fc93ea-e490-4d30-a742-97b668a286c5","Type":"ContainerStarted","Data":"432b5c5eba1b55f33733f64fa26fd00fcd2ddf085c54e19e85cffa05ca5842c9"} Jan 31 09:19:22 crc kubenswrapper[4732]: I0131 09:19:22.024156 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-gblbk" event={"ID":"38fc93ea-e490-4d30-a742-97b668a286c5","Type":"ContainerStarted","Data":"18daffe26932d061f3f7f163a1c31a80af9593e793c13d3e5a4189a5cdd737d4"} Jan 31 09:19:22 crc kubenswrapper[4732]: I0131 09:19:22.041009 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-gblbk" podStartSLOduration=2.040988762 podStartE2EDuration="2.040988762s" podCreationTimestamp="2026-01-31 09:19:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:19:22.038489334 +0000 UTC m=+1100.344365538" watchObservedRunningTime="2026-01-31 09:19:22.040988762 +0000 UTC m=+1100.346864966" Jan 31 09:19:23 crc kubenswrapper[4732]: I0131 09:19:23.031709 4732 generic.go:334] "Generic (PLEG): container finished" podID="38fc93ea-e490-4d30-a742-97b668a286c5" containerID="432b5c5eba1b55f33733f64fa26fd00fcd2ddf085c54e19e85cffa05ca5842c9" exitCode=0 Jan 31 09:19:23 crc kubenswrapper[4732]: I0131 09:19:23.031790 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-gblbk" event={"ID":"38fc93ea-e490-4d30-a742-97b668a286c5","Type":"ContainerDied","Data":"432b5c5eba1b55f33733f64fa26fd00fcd2ddf085c54e19e85cffa05ca5842c9"} Jan 31 09:19:24 crc kubenswrapper[4732]: I0131 09:19:24.367837 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-gblbk" Jan 31 09:19:24 crc kubenswrapper[4732]: I0131 09:19:24.407495 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-gblbk"] Jan 31 09:19:24 crc kubenswrapper[4732]: I0131 09:19:24.416557 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-gblbk"] Jan 31 09:19:24 crc kubenswrapper[4732]: I0131 09:19:24.462865 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/38fc93ea-e490-4d30-a742-97b668a286c5-ring-data-devices\") pod \"38fc93ea-e490-4d30-a742-97b668a286c5\" (UID: \"38fc93ea-e490-4d30-a742-97b668a286c5\") " Jan 31 09:19:24 crc kubenswrapper[4732]: I0131 09:19:24.462940 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/38fc93ea-e490-4d30-a742-97b668a286c5-scripts\") pod \"38fc93ea-e490-4d30-a742-97b668a286c5\" (UID: \"38fc93ea-e490-4d30-a742-97b668a286c5\") " Jan 31 09:19:24 crc kubenswrapper[4732]: I0131 09:19:24.463089 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/38fc93ea-e490-4d30-a742-97b668a286c5-dispersionconf\") pod \"38fc93ea-e490-4d30-a742-97b668a286c5\" (UID: \"38fc93ea-e490-4d30-a742-97b668a286c5\") " Jan 31 09:19:24 crc kubenswrapper[4732]: I0131 09:19:24.463130 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/38fc93ea-e490-4d30-a742-97b668a286c5-etc-swift\") pod \"38fc93ea-e490-4d30-a742-97b668a286c5\" (UID: \"38fc93ea-e490-4d30-a742-97b668a286c5\") " Jan 31 09:19:24 crc kubenswrapper[4732]: I0131 09:19:24.463168 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/38fc93ea-e490-4d30-a742-97b668a286c5-swiftconf\") pod \"38fc93ea-e490-4d30-a742-97b668a286c5\" (UID: \"38fc93ea-e490-4d30-a742-97b668a286c5\") " Jan 31 09:19:24 crc kubenswrapper[4732]: I0131 09:19:24.463270 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2s7fl\" (UniqueName: \"kubernetes.io/projected/38fc93ea-e490-4d30-a742-97b668a286c5-kube-api-access-2s7fl\") pod \"38fc93ea-e490-4d30-a742-97b668a286c5\" (UID: \"38fc93ea-e490-4d30-a742-97b668a286c5\") " Jan 31 09:19:24 crc kubenswrapper[4732]: I0131 09:19:24.463318 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38fc93ea-e490-4d30-a742-97b668a286c5-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "38fc93ea-e490-4d30-a742-97b668a286c5" (UID: "38fc93ea-e490-4d30-a742-97b668a286c5"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:19:24 crc kubenswrapper[4732]: I0131 09:19:24.463622 4732 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/38fc93ea-e490-4d30-a742-97b668a286c5-ring-data-devices\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:24 crc kubenswrapper[4732]: I0131 09:19:24.463968 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38fc93ea-e490-4d30-a742-97b668a286c5-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "38fc93ea-e490-4d30-a742-97b668a286c5" (UID: "38fc93ea-e490-4d30-a742-97b668a286c5"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:19:24 crc kubenswrapper[4732]: I0131 09:19:24.481949 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38fc93ea-e490-4d30-a742-97b668a286c5-kube-api-access-2s7fl" (OuterVolumeSpecName: "kube-api-access-2s7fl") pod "38fc93ea-e490-4d30-a742-97b668a286c5" (UID: "38fc93ea-e490-4d30-a742-97b668a286c5"). InnerVolumeSpecName "kube-api-access-2s7fl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:19:24 crc kubenswrapper[4732]: I0131 09:19:24.513514 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38fc93ea-e490-4d30-a742-97b668a286c5-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "38fc93ea-e490-4d30-a742-97b668a286c5" (UID: "38fc93ea-e490-4d30-a742-97b668a286c5"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:19:24 crc kubenswrapper[4732]: I0131 09:19:24.521268 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-nc962"] Jan 31 09:19:24 crc kubenswrapper[4732]: I0131 09:19:24.521424 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38fc93ea-e490-4d30-a742-97b668a286c5-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "38fc93ea-e490-4d30-a742-97b668a286c5" (UID: "38fc93ea-e490-4d30-a742-97b668a286c5"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:19:24 crc kubenswrapper[4732]: I0131 09:19:24.527543 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38fc93ea-e490-4d30-a742-97b668a286c5-scripts" (OuterVolumeSpecName: "scripts") pod "38fc93ea-e490-4d30-a742-97b668a286c5" (UID: "38fc93ea-e490-4d30-a742-97b668a286c5"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:19:24 crc kubenswrapper[4732]: I0131 09:19:24.532396 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-nc962"] Jan 31 09:19:24 crc kubenswrapper[4732]: I0131 09:19:24.568715 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/38fc93ea-e490-4d30-a742-97b668a286c5-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:24 crc kubenswrapper[4732]: I0131 09:19:24.568751 4732 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/38fc93ea-e490-4d30-a742-97b668a286c5-dispersionconf\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:24 crc kubenswrapper[4732]: I0131 09:19:24.568763 4732 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/38fc93ea-e490-4d30-a742-97b668a286c5-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:24 crc kubenswrapper[4732]: I0131 09:19:24.568772 4732 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/38fc93ea-e490-4d30-a742-97b668a286c5-swiftconf\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:24 crc kubenswrapper[4732]: I0131 09:19:24.568785 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2s7fl\" (UniqueName: \"kubernetes.io/projected/38fc93ea-e490-4d30-a742-97b668a286c5-kube-api-access-2s7fl\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:24 crc kubenswrapper[4732]: I0131 09:19:24.592529 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38fc93ea-e490-4d30-a742-97b668a286c5" path="/var/lib/kubelet/pods/38fc93ea-e490-4d30-a742-97b668a286c5/volumes" Jan 31 09:19:24 crc kubenswrapper[4732]: I0131 09:19:24.593151 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69e75a6a-a8c3-4ea1-b9a9-418752e2fc72" path="/var/lib/kubelet/pods/69e75a6a-a8c3-4ea1-b9a9-418752e2fc72/volumes" Jan 31 09:19:24 crc kubenswrapper[4732]: I0131 09:19:24.593886 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-storage-0"] Jan 31 09:19:24 crc kubenswrapper[4732]: I0131 09:19:24.593924 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-proxy-59dd5dbb7c-lcxsq"] Jan 31 09:19:24 crc kubenswrapper[4732]: I0131 09:19:24.594118 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-proxy-59dd5dbb7c-lcxsq" podUID="cea16bcf-b0b0-4d2a-ad37-ee127b9a1fd9" containerName="proxy-httpd" containerID="cri-o://7645a81d1554d39a7245fbb85e3b78b90ce8cbb1ff6a3161695b88d390dd384c" gracePeriod=30 Jan 31 09:19:24 crc kubenswrapper[4732]: I0131 09:19:24.594520 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-proxy-59dd5dbb7c-lcxsq" podUID="cea16bcf-b0b0-4d2a-ad37-ee127b9a1fd9" containerName="proxy-server" containerID="cri-o://14e28a091b47ebba4b423f5066db9e9d08dac2ef7f479ee58c476aa4be9c8628" gracePeriod=30 Jan 31 09:19:24 crc kubenswrapper[4732]: I0131 09:19:24.595380 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65" containerName="account-server" containerID="cri-o://76755a52198a018f0a2d0f0ab371558b237a7993168d70acf02d525f01693f21" gracePeriod=30 Jan 31 09:19:24 crc kubenswrapper[4732]: I0131 09:19:24.595581 4732 
kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65" containerName="container-sharder" containerID="cri-o://38cdf7c7cb89df8d1c3787fb632d47d09b331d20b08aa93cb17f6a88341fb337" gracePeriod=30 Jan 31 09:19:24 crc kubenswrapper[4732]: I0131 09:19:24.595700 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65" containerName="swift-recon-cron" containerID="cri-o://1183d753033ef14dd779291a910acae735ec0ef3a1e7104272c1b8d040389479" gracePeriod=30 Jan 31 09:19:24 crc kubenswrapper[4732]: I0131 09:19:24.595779 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65" containerName="rsync" containerID="cri-o://5521d09393a37a16e0269ff010dbf8c3432cdb69f1bca2e5b6f057535e6adaff" gracePeriod=30 Jan 31 09:19:24 crc kubenswrapper[4732]: I0131 09:19:24.595849 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65" containerName="object-expirer" containerID="cri-o://4c4fd17a487c3ad507bed8e66df924a39a8c2f99f7e806c1497f9605883141b8" gracePeriod=30 Jan 31 09:19:24 crc kubenswrapper[4732]: I0131 09:19:24.595927 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65" containerName="object-updater" containerID="cri-o://4ff9ae6192a8f62a8feaec1a934822bc986fef6a2a789c88bf3fe4a983cc5f91" gracePeriod=30 Jan 31 09:19:24 crc kubenswrapper[4732]: I0131 09:19:24.596028 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65" containerName="object-auditor" containerID="cri-o://dd273c25d7dc91e5954ff8d5fbc44b6eeefa5af49ff09461c024f0d29819d844" gracePeriod=30 Jan 31 09:19:24 crc kubenswrapper[4732]: I0131 09:19:24.596105 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65" containerName="object-replicator" containerID="cri-o://adf3519dd1a3dc4b8b31ed35e741ea5a2ae39875837269f96a619bf6446696ad" gracePeriod=30 Jan 31 09:19:24 crc kubenswrapper[4732]: I0131 09:19:24.596224 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65" containerName="container-server" containerID="cri-o://a768c10034a64a5a20c9c6d0006c491e9b64bcedd25898cbbfa5a7d80612ad9a" gracePeriod=30 Jan 31 09:19:24 crc kubenswrapper[4732]: I0131 09:19:24.596240 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65" containerName="container-replicator" containerID="cri-o://5de79c9073a05f38da4352247b5e904a357945349a71ed92906b206964dbd2ef" gracePeriod=30 Jan 31 09:19:24 crc kubenswrapper[4732]: I0131 09:19:24.596347 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65" containerName="account-reaper" containerID="cri-o://3c0e4beb7194a2ba16fa5a8867712d4fc5df7cc2c8e674527846af12764e0327" gracePeriod=30 Jan 31 09:19:24 crc kubenswrapper[4732]: 
I0131 09:19:24.596422 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65" containerName="object-server" containerID="cri-o://9c260d5b071de9f983a884926c74e3c7965a5566fdf129a8231af8aff9341849" gracePeriod=30 Jan 31 09:19:24 crc kubenswrapper[4732]: I0131 09:19:24.596439 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65" containerName="account-auditor" containerID="cri-o://14b965bc1ec24ff789b68d1102389631b9549153560571ce9d36806cdfc47521" gracePeriod=30 Jan 31 09:19:24 crc kubenswrapper[4732]: I0131 09:19:24.596550 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65" containerName="account-replicator" containerID="cri-o://59ef299c22aaa5865a8f99313b8748ea14863e08f6ff3878d52b70ef3516ec1f" gracePeriod=30 Jan 31 09:19:24 crc kubenswrapper[4732]: I0131 09:19:24.596556 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65" containerName="container-updater" containerID="cri-o://c4fa65606e0bdcb6f86bd4340763a51bfe2fa0b29c86e336932e2d7b4d5a7cc7" gracePeriod=30 Jan 31 09:19:24 crc kubenswrapper[4732]: I0131 09:19:24.596623 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65" containerName="container-auditor" containerID="cri-o://31889d60dae084e50ec608b663e4d80dd770e7914146b1993679075fbcb5c050" gracePeriod=30
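Every "Killing container with a grace period" record above carries gracePeriod=30: the runtime delivers SIGTERM and escalates to SIGKILL at the deadline. The log later confirms the slow path for swift-recon-cron (container 1183d753..., killed at 09:19:24): it is reported finished at 09:19:55 with exitCode=137, i.e. 128+9 (SIGKILL), roughly thirty seconds after the kill began. A self-contained illustration of the pattern, not CRI-O's implementation:

// gracekill.go - SIGTERM first, SIGKILL when the grace period expires.
package main

import (
	"fmt"
	"os/exec"
	"syscall"
	"time"
)

func killWithGrace(cmd *exec.Cmd, grace time.Duration) {
	done := make(chan error, 1)
	go func() { done <- cmd.Wait() }()
	_ = cmd.Process.Signal(syscall.SIGTERM) // polite request first
	select {
	case <-done: // exited within the grace period
	case <-time.After(grace):
		_ = cmd.Process.Kill() // SIGKILL; a shell would report status 137 = 128+9
		<-done
	}
}

func main() {
	// 'sleep' dies on SIGTERM, so this demo takes the quick path; a process
	// that traps SIGTERM would survive until the SIGKILL at the deadline.
	cmd := exec.Command("sleep", "300")
	if err := cmd.Start(); err != nil {
		panic(err)
	}
	killWithGrace(cmd, 2*time.Second) // the kubelet used 30s above
	fmt.Println("reaped:", cmd.ProcessState)
}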
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-gblbk" Jan 31 09:19:25 crc kubenswrapper[4732]: I0131 09:19:25.053122 4732 scope.go:117] "RemoveContainer" containerID="432b5c5eba1b55f33733f64fa26fd00fcd2ddf085c54e19e85cffa05ca5842c9" Jan 31 09:19:25 crc kubenswrapper[4732]: I0131 09:19:25.061704 4732 generic.go:334] "Generic (PLEG): container finished" podID="1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65" containerID="38cdf7c7cb89df8d1c3787fb632d47d09b331d20b08aa93cb17f6a88341fb337" exitCode=0 Jan 31 09:19:25 crc kubenswrapper[4732]: I0131 09:19:25.061740 4732 generic.go:334] "Generic (PLEG): container finished" podID="1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65" containerID="5521d09393a37a16e0269ff010dbf8c3432cdb69f1bca2e5b6f057535e6adaff" exitCode=0 Jan 31 09:19:25 crc kubenswrapper[4732]: I0131 09:19:25.061772 4732 generic.go:334] "Generic (PLEG): container finished" podID="1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65" containerID="4c4fd17a487c3ad507bed8e66df924a39a8c2f99f7e806c1497f9605883141b8" exitCode=0 Jan 31 09:19:25 crc kubenswrapper[4732]: I0131 09:19:25.061781 4732 generic.go:334] "Generic (PLEG): container finished" podID="1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65" containerID="4ff9ae6192a8f62a8feaec1a934822bc986fef6a2a789c88bf3fe4a983cc5f91" exitCode=0 Jan 31 09:19:25 crc kubenswrapper[4732]: I0131 09:19:25.061792 4732 generic.go:334] "Generic (PLEG): container finished" podID="1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65" containerID="dd273c25d7dc91e5954ff8d5fbc44b6eeefa5af49ff09461c024f0d29819d844" exitCode=0 Jan 31 09:19:25 crc kubenswrapper[4732]: I0131 09:19:25.061801 4732 generic.go:334] "Generic (PLEG): container finished" podID="1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65" containerID="adf3519dd1a3dc4b8b31ed35e741ea5a2ae39875837269f96a619bf6446696ad" exitCode=0 Jan 31 09:19:25 crc kubenswrapper[4732]: I0131 09:19:25.061810 4732 generic.go:334] "Generic (PLEG): container finished" podID="1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65" containerID="9c260d5b071de9f983a884926c74e3c7965a5566fdf129a8231af8aff9341849" exitCode=0 Jan 31 09:19:25 crc kubenswrapper[4732]: I0131 09:19:25.061818 4732 generic.go:334] "Generic (PLEG): container finished" podID="1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65" containerID="c4fa65606e0bdcb6f86bd4340763a51bfe2fa0b29c86e336932e2d7b4d5a7cc7" exitCode=0 Jan 31 09:19:25 crc kubenswrapper[4732]: I0131 09:19:25.061866 4732 generic.go:334] "Generic (PLEG): container finished" podID="1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65" containerID="31889d60dae084e50ec608b663e4d80dd770e7914146b1993679075fbcb5c050" exitCode=0 Jan 31 09:19:25 crc kubenswrapper[4732]: I0131 09:19:25.061875 4732 generic.go:334] "Generic (PLEG): container finished" podID="1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65" containerID="5de79c9073a05f38da4352247b5e904a357945349a71ed92906b206964dbd2ef" exitCode=0 Jan 31 09:19:25 crc kubenswrapper[4732]: I0131 09:19:25.061883 4732 generic.go:334] "Generic (PLEG): container finished" podID="1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65" containerID="3c0e4beb7194a2ba16fa5a8867712d4fc5df7cc2c8e674527846af12764e0327" exitCode=0 Jan 31 09:19:25 crc kubenswrapper[4732]: I0131 09:19:25.061891 4732 generic.go:334] "Generic (PLEG): container finished" podID="1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65" containerID="14b965bc1ec24ff789b68d1102389631b9549153560571ce9d36806cdfc47521" exitCode=0 Jan 31 09:19:25 crc kubenswrapper[4732]: I0131 09:19:25.061899 4732 generic.go:334] "Generic (PLEG): container finished" podID="1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65" 
containerID="59ef299c22aaa5865a8f99313b8748ea14863e08f6ff3878d52b70ef3516ec1f" exitCode=0 Jan 31 09:19:25 crc kubenswrapper[4732]: I0131 09:19:25.061700 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65","Type":"ContainerDied","Data":"38cdf7c7cb89df8d1c3787fb632d47d09b331d20b08aa93cb17f6a88341fb337"} Jan 31 09:19:25 crc kubenswrapper[4732]: I0131 09:19:25.061961 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65","Type":"ContainerDied","Data":"5521d09393a37a16e0269ff010dbf8c3432cdb69f1bca2e5b6f057535e6adaff"} Jan 31 09:19:25 crc kubenswrapper[4732]: I0131 09:19:25.061983 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65","Type":"ContainerDied","Data":"4c4fd17a487c3ad507bed8e66df924a39a8c2f99f7e806c1497f9605883141b8"} Jan 31 09:19:25 crc kubenswrapper[4732]: I0131 09:19:25.061996 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65","Type":"ContainerDied","Data":"4ff9ae6192a8f62a8feaec1a934822bc986fef6a2a789c88bf3fe4a983cc5f91"} Jan 31 09:19:25 crc kubenswrapper[4732]: I0131 09:19:25.062008 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65","Type":"ContainerDied","Data":"dd273c25d7dc91e5954ff8d5fbc44b6eeefa5af49ff09461c024f0d29819d844"} Jan 31 09:19:25 crc kubenswrapper[4732]: I0131 09:19:25.062019 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65","Type":"ContainerDied","Data":"adf3519dd1a3dc4b8b31ed35e741ea5a2ae39875837269f96a619bf6446696ad"} Jan 31 09:19:25 crc kubenswrapper[4732]: I0131 09:19:25.062032 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65","Type":"ContainerDied","Data":"9c260d5b071de9f983a884926c74e3c7965a5566fdf129a8231af8aff9341849"} Jan 31 09:19:25 crc kubenswrapper[4732]: I0131 09:19:25.062044 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65","Type":"ContainerDied","Data":"c4fa65606e0bdcb6f86bd4340763a51bfe2fa0b29c86e336932e2d7b4d5a7cc7"} Jan 31 09:19:25 crc kubenswrapper[4732]: I0131 09:19:25.062055 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65","Type":"ContainerDied","Data":"31889d60dae084e50ec608b663e4d80dd770e7914146b1993679075fbcb5c050"} Jan 31 09:19:25 crc kubenswrapper[4732]: I0131 09:19:25.062067 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65","Type":"ContainerDied","Data":"5de79c9073a05f38da4352247b5e904a357945349a71ed92906b206964dbd2ef"} Jan 31 09:19:25 crc kubenswrapper[4732]: I0131 09:19:25.062080 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65","Type":"ContainerDied","Data":"3c0e4beb7194a2ba16fa5a8867712d4fc5df7cc2c8e674527846af12764e0327"} Jan 31 09:19:25 crc kubenswrapper[4732]: I0131 09:19:25.062090 4732 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65","Type":"ContainerDied","Data":"14b965bc1ec24ff789b68d1102389631b9549153560571ce9d36806cdfc47521"} Jan 31 09:19:25 crc kubenswrapper[4732]: I0131 09:19:25.062101 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65","Type":"ContainerDied","Data":"59ef299c22aaa5865a8f99313b8748ea14863e08f6ff3878d52b70ef3516ec1f"} Jan 31 09:19:25 crc kubenswrapper[4732]: I0131 09:19:25.064268 4732 generic.go:334] "Generic (PLEG): container finished" podID="cea16bcf-b0b0-4d2a-ad37-ee127b9a1fd9" containerID="7645a81d1554d39a7245fbb85e3b78b90ce8cbb1ff6a3161695b88d390dd384c" exitCode=0 Jan 31 09:19:25 crc kubenswrapper[4732]: I0131 09:19:25.064293 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-proxy-59dd5dbb7c-lcxsq" event={"ID":"cea16bcf-b0b0-4d2a-ad37-ee127b9a1fd9","Type":"ContainerDied","Data":"7645a81d1554d39a7245fbb85e3b78b90ce8cbb1ff6a3161695b88d390dd384c"} Jan 31 09:19:25 crc kubenswrapper[4732]: I0131 09:19:25.685501 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-proxy-59dd5dbb7c-lcxsq" Jan 31 09:19:25 crc kubenswrapper[4732]: I0131 09:19:25.787379 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cea16bcf-b0b0-4d2a-ad37-ee127b9a1fd9-run-httpd\") pod \"cea16bcf-b0b0-4d2a-ad37-ee127b9a1fd9\" (UID: \"cea16bcf-b0b0-4d2a-ad37-ee127b9a1fd9\") " Jan 31 09:19:25 crc kubenswrapper[4732]: I0131 09:19:25.787482 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/cea16bcf-b0b0-4d2a-ad37-ee127b9a1fd9-etc-swift\") pod \"cea16bcf-b0b0-4d2a-ad37-ee127b9a1fd9\" (UID: \"cea16bcf-b0b0-4d2a-ad37-ee127b9a1fd9\") " Jan 31 09:19:25 crc kubenswrapper[4732]: I0131 09:19:25.787542 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cea16bcf-b0b0-4d2a-ad37-ee127b9a1fd9-log-httpd\") pod \"cea16bcf-b0b0-4d2a-ad37-ee127b9a1fd9\" (UID: \"cea16bcf-b0b0-4d2a-ad37-ee127b9a1fd9\") " Jan 31 09:19:25 crc kubenswrapper[4732]: I0131 09:19:25.787585 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c4pgj\" (UniqueName: \"kubernetes.io/projected/cea16bcf-b0b0-4d2a-ad37-ee127b9a1fd9-kube-api-access-c4pgj\") pod \"cea16bcf-b0b0-4d2a-ad37-ee127b9a1fd9\" (UID: \"cea16bcf-b0b0-4d2a-ad37-ee127b9a1fd9\") " Jan 31 09:19:25 crc kubenswrapper[4732]: I0131 09:19:25.787736 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cea16bcf-b0b0-4d2a-ad37-ee127b9a1fd9-config-data\") pod \"cea16bcf-b0b0-4d2a-ad37-ee127b9a1fd9\" (UID: \"cea16bcf-b0b0-4d2a-ad37-ee127b9a1fd9\") " Jan 31 09:19:25 crc kubenswrapper[4732]: I0131 09:19:25.789356 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cea16bcf-b0b0-4d2a-ad37-ee127b9a1fd9-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "cea16bcf-b0b0-4d2a-ad37-ee127b9a1fd9" (UID: "cea16bcf-b0b0-4d2a-ad37-ee127b9a1fd9"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:19:25 crc kubenswrapper[4732]: I0131 09:19:25.789430 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cea16bcf-b0b0-4d2a-ad37-ee127b9a1fd9-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "cea16bcf-b0b0-4d2a-ad37-ee127b9a1fd9" (UID: "cea16bcf-b0b0-4d2a-ad37-ee127b9a1fd9"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:19:25 crc kubenswrapper[4732]: I0131 09:19:25.804647 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cea16bcf-b0b0-4d2a-ad37-ee127b9a1fd9-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "cea16bcf-b0b0-4d2a-ad37-ee127b9a1fd9" (UID: "cea16bcf-b0b0-4d2a-ad37-ee127b9a1fd9"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:19:25 crc kubenswrapper[4732]: I0131 09:19:25.804934 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cea16bcf-b0b0-4d2a-ad37-ee127b9a1fd9-kube-api-access-c4pgj" (OuterVolumeSpecName: "kube-api-access-c4pgj") pod "cea16bcf-b0b0-4d2a-ad37-ee127b9a1fd9" (UID: "cea16bcf-b0b0-4d2a-ad37-ee127b9a1fd9"). InnerVolumeSpecName "kube-api-access-c4pgj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:19:25 crc kubenswrapper[4732]: I0131 09:19:25.850801 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cea16bcf-b0b0-4d2a-ad37-ee127b9a1fd9-config-data" (OuterVolumeSpecName: "config-data") pod "cea16bcf-b0b0-4d2a-ad37-ee127b9a1fd9" (UID: "cea16bcf-b0b0-4d2a-ad37-ee127b9a1fd9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:19:25 crc kubenswrapper[4732]: I0131 09:19:25.889454 4732 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cea16bcf-b0b0-4d2a-ad37-ee127b9a1fd9-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:25 crc kubenswrapper[4732]: I0131 09:19:25.889499 4732 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/cea16bcf-b0b0-4d2a-ad37-ee127b9a1fd9-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:25 crc kubenswrapper[4732]: I0131 09:19:25.889512 4732 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cea16bcf-b0b0-4d2a-ad37-ee127b9a1fd9-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:25 crc kubenswrapper[4732]: I0131 09:19:25.889527 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c4pgj\" (UniqueName: \"kubernetes.io/projected/cea16bcf-b0b0-4d2a-ad37-ee127b9a1fd9-kube-api-access-c4pgj\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:25 crc kubenswrapper[4732]: I0131 09:19:25.889541 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cea16bcf-b0b0-4d2a-ad37-ee127b9a1fd9-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:26 crc kubenswrapper[4732]: I0131 09:19:26.089288 4732 generic.go:334] "Generic (PLEG): container finished" podID="1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65" containerID="a768c10034a64a5a20c9c6d0006c491e9b64bcedd25898cbbfa5a7d80612ad9a" exitCode=0 Jan 31 09:19:26 crc kubenswrapper[4732]: I0131 09:19:26.089322 4732 generic.go:334] "Generic (PLEG): container finished" podID="1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65" 
containerID="76755a52198a018f0a2d0f0ab371558b237a7993168d70acf02d525f01693f21" exitCode=0 Jan 31 09:19:26 crc kubenswrapper[4732]: I0131 09:19:26.089364 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65","Type":"ContainerDied","Data":"a768c10034a64a5a20c9c6d0006c491e9b64bcedd25898cbbfa5a7d80612ad9a"} Jan 31 09:19:26 crc kubenswrapper[4732]: I0131 09:19:26.089415 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65","Type":"ContainerDied","Data":"76755a52198a018f0a2d0f0ab371558b237a7993168d70acf02d525f01693f21"} Jan 31 09:19:26 crc kubenswrapper[4732]: I0131 09:19:26.091642 4732 generic.go:334] "Generic (PLEG): container finished" podID="cea16bcf-b0b0-4d2a-ad37-ee127b9a1fd9" containerID="14e28a091b47ebba4b423f5066db9e9d08dac2ef7f479ee58c476aa4be9c8628" exitCode=0 Jan 31 09:19:26 crc kubenswrapper[4732]: I0131 09:19:26.091713 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-proxy-59dd5dbb7c-lcxsq" event={"ID":"cea16bcf-b0b0-4d2a-ad37-ee127b9a1fd9","Type":"ContainerDied","Data":"14e28a091b47ebba4b423f5066db9e9d08dac2ef7f479ee58c476aa4be9c8628"} Jan 31 09:19:26 crc kubenswrapper[4732]: I0131 09:19:26.091750 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-proxy-59dd5dbb7c-lcxsq" event={"ID":"cea16bcf-b0b0-4d2a-ad37-ee127b9a1fd9","Type":"ContainerDied","Data":"84936fd9e60dfd8bff06c859b11fa4812f19d8ba83da60cc1b9233b2901d5e51"} Jan 31 09:19:26 crc kubenswrapper[4732]: I0131 09:19:26.091772 4732 scope.go:117] "RemoveContainer" containerID="14e28a091b47ebba4b423f5066db9e9d08dac2ef7f479ee58c476aa4be9c8628" Jan 31 09:19:26 crc kubenswrapper[4732]: I0131 09:19:26.091982 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-proxy-59dd5dbb7c-lcxsq" Jan 31 09:19:26 crc kubenswrapper[4732]: I0131 09:19:26.109419 4732 scope.go:117] "RemoveContainer" containerID="7645a81d1554d39a7245fbb85e3b78b90ce8cbb1ff6a3161695b88d390dd384c" Jan 31 09:19:26 crc kubenswrapper[4732]: I0131 09:19:26.124309 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-proxy-59dd5dbb7c-lcxsq"] Jan 31 09:19:26 crc kubenswrapper[4732]: I0131 09:19:26.130757 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-proxy-59dd5dbb7c-lcxsq"] Jan 31 09:19:26 crc kubenswrapper[4732]: I0131 09:19:26.133084 4732 scope.go:117] "RemoveContainer" containerID="14e28a091b47ebba4b423f5066db9e9d08dac2ef7f479ee58c476aa4be9c8628" Jan 31 09:19:26 crc kubenswrapper[4732]: E0131 09:19:26.133540 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14e28a091b47ebba4b423f5066db9e9d08dac2ef7f479ee58c476aa4be9c8628\": container with ID starting with 14e28a091b47ebba4b423f5066db9e9d08dac2ef7f479ee58c476aa4be9c8628 not found: ID does not exist" containerID="14e28a091b47ebba4b423f5066db9e9d08dac2ef7f479ee58c476aa4be9c8628" Jan 31 09:19:26 crc kubenswrapper[4732]: I0131 09:19:26.133619 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14e28a091b47ebba4b423f5066db9e9d08dac2ef7f479ee58c476aa4be9c8628"} err="failed to get container status \"14e28a091b47ebba4b423f5066db9e9d08dac2ef7f479ee58c476aa4be9c8628\": rpc error: code = NotFound desc = could not find container \"14e28a091b47ebba4b423f5066db9e9d08dac2ef7f479ee58c476aa4be9c8628\": container with ID starting with 14e28a091b47ebba4b423f5066db9e9d08dac2ef7f479ee58c476aa4be9c8628 not found: ID does not exist" Jan 31 09:19:26 crc kubenswrapper[4732]: I0131 09:19:26.133682 4732 scope.go:117] "RemoveContainer" containerID="7645a81d1554d39a7245fbb85e3b78b90ce8cbb1ff6a3161695b88d390dd384c" Jan 31 09:19:26 crc kubenswrapper[4732]: E0131 09:19:26.134166 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7645a81d1554d39a7245fbb85e3b78b90ce8cbb1ff6a3161695b88d390dd384c\": container with ID starting with 7645a81d1554d39a7245fbb85e3b78b90ce8cbb1ff6a3161695b88d390dd384c not found: ID does not exist" containerID="7645a81d1554d39a7245fbb85e3b78b90ce8cbb1ff6a3161695b88d390dd384c" Jan 31 09:19:26 crc kubenswrapper[4732]: I0131 09:19:26.134194 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7645a81d1554d39a7245fbb85e3b78b90ce8cbb1ff6a3161695b88d390dd384c"} err="failed to get container status \"7645a81d1554d39a7245fbb85e3b78b90ce8cbb1ff6a3161695b88d390dd384c\": rpc error: code = NotFound desc = could not find container \"7645a81d1554d39a7245fbb85e3b78b90ce8cbb1ff6a3161695b88d390dd384c\": container with ID starting with 7645a81d1554d39a7245fbb85e3b78b90ce8cbb1ff6a3161695b88d390dd384c not found: ID does not exist" Jan 31 09:19:26 crc kubenswrapper[4732]: I0131 09:19:26.554936 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cea16bcf-b0b0-4d2a-ad37-ee127b9a1fd9" path="/var/lib/kubelet/pods/cea16bcf-b0b0-4d2a-ad37-ee127b9a1fd9/volumes" Jan 31 09:19:47 crc kubenswrapper[4732]: I0131 09:19:47.498246 4732 patch_prober.go:28] interesting pod/machine-config-daemon-jnbt8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 09:19:47 crc kubenswrapper[4732]: I0131 09:19:47.498882 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" podUID="7d790207-d357-4b47-87bf-5b505e061820" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 09:19:54 crc kubenswrapper[4732]: I0131 09:19:54.960192 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:19:55 crc kubenswrapper[4732]: I0131 09:19:55.127034 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65-cache\") pod \"1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65\" (UID: \"1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65\") " Jan 31 09:19:55 crc kubenswrapper[4732]: I0131 09:19:55.127102 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65-lock\") pod \"1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65\" (UID: \"1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65\") " Jan 31 09:19:55 crc kubenswrapper[4732]: I0131 09:19:55.127205 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65-etc-swift\") pod \"1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65\" (UID: \"1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65\") " Jan 31 09:19:55 crc kubenswrapper[4732]: I0131 09:19:55.127267 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5fb9\" (UniqueName: \"kubernetes.io/projected/1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65-kube-api-access-b5fb9\") pod \"1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65\" (UID: \"1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65\") " Jan 31 09:19:55 crc kubenswrapper[4732]: I0131 09:19:55.127426 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65\" (UID: \"1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65\") " Jan 31 09:19:55 crc kubenswrapper[4732]: I0131 09:19:55.127570 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65-lock" (OuterVolumeSpecName: "lock") pod "1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65" (UID: "1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:19:55 crc kubenswrapper[4732]: I0131 09:19:55.127965 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65-cache" (OuterVolumeSpecName: "cache") pod "1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65" (UID: "1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65"). InnerVolumeSpecName "cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:19:55 crc kubenswrapper[4732]: I0131 09:19:55.127974 4732 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65-lock\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:55 crc kubenswrapper[4732]: I0131 09:19:55.133306 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "swift") pod "1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65" (UID: "1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 09:19:55 crc kubenswrapper[4732]: I0131 09:19:55.133344 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65-kube-api-access-b5fb9" (OuterVolumeSpecName: "kube-api-access-b5fb9") pod "1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65" (UID: "1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65"). InnerVolumeSpecName "kube-api-access-b5fb9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:19:55 crc kubenswrapper[4732]: I0131 09:19:55.133400 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65" (UID: "1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:19:55 crc kubenswrapper[4732]: I0131 09:19:55.229162 4732 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65-cache\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:55 crc kubenswrapper[4732]: I0131 09:19:55.229212 4732 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:55 crc kubenswrapper[4732]: I0131 09:19:55.229233 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b5fb9\" (UniqueName: \"kubernetes.io/projected/1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65-kube-api-access-b5fb9\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:55 crc kubenswrapper[4732]: I0131 09:19:55.229291 4732 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Jan 31 09:19:55 crc kubenswrapper[4732]: I0131 09:19:55.258988 4732 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Jan 31 09:19:55 crc kubenswrapper[4732]: I0131 09:19:55.330291 4732 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Jan 31 09:19:55 crc kubenswrapper[4732]: I0131 09:19:55.340384 4732 generic.go:334] "Generic (PLEG): container finished" podID="1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65" containerID="1183d753033ef14dd779291a910acae735ec0ef3a1e7104272c1b8d040389479" exitCode=137 Jan 31 09:19:55 crc kubenswrapper[4732]: I0131 09:19:55.340456 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" 
event={"ID":"1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65","Type":"ContainerDied","Data":"1183d753033ef14dd779291a910acae735ec0ef3a1e7104272c1b8d040389479"} Jan 31 09:19:55 crc kubenswrapper[4732]: I0131 09:19:55.340557 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65","Type":"ContainerDied","Data":"34458a4f4a5d5e0c1ee442a308b559938e3621f6dae5595249f026b6e797962d"} Jan 31 09:19:55 crc kubenswrapper[4732]: I0131 09:19:55.340586 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:19:55 crc kubenswrapper[4732]: I0131 09:19:55.340594 4732 scope.go:117] "RemoveContainer" containerID="38cdf7c7cb89df8d1c3787fb632d47d09b331d20b08aa93cb17f6a88341fb337" Jan 31 09:19:55 crc kubenswrapper[4732]: I0131 09:19:55.369444 4732 scope.go:117] "RemoveContainer" containerID="1183d753033ef14dd779291a910acae735ec0ef3a1e7104272c1b8d040389479" Jan 31 09:19:55 crc kubenswrapper[4732]: I0131 09:19:55.382717 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-storage-0"] Jan 31 09:19:55 crc kubenswrapper[4732]: I0131 09:19:55.393721 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-storage-0"] Jan 31 09:19:55 crc kubenswrapper[4732]: I0131 09:19:55.401049 4732 scope.go:117] "RemoveContainer" containerID="5521d09393a37a16e0269ff010dbf8c3432cdb69f1bca2e5b6f057535e6adaff" Jan 31 09:19:55 crc kubenswrapper[4732]: I0131 09:19:55.416943 4732 scope.go:117] "RemoveContainer" containerID="4c4fd17a487c3ad507bed8e66df924a39a8c2f99f7e806c1497f9605883141b8" Jan 31 09:19:55 crc kubenswrapper[4732]: I0131 09:19:55.435508 4732 scope.go:117] "RemoveContainer" containerID="4ff9ae6192a8f62a8feaec1a934822bc986fef6a2a789c88bf3fe4a983cc5f91" Jan 31 09:19:55 crc kubenswrapper[4732]: I0131 09:19:55.451374 4732 scope.go:117] "RemoveContainer" containerID="dd273c25d7dc91e5954ff8d5fbc44b6eeefa5af49ff09461c024f0d29819d844" Jan 31 09:19:55 crc kubenswrapper[4732]: I0131 09:19:55.471065 4732 scope.go:117] "RemoveContainer" containerID="adf3519dd1a3dc4b8b31ed35e741ea5a2ae39875837269f96a619bf6446696ad" Jan 31 09:19:55 crc kubenswrapper[4732]: I0131 09:19:55.494049 4732 scope.go:117] "RemoveContainer" containerID="9c260d5b071de9f983a884926c74e3c7965a5566fdf129a8231af8aff9341849" Jan 31 09:19:55 crc kubenswrapper[4732]: I0131 09:19:55.512249 4732 scope.go:117] "RemoveContainer" containerID="c4fa65606e0bdcb6f86bd4340763a51bfe2fa0b29c86e336932e2d7b4d5a7cc7" Jan 31 09:19:55 crc kubenswrapper[4732]: I0131 09:19:55.531114 4732 scope.go:117] "RemoveContainer" containerID="31889d60dae084e50ec608b663e4d80dd770e7914146b1993679075fbcb5c050" Jan 31 09:19:55 crc kubenswrapper[4732]: I0131 09:19:55.548981 4732 scope.go:117] "RemoveContainer" containerID="5de79c9073a05f38da4352247b5e904a357945349a71ed92906b206964dbd2ef" Jan 31 09:19:55 crc kubenswrapper[4732]: I0131 09:19:55.567560 4732 scope.go:117] "RemoveContainer" containerID="a768c10034a64a5a20c9c6d0006c491e9b64bcedd25898cbbfa5a7d80612ad9a" Jan 31 09:19:55 crc kubenswrapper[4732]: I0131 09:19:55.591036 4732 scope.go:117] "RemoveContainer" containerID="3c0e4beb7194a2ba16fa5a8867712d4fc5df7cc2c8e674527846af12764e0327" Jan 31 09:19:55 crc kubenswrapper[4732]: I0131 09:19:55.609881 4732 scope.go:117] "RemoveContainer" containerID="14b965bc1ec24ff789b68d1102389631b9549153560571ce9d36806cdfc47521" Jan 31 09:19:55 crc kubenswrapper[4732]: I0131 09:19:55.624676 4732 scope.go:117] 
"RemoveContainer" containerID="59ef299c22aaa5865a8f99313b8748ea14863e08f6ff3878d52b70ef3516ec1f" Jan 31 09:19:55 crc kubenswrapper[4732]: I0131 09:19:55.641310 4732 scope.go:117] "RemoveContainer" containerID="76755a52198a018f0a2d0f0ab371558b237a7993168d70acf02d525f01693f21" Jan 31 09:19:55 crc kubenswrapper[4732]: I0131 09:19:55.657677 4732 scope.go:117] "RemoveContainer" containerID="38cdf7c7cb89df8d1c3787fb632d47d09b331d20b08aa93cb17f6a88341fb337" Jan 31 09:19:55 crc kubenswrapper[4732]: E0131 09:19:55.658293 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38cdf7c7cb89df8d1c3787fb632d47d09b331d20b08aa93cb17f6a88341fb337\": container with ID starting with 38cdf7c7cb89df8d1c3787fb632d47d09b331d20b08aa93cb17f6a88341fb337 not found: ID does not exist" containerID="38cdf7c7cb89df8d1c3787fb632d47d09b331d20b08aa93cb17f6a88341fb337" Jan 31 09:19:55 crc kubenswrapper[4732]: I0131 09:19:55.658342 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38cdf7c7cb89df8d1c3787fb632d47d09b331d20b08aa93cb17f6a88341fb337"} err="failed to get container status \"38cdf7c7cb89df8d1c3787fb632d47d09b331d20b08aa93cb17f6a88341fb337\": rpc error: code = NotFound desc = could not find container \"38cdf7c7cb89df8d1c3787fb632d47d09b331d20b08aa93cb17f6a88341fb337\": container with ID starting with 38cdf7c7cb89df8d1c3787fb632d47d09b331d20b08aa93cb17f6a88341fb337 not found: ID does not exist" Jan 31 09:19:55 crc kubenswrapper[4732]: I0131 09:19:55.658373 4732 scope.go:117] "RemoveContainer" containerID="1183d753033ef14dd779291a910acae735ec0ef3a1e7104272c1b8d040389479" Jan 31 09:19:55 crc kubenswrapper[4732]: E0131 09:19:55.658909 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1183d753033ef14dd779291a910acae735ec0ef3a1e7104272c1b8d040389479\": container with ID starting with 1183d753033ef14dd779291a910acae735ec0ef3a1e7104272c1b8d040389479 not found: ID does not exist" containerID="1183d753033ef14dd779291a910acae735ec0ef3a1e7104272c1b8d040389479" Jan 31 09:19:55 crc kubenswrapper[4732]: I0131 09:19:55.658938 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1183d753033ef14dd779291a910acae735ec0ef3a1e7104272c1b8d040389479"} err="failed to get container status \"1183d753033ef14dd779291a910acae735ec0ef3a1e7104272c1b8d040389479\": rpc error: code = NotFound desc = could not find container \"1183d753033ef14dd779291a910acae735ec0ef3a1e7104272c1b8d040389479\": container with ID starting with 1183d753033ef14dd779291a910acae735ec0ef3a1e7104272c1b8d040389479 not found: ID does not exist" Jan 31 09:19:55 crc kubenswrapper[4732]: I0131 09:19:55.658956 4732 scope.go:117] "RemoveContainer" containerID="5521d09393a37a16e0269ff010dbf8c3432cdb69f1bca2e5b6f057535e6adaff" Jan 31 09:19:55 crc kubenswrapper[4732]: E0131 09:19:55.659295 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5521d09393a37a16e0269ff010dbf8c3432cdb69f1bca2e5b6f057535e6adaff\": container with ID starting with 5521d09393a37a16e0269ff010dbf8c3432cdb69f1bca2e5b6f057535e6adaff not found: ID does not exist" containerID="5521d09393a37a16e0269ff010dbf8c3432cdb69f1bca2e5b6f057535e6adaff" Jan 31 09:19:55 crc kubenswrapper[4732]: I0131 09:19:55.659323 4732 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5521d09393a37a16e0269ff010dbf8c3432cdb69f1bca2e5b6f057535e6adaff"} err="failed to get container status \"5521d09393a37a16e0269ff010dbf8c3432cdb69f1bca2e5b6f057535e6adaff\": rpc error: code = NotFound desc = could not find container \"5521d09393a37a16e0269ff010dbf8c3432cdb69f1bca2e5b6f057535e6adaff\": container with ID starting with 5521d09393a37a16e0269ff010dbf8c3432cdb69f1bca2e5b6f057535e6adaff not found: ID does not exist" Jan 31 09:19:55 crc kubenswrapper[4732]: I0131 09:19:55.659340 4732 scope.go:117] "RemoveContainer" containerID="4c4fd17a487c3ad507bed8e66df924a39a8c2f99f7e806c1497f9605883141b8" Jan 31 09:19:55 crc kubenswrapper[4732]: E0131 09:19:55.659810 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c4fd17a487c3ad507bed8e66df924a39a8c2f99f7e806c1497f9605883141b8\": container with ID starting with 4c4fd17a487c3ad507bed8e66df924a39a8c2f99f7e806c1497f9605883141b8 not found: ID does not exist" containerID="4c4fd17a487c3ad507bed8e66df924a39a8c2f99f7e806c1497f9605883141b8" Jan 31 09:19:55 crc kubenswrapper[4732]: I0131 09:19:55.659838 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c4fd17a487c3ad507bed8e66df924a39a8c2f99f7e806c1497f9605883141b8"} err="failed to get container status \"4c4fd17a487c3ad507bed8e66df924a39a8c2f99f7e806c1497f9605883141b8\": rpc error: code = NotFound desc = could not find container \"4c4fd17a487c3ad507bed8e66df924a39a8c2f99f7e806c1497f9605883141b8\": container with ID starting with 4c4fd17a487c3ad507bed8e66df924a39a8c2f99f7e806c1497f9605883141b8 not found: ID does not exist" Jan 31 09:19:55 crc kubenswrapper[4732]: I0131 09:19:55.659866 4732 scope.go:117] "RemoveContainer" containerID="4ff9ae6192a8f62a8feaec1a934822bc986fef6a2a789c88bf3fe4a983cc5f91" Jan 31 09:19:55 crc kubenswrapper[4732]: E0131 09:19:55.660681 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ff9ae6192a8f62a8feaec1a934822bc986fef6a2a789c88bf3fe4a983cc5f91\": container with ID starting with 4ff9ae6192a8f62a8feaec1a934822bc986fef6a2a789c88bf3fe4a983cc5f91 not found: ID does not exist" containerID="4ff9ae6192a8f62a8feaec1a934822bc986fef6a2a789c88bf3fe4a983cc5f91" Jan 31 09:19:55 crc kubenswrapper[4732]: I0131 09:19:55.660710 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ff9ae6192a8f62a8feaec1a934822bc986fef6a2a789c88bf3fe4a983cc5f91"} err="failed to get container status \"4ff9ae6192a8f62a8feaec1a934822bc986fef6a2a789c88bf3fe4a983cc5f91\": rpc error: code = NotFound desc = could not find container \"4ff9ae6192a8f62a8feaec1a934822bc986fef6a2a789c88bf3fe4a983cc5f91\": container with ID starting with 4ff9ae6192a8f62a8feaec1a934822bc986fef6a2a789c88bf3fe4a983cc5f91 not found: ID does not exist" Jan 31 09:19:55 crc kubenswrapper[4732]: I0131 09:19:55.660733 4732 scope.go:117] "RemoveContainer" containerID="dd273c25d7dc91e5954ff8d5fbc44b6eeefa5af49ff09461c024f0d29819d844" Jan 31 09:19:55 crc kubenswrapper[4732]: E0131 09:19:55.661173 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd273c25d7dc91e5954ff8d5fbc44b6eeefa5af49ff09461c024f0d29819d844\": container with ID starting with dd273c25d7dc91e5954ff8d5fbc44b6eeefa5af49ff09461c024f0d29819d844 not found: ID does not exist" 
containerID="dd273c25d7dc91e5954ff8d5fbc44b6eeefa5af49ff09461c024f0d29819d844" Jan 31 09:19:55 crc kubenswrapper[4732]: I0131 09:19:55.661203 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd273c25d7dc91e5954ff8d5fbc44b6eeefa5af49ff09461c024f0d29819d844"} err="failed to get container status \"dd273c25d7dc91e5954ff8d5fbc44b6eeefa5af49ff09461c024f0d29819d844\": rpc error: code = NotFound desc = could not find container \"dd273c25d7dc91e5954ff8d5fbc44b6eeefa5af49ff09461c024f0d29819d844\": container with ID starting with dd273c25d7dc91e5954ff8d5fbc44b6eeefa5af49ff09461c024f0d29819d844 not found: ID does not exist" Jan 31 09:19:55 crc kubenswrapper[4732]: I0131 09:19:55.661222 4732 scope.go:117] "RemoveContainer" containerID="adf3519dd1a3dc4b8b31ed35e741ea5a2ae39875837269f96a619bf6446696ad" Jan 31 09:19:55 crc kubenswrapper[4732]: E0131 09:19:55.661536 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"adf3519dd1a3dc4b8b31ed35e741ea5a2ae39875837269f96a619bf6446696ad\": container with ID starting with adf3519dd1a3dc4b8b31ed35e741ea5a2ae39875837269f96a619bf6446696ad not found: ID does not exist" containerID="adf3519dd1a3dc4b8b31ed35e741ea5a2ae39875837269f96a619bf6446696ad" Jan 31 09:19:55 crc kubenswrapper[4732]: I0131 09:19:55.661589 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"adf3519dd1a3dc4b8b31ed35e741ea5a2ae39875837269f96a619bf6446696ad"} err="failed to get container status \"adf3519dd1a3dc4b8b31ed35e741ea5a2ae39875837269f96a619bf6446696ad\": rpc error: code = NotFound desc = could not find container \"adf3519dd1a3dc4b8b31ed35e741ea5a2ae39875837269f96a619bf6446696ad\": container with ID starting with adf3519dd1a3dc4b8b31ed35e741ea5a2ae39875837269f96a619bf6446696ad not found: ID does not exist" Jan 31 09:19:55 crc kubenswrapper[4732]: I0131 09:19:55.661623 4732 scope.go:117] "RemoveContainer" containerID="9c260d5b071de9f983a884926c74e3c7965a5566fdf129a8231af8aff9341849" Jan 31 09:19:55 crc kubenswrapper[4732]: E0131 09:19:55.661945 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c260d5b071de9f983a884926c74e3c7965a5566fdf129a8231af8aff9341849\": container with ID starting with 9c260d5b071de9f983a884926c74e3c7965a5566fdf129a8231af8aff9341849 not found: ID does not exist" containerID="9c260d5b071de9f983a884926c74e3c7965a5566fdf129a8231af8aff9341849" Jan 31 09:19:55 crc kubenswrapper[4732]: I0131 09:19:55.661980 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c260d5b071de9f983a884926c74e3c7965a5566fdf129a8231af8aff9341849"} err="failed to get container status \"9c260d5b071de9f983a884926c74e3c7965a5566fdf129a8231af8aff9341849\": rpc error: code = NotFound desc = could not find container \"9c260d5b071de9f983a884926c74e3c7965a5566fdf129a8231af8aff9341849\": container with ID starting with 9c260d5b071de9f983a884926c74e3c7965a5566fdf129a8231af8aff9341849 not found: ID does not exist" Jan 31 09:19:55 crc kubenswrapper[4732]: I0131 09:19:55.662015 4732 scope.go:117] "RemoveContainer" containerID="c4fa65606e0bdcb6f86bd4340763a51bfe2fa0b29c86e336932e2d7b4d5a7cc7" Jan 31 09:19:55 crc kubenswrapper[4732]: E0131 09:19:55.662334 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"c4fa65606e0bdcb6f86bd4340763a51bfe2fa0b29c86e336932e2d7b4d5a7cc7\": container with ID starting with c4fa65606e0bdcb6f86bd4340763a51bfe2fa0b29c86e336932e2d7b4d5a7cc7 not found: ID does not exist" containerID="c4fa65606e0bdcb6f86bd4340763a51bfe2fa0b29c86e336932e2d7b4d5a7cc7" Jan 31 09:19:55 crc kubenswrapper[4732]: I0131 09:19:55.662386 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4fa65606e0bdcb6f86bd4340763a51bfe2fa0b29c86e336932e2d7b4d5a7cc7"} err="failed to get container status \"c4fa65606e0bdcb6f86bd4340763a51bfe2fa0b29c86e336932e2d7b4d5a7cc7\": rpc error: code = NotFound desc = could not find container \"c4fa65606e0bdcb6f86bd4340763a51bfe2fa0b29c86e336932e2d7b4d5a7cc7\": container with ID starting with c4fa65606e0bdcb6f86bd4340763a51bfe2fa0b29c86e336932e2d7b4d5a7cc7 not found: ID does not exist" Jan 31 09:19:55 crc kubenswrapper[4732]: I0131 09:19:55.662417 4732 scope.go:117] "RemoveContainer" containerID="31889d60dae084e50ec608b663e4d80dd770e7914146b1993679075fbcb5c050" Jan 31 09:19:55 crc kubenswrapper[4732]: E0131 09:19:55.662806 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31889d60dae084e50ec608b663e4d80dd770e7914146b1993679075fbcb5c050\": container with ID starting with 31889d60dae084e50ec608b663e4d80dd770e7914146b1993679075fbcb5c050 not found: ID does not exist" containerID="31889d60dae084e50ec608b663e4d80dd770e7914146b1993679075fbcb5c050" Jan 31 09:19:55 crc kubenswrapper[4732]: I0131 09:19:55.662838 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31889d60dae084e50ec608b663e4d80dd770e7914146b1993679075fbcb5c050"} err="failed to get container status \"31889d60dae084e50ec608b663e4d80dd770e7914146b1993679075fbcb5c050\": rpc error: code = NotFound desc = could not find container \"31889d60dae084e50ec608b663e4d80dd770e7914146b1993679075fbcb5c050\": container with ID starting with 31889d60dae084e50ec608b663e4d80dd770e7914146b1993679075fbcb5c050 not found: ID does not exist" Jan 31 09:19:55 crc kubenswrapper[4732]: I0131 09:19:55.662858 4732 scope.go:117] "RemoveContainer" containerID="5de79c9073a05f38da4352247b5e904a357945349a71ed92906b206964dbd2ef" Jan 31 09:19:55 crc kubenswrapper[4732]: E0131 09:19:55.663100 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5de79c9073a05f38da4352247b5e904a357945349a71ed92906b206964dbd2ef\": container with ID starting with 5de79c9073a05f38da4352247b5e904a357945349a71ed92906b206964dbd2ef not found: ID does not exist" containerID="5de79c9073a05f38da4352247b5e904a357945349a71ed92906b206964dbd2ef" Jan 31 09:19:55 crc kubenswrapper[4732]: I0131 09:19:55.663128 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5de79c9073a05f38da4352247b5e904a357945349a71ed92906b206964dbd2ef"} err="failed to get container status \"5de79c9073a05f38da4352247b5e904a357945349a71ed92906b206964dbd2ef\": rpc error: code = NotFound desc = could not find container \"5de79c9073a05f38da4352247b5e904a357945349a71ed92906b206964dbd2ef\": container with ID starting with 5de79c9073a05f38da4352247b5e904a357945349a71ed92906b206964dbd2ef not found: ID does not exist" Jan 31 09:19:55 crc kubenswrapper[4732]: I0131 09:19:55.663147 4732 scope.go:117] "RemoveContainer" containerID="a768c10034a64a5a20c9c6d0006c491e9b64bcedd25898cbbfa5a7d80612ad9a" Jan 31 09:19:55 crc 
kubenswrapper[4732]: E0131 09:19:55.663392 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a768c10034a64a5a20c9c6d0006c491e9b64bcedd25898cbbfa5a7d80612ad9a\": container with ID starting with a768c10034a64a5a20c9c6d0006c491e9b64bcedd25898cbbfa5a7d80612ad9a not found: ID does not exist" containerID="a768c10034a64a5a20c9c6d0006c491e9b64bcedd25898cbbfa5a7d80612ad9a" Jan 31 09:19:55 crc kubenswrapper[4732]: I0131 09:19:55.663453 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a768c10034a64a5a20c9c6d0006c491e9b64bcedd25898cbbfa5a7d80612ad9a"} err="failed to get container status \"a768c10034a64a5a20c9c6d0006c491e9b64bcedd25898cbbfa5a7d80612ad9a\": rpc error: code = NotFound desc = could not find container \"a768c10034a64a5a20c9c6d0006c491e9b64bcedd25898cbbfa5a7d80612ad9a\": container with ID starting with a768c10034a64a5a20c9c6d0006c491e9b64bcedd25898cbbfa5a7d80612ad9a not found: ID does not exist" Jan 31 09:19:55 crc kubenswrapper[4732]: I0131 09:19:55.663470 4732 scope.go:117] "RemoveContainer" containerID="3c0e4beb7194a2ba16fa5a8867712d4fc5df7cc2c8e674527846af12764e0327" Jan 31 09:19:55 crc kubenswrapper[4732]: E0131 09:19:55.663772 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c0e4beb7194a2ba16fa5a8867712d4fc5df7cc2c8e674527846af12764e0327\": container with ID starting with 3c0e4beb7194a2ba16fa5a8867712d4fc5df7cc2c8e674527846af12764e0327 not found: ID does not exist" containerID="3c0e4beb7194a2ba16fa5a8867712d4fc5df7cc2c8e674527846af12764e0327" Jan 31 09:19:55 crc kubenswrapper[4732]: I0131 09:19:55.663801 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c0e4beb7194a2ba16fa5a8867712d4fc5df7cc2c8e674527846af12764e0327"} err="failed to get container status \"3c0e4beb7194a2ba16fa5a8867712d4fc5df7cc2c8e674527846af12764e0327\": rpc error: code = NotFound desc = could not find container \"3c0e4beb7194a2ba16fa5a8867712d4fc5df7cc2c8e674527846af12764e0327\": container with ID starting with 3c0e4beb7194a2ba16fa5a8867712d4fc5df7cc2c8e674527846af12764e0327 not found: ID does not exist" Jan 31 09:19:55 crc kubenswrapper[4732]: I0131 09:19:55.663818 4732 scope.go:117] "RemoveContainer" containerID="14b965bc1ec24ff789b68d1102389631b9549153560571ce9d36806cdfc47521" Jan 31 09:19:55 crc kubenswrapper[4732]: E0131 09:19:55.664115 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14b965bc1ec24ff789b68d1102389631b9549153560571ce9d36806cdfc47521\": container with ID starting with 14b965bc1ec24ff789b68d1102389631b9549153560571ce9d36806cdfc47521 not found: ID does not exist" containerID="14b965bc1ec24ff789b68d1102389631b9549153560571ce9d36806cdfc47521" Jan 31 09:19:55 crc kubenswrapper[4732]: I0131 09:19:55.664144 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14b965bc1ec24ff789b68d1102389631b9549153560571ce9d36806cdfc47521"} err="failed to get container status \"14b965bc1ec24ff789b68d1102389631b9549153560571ce9d36806cdfc47521\": rpc error: code = NotFound desc = could not find container \"14b965bc1ec24ff789b68d1102389631b9549153560571ce9d36806cdfc47521\": container with ID starting with 14b965bc1ec24ff789b68d1102389631b9549153560571ce9d36806cdfc47521 not found: ID does not exist" Jan 31 09:19:55 crc kubenswrapper[4732]: 
I0131 09:19:55.664160 4732 scope.go:117] "RemoveContainer" containerID="59ef299c22aaa5865a8f99313b8748ea14863e08f6ff3878d52b70ef3516ec1f" Jan 31 09:19:55 crc kubenswrapper[4732]: E0131 09:19:55.664522 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59ef299c22aaa5865a8f99313b8748ea14863e08f6ff3878d52b70ef3516ec1f\": container with ID starting with 59ef299c22aaa5865a8f99313b8748ea14863e08f6ff3878d52b70ef3516ec1f not found: ID does not exist" containerID="59ef299c22aaa5865a8f99313b8748ea14863e08f6ff3878d52b70ef3516ec1f" Jan 31 09:19:55 crc kubenswrapper[4732]: I0131 09:19:55.664548 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59ef299c22aaa5865a8f99313b8748ea14863e08f6ff3878d52b70ef3516ec1f"} err="failed to get container status \"59ef299c22aaa5865a8f99313b8748ea14863e08f6ff3878d52b70ef3516ec1f\": rpc error: code = NotFound desc = could not find container \"59ef299c22aaa5865a8f99313b8748ea14863e08f6ff3878d52b70ef3516ec1f\": container with ID starting with 59ef299c22aaa5865a8f99313b8748ea14863e08f6ff3878d52b70ef3516ec1f not found: ID does not exist" Jan 31 09:19:55 crc kubenswrapper[4732]: I0131 09:19:55.664565 4732 scope.go:117] "RemoveContainer" containerID="76755a52198a018f0a2d0f0ab371558b237a7993168d70acf02d525f01693f21" Jan 31 09:19:55 crc kubenswrapper[4732]: E0131 09:19:55.664832 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76755a52198a018f0a2d0f0ab371558b237a7993168d70acf02d525f01693f21\": container with ID starting with 76755a52198a018f0a2d0f0ab371558b237a7993168d70acf02d525f01693f21 not found: ID does not exist" containerID="76755a52198a018f0a2d0f0ab371558b237a7993168d70acf02d525f01693f21" Jan 31 09:19:55 crc kubenswrapper[4732]: I0131 09:19:55.664860 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76755a52198a018f0a2d0f0ab371558b237a7993168d70acf02d525f01693f21"} err="failed to get container status \"76755a52198a018f0a2d0f0ab371558b237a7993168d70acf02d525f01693f21\": rpc error: code = NotFound desc = could not find container \"76755a52198a018f0a2d0f0ab371558b237a7993168d70acf02d525f01693f21\": container with ID starting with 76755a52198a018f0a2d0f0ab371558b237a7993168d70acf02d525f01693f21 not found: ID does not exist" Jan 31 09:19:56 crc kubenswrapper[4732]: I0131 09:19:56.550589 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65" path="/var/lib/kubelet/pods/1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65/volumes" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.370615 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-proxy-7d8cf99555-lvwdq"] Jan 31 09:19:57 crc kubenswrapper[4732]: E0131 09:19:57.370899 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cea16bcf-b0b0-4d2a-ad37-ee127b9a1fd9" containerName="proxy-server" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.370914 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="cea16bcf-b0b0-4d2a-ad37-ee127b9a1fd9" containerName="proxy-server" Jan 31 09:19:57 crc kubenswrapper[4732]: E0131 09:19:57.370925 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38fc93ea-e490-4d30-a742-97b668a286c5" containerName="swift-ring-rebalance" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.370933 4732 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="38fc93ea-e490-4d30-a742-97b668a286c5" containerName="swift-ring-rebalance" Jan 31 09:19:57 crc kubenswrapper[4732]: E0131 09:19:57.370944 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65" containerName="container-auditor" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.370950 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65" containerName="container-auditor" Jan 31 09:19:57 crc kubenswrapper[4732]: E0131 09:19:57.370964 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65" containerName="account-reaper" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.370969 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65" containerName="account-reaper" Jan 31 09:19:57 crc kubenswrapper[4732]: E0131 09:19:57.370983 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65" containerName="container-replicator" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.370989 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65" containerName="container-replicator" Jan 31 09:19:57 crc kubenswrapper[4732]: E0131 09:19:57.370999 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65" containerName="object-server" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.371005 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65" containerName="object-server" Jan 31 09:19:57 crc kubenswrapper[4732]: E0131 09:19:57.371013 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65" containerName="rsync" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.371019 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65" containerName="rsync" Jan 31 09:19:57 crc kubenswrapper[4732]: E0131 09:19:57.371029 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65" containerName="object-expirer" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.371034 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65" containerName="object-expirer" Jan 31 09:19:57 crc kubenswrapper[4732]: E0131 09:19:57.371045 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65" containerName="container-server" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.371050 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65" containerName="container-server" Jan 31 09:19:57 crc kubenswrapper[4732]: E0131 09:19:57.371057 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65" containerName="account-auditor" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.371063 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65" containerName="account-auditor" Jan 31 09:19:57 crc kubenswrapper[4732]: E0131 09:19:57.371070 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65" containerName="container-updater" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.371076 4732 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65" containerName="container-updater" Jan 31 09:19:57 crc kubenswrapper[4732]: E0131 09:19:57.371084 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65" containerName="account-server" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.371090 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65" containerName="account-server" Jan 31 09:19:57 crc kubenswrapper[4732]: E0131 09:19:57.371097 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65" containerName="object-updater" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.371102 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65" containerName="object-updater" Jan 31 09:19:57 crc kubenswrapper[4732]: E0131 09:19:57.371111 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65" containerName="object-auditor" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.371118 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65" containerName="object-auditor" Jan 31 09:19:57 crc kubenswrapper[4732]: E0131 09:19:57.371126 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cea16bcf-b0b0-4d2a-ad37-ee127b9a1fd9" containerName="proxy-httpd" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.371132 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="cea16bcf-b0b0-4d2a-ad37-ee127b9a1fd9" containerName="proxy-httpd" Jan 31 09:19:57 crc kubenswrapper[4732]: E0131 09:19:57.371139 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65" containerName="swift-recon-cron" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.371144 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65" containerName="swift-recon-cron" Jan 31 09:19:57 crc kubenswrapper[4732]: E0131 09:19:57.371154 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65" containerName="object-replicator" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.371159 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65" containerName="object-replicator" Jan 31 09:19:57 crc kubenswrapper[4732]: E0131 09:19:57.371168 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65" containerName="account-replicator" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.371173 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65" containerName="account-replicator" Jan 31 09:19:57 crc kubenswrapper[4732]: E0131 09:19:57.371180 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65" containerName="container-sharder" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.371185 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65" containerName="container-sharder" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.371288 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65" containerName="account-replicator" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.371299 
4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65" containerName="account-server" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.371309 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65" containerName="container-updater" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.371318 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65" containerName="account-reaper" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.371324 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65" containerName="object-replicator" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.371332 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65" containerName="rsync" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.371339 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65" containerName="object-server" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.371347 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65" containerName="container-replicator" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.371354 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65" containerName="object-updater" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.371368 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65" containerName="swift-recon-cron" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.371383 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65" containerName="container-sharder" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.371395 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65" containerName="object-auditor" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.371405 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="38fc93ea-e490-4d30-a742-97b668a286c5" containerName="swift-ring-rebalance" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.371413 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65" containerName="account-auditor" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.371423 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65" containerName="object-expirer" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.371431 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="cea16bcf-b0b0-4d2a-ad37-ee127b9a1fd9" containerName="proxy-server" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.371440 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65" containerName="container-server" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.371446 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fe66c2d-70ee-48c8-927c-4b1ba8d5cb65" containerName="container-auditor" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.371452 4732 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="cea16bcf-b0b0-4d2a-ad37-ee127b9a1fd9" containerName="proxy-httpd" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.372156 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-proxy-7d8cf99555-lvwdq" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.375285 4732 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"swift-conf" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.375384 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-files" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.375603 4732 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"swift-swift-dockercfg-nghpf" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.376286 4732 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"swift-proxy-config-data" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.435098 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-proxy-7d8cf99555-lvwdq"] Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.448729 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-storage-0"] Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.453431 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.456060 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-storage-config-data" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.459139 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8cb5e63b-882d-4388-abb1-130923832c9f-etc-swift\") pod \"swift-proxy-7d8cf99555-lvwdq\" (UID: \"8cb5e63b-882d-4388-abb1-130923832c9f\") " pod="swift-kuttl-tests/swift-proxy-7d8cf99555-lvwdq" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.459279 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cb5e63b-882d-4388-abb1-130923832c9f-config-data\") pod \"swift-proxy-7d8cf99555-lvwdq\" (UID: \"8cb5e63b-882d-4388-abb1-130923832c9f\") " pod="swift-kuttl-tests/swift-proxy-7d8cf99555-lvwdq" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.459332 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8cb5e63b-882d-4388-abb1-130923832c9f-log-httpd\") pod \"swift-proxy-7d8cf99555-lvwdq\" (UID: \"8cb5e63b-882d-4388-abb1-130923832c9f\") " pod="swift-kuttl-tests/swift-proxy-7d8cf99555-lvwdq" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.459377 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8cb5e63b-882d-4388-abb1-130923832c9f-run-httpd\") pod \"swift-proxy-7d8cf99555-lvwdq\" (UID: \"8cb5e63b-882d-4388-abb1-130923832c9f\") " pod="swift-kuttl-tests/swift-proxy-7d8cf99555-lvwdq" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.459504 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5n98z\" (UniqueName: 
\"kubernetes.io/projected/8cb5e63b-882d-4388-abb1-130923832c9f-kube-api-access-5n98z\") pod \"swift-proxy-7d8cf99555-lvwdq\" (UID: \"8cb5e63b-882d-4388-abb1-130923832c9f\") " pod="swift-kuttl-tests/swift-proxy-7d8cf99555-lvwdq" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.477818 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-storage-2"] Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.495752 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-storage-1"] Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.495987 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-storage-2" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.500735 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-storage-0"] Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.500771 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-storage-2"] Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.500893 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-storage-1" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.501065 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-storage-1"] Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.560954 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/ea3117d7-0038-4ca5-bee5-ae76db9a12eb-lock\") pod \"swift-storage-0\" (UID: \"ea3117d7-0038-4ca5-bee5-ae76db9a12eb\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.561023 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cb5e63b-882d-4388-abb1-130923832c9f-config-data\") pod \"swift-proxy-7d8cf99555-lvwdq\" (UID: \"8cb5e63b-882d-4388-abb1-130923832c9f\") " pod="swift-kuttl-tests/swift-proxy-7d8cf99555-lvwdq" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.561050 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8cb5e63b-882d-4388-abb1-130923832c9f-log-httpd\") pod \"swift-proxy-7d8cf99555-lvwdq\" (UID: \"8cb5e63b-882d-4388-abb1-130923832c9f\") " pod="swift-kuttl-tests/swift-proxy-7d8cf99555-lvwdq" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.561083 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zdfg\" (UniqueName: \"kubernetes.io/projected/ea3117d7-0038-4ca5-bee5-ae76db9a12eb-kube-api-access-9zdfg\") pod \"swift-storage-0\" (UID: \"ea3117d7-0038-4ca5-bee5-ae76db9a12eb\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.561125 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8cb5e63b-882d-4388-abb1-130923832c9f-run-httpd\") pod \"swift-proxy-7d8cf99555-lvwdq\" (UID: \"8cb5e63b-882d-4388-abb1-130923832c9f\") " pod="swift-kuttl-tests/swift-proxy-7d8cf99555-lvwdq" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.561146 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/ea3117d7-0038-4ca5-bee5-ae76db9a12eb-etc-swift\") pod \"swift-storage-0\" (UID: \"ea3117d7-0038-4ca5-bee5-ae76db9a12eb\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.561195 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5n98z\" (UniqueName: \"kubernetes.io/projected/8cb5e63b-882d-4388-abb1-130923832c9f-kube-api-access-5n98z\") pod \"swift-proxy-7d8cf99555-lvwdq\" (UID: \"8cb5e63b-882d-4388-abb1-130923832c9f\") " pod="swift-kuttl-tests/swift-proxy-7d8cf99555-lvwdq" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.561252 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"ea3117d7-0038-4ca5-bee5-ae76db9a12eb\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.561281 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8cb5e63b-882d-4388-abb1-130923832c9f-etc-swift\") pod \"swift-proxy-7d8cf99555-lvwdq\" (UID: \"8cb5e63b-882d-4388-abb1-130923832c9f\") " pod="swift-kuttl-tests/swift-proxy-7d8cf99555-lvwdq" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.561311 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/ea3117d7-0038-4ca5-bee5-ae76db9a12eb-cache\") pod \"swift-storage-0\" (UID: \"ea3117d7-0038-4ca5-bee5-ae76db9a12eb\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.562580 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8cb5e63b-882d-4388-abb1-130923832c9f-log-httpd\") pod \"swift-proxy-7d8cf99555-lvwdq\" (UID: \"8cb5e63b-882d-4388-abb1-130923832c9f\") " pod="swift-kuttl-tests/swift-proxy-7d8cf99555-lvwdq" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.562652 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8cb5e63b-882d-4388-abb1-130923832c9f-run-httpd\") pod \"swift-proxy-7d8cf99555-lvwdq\" (UID: \"8cb5e63b-882d-4388-abb1-130923832c9f\") " pod="swift-kuttl-tests/swift-proxy-7d8cf99555-lvwdq" Jan 31 09:19:57 crc kubenswrapper[4732]: E0131 09:19:57.562694 4732 projected.go:288] Couldn't get configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 31 09:19:57 crc kubenswrapper[4732]: E0131 09:19:57.562711 4732 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-proxy-7d8cf99555-lvwdq: configmap "swift-ring-files" not found Jan 31 09:19:57 crc kubenswrapper[4732]: E0131 09:19:57.562761 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8cb5e63b-882d-4388-abb1-130923832c9f-etc-swift podName:8cb5e63b-882d-4388-abb1-130923832c9f nodeName:}" failed. No retries permitted until 2026-01-31 09:19:58.062739615 +0000 UTC m=+1136.368615919 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/8cb5e63b-882d-4388-abb1-130923832c9f-etc-swift") pod "swift-proxy-7d8cf99555-lvwdq" (UID: "8cb5e63b-882d-4388-abb1-130923832c9f") : configmap "swift-ring-files" not found Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.566410 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cb5e63b-882d-4388-abb1-130923832c9f-config-data\") pod \"swift-proxy-7d8cf99555-lvwdq\" (UID: \"8cb5e63b-882d-4388-abb1-130923832c9f\") " pod="swift-kuttl-tests/swift-proxy-7d8cf99555-lvwdq" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.584791 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5n98z\" (UniqueName: \"kubernetes.io/projected/8cb5e63b-882d-4388-abb1-130923832c9f-kube-api-access-5n98z\") pod \"swift-proxy-7d8cf99555-lvwdq\" (UID: \"8cb5e63b-882d-4388-abb1-130923832c9f\") " pod="swift-kuttl-tests/swift-proxy-7d8cf99555-lvwdq" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.662672 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ea3117d7-0038-4ca5-bee5-ae76db9a12eb-etc-swift\") pod \"swift-storage-0\" (UID: \"ea3117d7-0038-4ca5-bee5-ae76db9a12eb\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.662952 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/eb04e24b-fc92-4f2e-abcb-fa46706f699a-lock\") pod \"swift-storage-1\" (UID: \"eb04e24b-fc92-4f2e-abcb-fa46706f699a\") " pod="swift-kuttl-tests/swift-storage-1" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.663064 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/18b68f5e-a1b4-4f52-9a4e-5967735ec105-cache\") pod \"swift-storage-2\" (UID: \"18b68f5e-a1b4-4f52-9a4e-5967735ec105\") " pod="swift-kuttl-tests/swift-storage-2" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.663293 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/eb04e24b-fc92-4f2e-abcb-fa46706f699a-etc-swift\") pod \"swift-storage-1\" (UID: \"eb04e24b-fc92-4f2e-abcb-fa46706f699a\") " pod="swift-kuttl-tests/swift-storage-1" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.663400 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"ea3117d7-0038-4ca5-bee5-ae76db9a12eb\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.663476 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/18b68f5e-a1b4-4f52-9a4e-5967735ec105-lock\") pod \"swift-storage-2\" (UID: \"18b68f5e-a1b4-4f52-9a4e-5967735ec105\") " pod="swift-kuttl-tests/swift-storage-2" Jan 31 09:19:57 crc kubenswrapper[4732]: E0131 09:19:57.662861 4732 projected.go:288] Couldn't get configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 31 09:19:57 crc kubenswrapper[4732]: E0131 09:19:57.663594 4732 projected.go:194] Error preparing data for projected 
volume etc-swift for pod swift-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.663657 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"swift-storage-2\" (UID: \"18b68f5e-a1b4-4f52-9a4e-5967735ec105\") " pod="swift-kuttl-tests/swift-storage-2" Jan 31 09:19:57 crc kubenswrapper[4732]: E0131 09:19:57.663720 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ea3117d7-0038-4ca5-bee5-ae76db9a12eb-etc-swift podName:ea3117d7-0038-4ca5-bee5-ae76db9a12eb nodeName:}" failed. No retries permitted until 2026-01-31 09:19:58.163698832 +0000 UTC m=+1136.469575036 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/ea3117d7-0038-4ca5-bee5-ae76db9a12eb-etc-swift") pod "swift-storage-0" (UID: "ea3117d7-0038-4ca5-bee5-ae76db9a12eb") : configmap "swift-ring-files" not found Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.663837 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqbp6\" (UniqueName: \"kubernetes.io/projected/eb04e24b-fc92-4f2e-abcb-fa46706f699a-kube-api-access-pqbp6\") pod \"swift-storage-1\" (UID: \"eb04e24b-fc92-4f2e-abcb-fa46706f699a\") " pod="swift-kuttl-tests/swift-storage-1" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.663918 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/ea3117d7-0038-4ca5-bee5-ae76db9a12eb-cache\") pod \"swift-storage-0\" (UID: \"ea3117d7-0038-4ca5-bee5-ae76db9a12eb\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.663995 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txhcm\" (UniqueName: \"kubernetes.io/projected/18b68f5e-a1b4-4f52-9a4e-5967735ec105-kube-api-access-txhcm\") pod \"swift-storage-2\" (UID: \"18b68f5e-a1b4-4f52-9a4e-5967735ec105\") " pod="swift-kuttl-tests/swift-storage-2" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.663795 4732 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"ea3117d7-0038-4ca5-bee5-ae76db9a12eb\") device mount path \"/mnt/openstack/pv08\"" pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.664131 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/eb04e24b-fc92-4f2e-abcb-fa46706f699a-cache\") pod \"swift-storage-1\" (UID: \"eb04e24b-fc92-4f2e-abcb-fa46706f699a\") " pod="swift-kuttl-tests/swift-storage-1" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.664245 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"swift-storage-1\" (UID: \"eb04e24b-fc92-4f2e-abcb-fa46706f699a\") " pod="swift-kuttl-tests/swift-storage-1" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.664302 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: 
\"kubernetes.io/empty-dir/ea3117d7-0038-4ca5-bee5-ae76db9a12eb-cache\") pod \"swift-storage-0\" (UID: \"ea3117d7-0038-4ca5-bee5-ae76db9a12eb\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.664378 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/ea3117d7-0038-4ca5-bee5-ae76db9a12eb-lock\") pod \"swift-storage-0\" (UID: \"ea3117d7-0038-4ca5-bee5-ae76db9a12eb\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.664452 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/18b68f5e-a1b4-4f52-9a4e-5967735ec105-etc-swift\") pod \"swift-storage-2\" (UID: \"18b68f5e-a1b4-4f52-9a4e-5967735ec105\") " pod="swift-kuttl-tests/swift-storage-2" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.664537 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zdfg\" (UniqueName: \"kubernetes.io/projected/ea3117d7-0038-4ca5-bee5-ae76db9a12eb-kube-api-access-9zdfg\") pod \"swift-storage-0\" (UID: \"ea3117d7-0038-4ca5-bee5-ae76db9a12eb\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.664783 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/ea3117d7-0038-4ca5-bee5-ae76db9a12eb-lock\") pod \"swift-storage-0\" (UID: \"ea3117d7-0038-4ca5-bee5-ae76db9a12eb\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.683951 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"ea3117d7-0038-4ca5-bee5-ae76db9a12eb\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.684565 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zdfg\" (UniqueName: \"kubernetes.io/projected/ea3117d7-0038-4ca5-bee5-ae76db9a12eb-kube-api-access-9zdfg\") pod \"swift-storage-0\" (UID: \"ea3117d7-0038-4ca5-bee5-ae76db9a12eb\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.765733 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"swift-storage-2\" (UID: \"18b68f5e-a1b4-4f52-9a4e-5967735ec105\") " pod="swift-kuttl-tests/swift-storage-2" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.765790 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqbp6\" (UniqueName: \"kubernetes.io/projected/eb04e24b-fc92-4f2e-abcb-fa46706f699a-kube-api-access-pqbp6\") pod \"swift-storage-1\" (UID: \"eb04e24b-fc92-4f2e-abcb-fa46706f699a\") " pod="swift-kuttl-tests/swift-storage-1" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.765825 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txhcm\" (UniqueName: \"kubernetes.io/projected/18b68f5e-a1b4-4f52-9a4e-5967735ec105-kube-api-access-txhcm\") pod \"swift-storage-2\" (UID: \"18b68f5e-a1b4-4f52-9a4e-5967735ec105\") " pod="swift-kuttl-tests/swift-storage-2" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.765864 4732 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/eb04e24b-fc92-4f2e-abcb-fa46706f699a-cache\") pod \"swift-storage-1\" (UID: \"eb04e24b-fc92-4f2e-abcb-fa46706f699a\") " pod="swift-kuttl-tests/swift-storage-1" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.765899 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"swift-storage-1\" (UID: \"eb04e24b-fc92-4f2e-abcb-fa46706f699a\") " pod="swift-kuttl-tests/swift-storage-1" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.765927 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/18b68f5e-a1b4-4f52-9a4e-5967735ec105-etc-swift\") pod \"swift-storage-2\" (UID: \"18b68f5e-a1b4-4f52-9a4e-5967735ec105\") " pod="swift-kuttl-tests/swift-storage-2" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.765945 4732 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"swift-storage-2\" (UID: \"18b68f5e-a1b4-4f52-9a4e-5967735ec105\") device mount path \"/mnt/openstack/pv06\"" pod="swift-kuttl-tests/swift-storage-2" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.765985 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/eb04e24b-fc92-4f2e-abcb-fa46706f699a-lock\") pod \"swift-storage-1\" (UID: \"eb04e24b-fc92-4f2e-abcb-fa46706f699a\") " pod="swift-kuttl-tests/swift-storage-1" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.766394 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/eb04e24b-fc92-4f2e-abcb-fa46706f699a-cache\") pod \"swift-storage-1\" (UID: \"eb04e24b-fc92-4f2e-abcb-fa46706f699a\") " pod="swift-kuttl-tests/swift-storage-1" Jan 31 09:19:57 crc kubenswrapper[4732]: E0131 09:19:57.766641 4732 projected.go:288] Couldn't get configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 31 09:19:57 crc kubenswrapper[4732]: E0131 09:19:57.766684 4732 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-storage-2: configmap "swift-ring-files" not found Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.766720 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/18b68f5e-a1b4-4f52-9a4e-5967735ec105-cache\") pod \"swift-storage-2\" (UID: \"18b68f5e-a1b4-4f52-9a4e-5967735ec105\") " pod="swift-kuttl-tests/swift-storage-2" Jan 31 09:19:57 crc kubenswrapper[4732]: E0131 09:19:57.766737 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/18b68f5e-a1b4-4f52-9a4e-5967735ec105-etc-swift podName:18b68f5e-a1b4-4f52-9a4e-5967735ec105 nodeName:}" failed. No retries permitted until 2026-01-31 09:19:58.266717982 +0000 UTC m=+1136.572594196 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/18b68f5e-a1b4-4f52-9a4e-5967735ec105-etc-swift") pod "swift-storage-2" (UID: "18b68f5e-a1b4-4f52-9a4e-5967735ec105") : configmap "swift-ring-files" not found Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.766765 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/eb04e24b-fc92-4f2e-abcb-fa46706f699a-etc-swift\") pod \"swift-storage-1\" (UID: \"eb04e24b-fc92-4f2e-abcb-fa46706f699a\") " pod="swift-kuttl-tests/swift-storage-1" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.766856 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/18b68f5e-a1b4-4f52-9a4e-5967735ec105-lock\") pod \"swift-storage-2\" (UID: \"18b68f5e-a1b4-4f52-9a4e-5967735ec105\") " pod="swift-kuttl-tests/swift-storage-2" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.766875 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/eb04e24b-fc92-4f2e-abcb-fa46706f699a-lock\") pod \"swift-storage-1\" (UID: \"eb04e24b-fc92-4f2e-abcb-fa46706f699a\") " pod="swift-kuttl-tests/swift-storage-1" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.767010 4732 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"swift-storage-1\" (UID: \"eb04e24b-fc92-4f2e-abcb-fa46706f699a\") device mount path \"/mnt/openstack/pv11\"" pod="swift-kuttl-tests/swift-storage-1" Jan 31 09:19:57 crc kubenswrapper[4732]: E0131 09:19:57.767159 4732 projected.go:288] Couldn't get configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 31 09:19:57 crc kubenswrapper[4732]: E0131 09:19:57.767244 4732 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-storage-1: configmap "swift-ring-files" not found Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.767181 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/18b68f5e-a1b4-4f52-9a4e-5967735ec105-cache\") pod \"swift-storage-2\" (UID: \"18b68f5e-a1b4-4f52-9a4e-5967735ec105\") " pod="swift-kuttl-tests/swift-storage-2" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.767364 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/18b68f5e-a1b4-4f52-9a4e-5967735ec105-lock\") pod \"swift-storage-2\" (UID: \"18b68f5e-a1b4-4f52-9a4e-5967735ec105\") " pod="swift-kuttl-tests/swift-storage-2" Jan 31 09:19:57 crc kubenswrapper[4732]: E0131 09:19:57.767452 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/eb04e24b-fc92-4f2e-abcb-fa46706f699a-etc-swift podName:eb04e24b-fc92-4f2e-abcb-fa46706f699a nodeName:}" failed. No retries permitted until 2026-01-31 09:19:58.267384783 +0000 UTC m=+1136.573260987 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/eb04e24b-fc92-4f2e-abcb-fa46706f699a-etc-swift") pod "swift-storage-1" (UID: "eb04e24b-fc92-4f2e-abcb-fa46706f699a") : configmap "swift-ring-files" not found Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.787356 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"swift-storage-1\" (UID: \"eb04e24b-fc92-4f2e-abcb-fa46706f699a\") " pod="swift-kuttl-tests/swift-storage-1" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.787622 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"swift-storage-2\" (UID: \"18b68f5e-a1b4-4f52-9a4e-5967735ec105\") " pod="swift-kuttl-tests/swift-storage-2" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.790339 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txhcm\" (UniqueName: \"kubernetes.io/projected/18b68f5e-a1b4-4f52-9a4e-5967735ec105-kube-api-access-txhcm\") pod \"swift-storage-2\" (UID: \"18b68f5e-a1b4-4f52-9a4e-5967735ec105\") " pod="swift-kuttl-tests/swift-storage-2" Jan 31 09:19:57 crc kubenswrapper[4732]: I0131 09:19:57.792239 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqbp6\" (UniqueName: \"kubernetes.io/projected/eb04e24b-fc92-4f2e-abcb-fa46706f699a-kube-api-access-pqbp6\") pod \"swift-storage-1\" (UID: \"eb04e24b-fc92-4f2e-abcb-fa46706f699a\") " pod="swift-kuttl-tests/swift-storage-1" Jan 31 09:19:58 crc kubenswrapper[4732]: I0131 09:19:58.070944 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8cb5e63b-882d-4388-abb1-130923832c9f-etc-swift\") pod \"swift-proxy-7d8cf99555-lvwdq\" (UID: \"8cb5e63b-882d-4388-abb1-130923832c9f\") " pod="swift-kuttl-tests/swift-proxy-7d8cf99555-lvwdq" Jan 31 09:19:58 crc kubenswrapper[4732]: E0131 09:19:58.071110 4732 projected.go:288] Couldn't get configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 31 09:19:58 crc kubenswrapper[4732]: E0131 09:19:58.071127 4732 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-proxy-7d8cf99555-lvwdq: configmap "swift-ring-files" not found Jan 31 09:19:58 crc kubenswrapper[4732]: E0131 09:19:58.071177 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8cb5e63b-882d-4388-abb1-130923832c9f-etc-swift podName:8cb5e63b-882d-4388-abb1-130923832c9f nodeName:}" failed. No retries permitted until 2026-01-31 09:19:59.07116293 +0000 UTC m=+1137.377039134 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/8cb5e63b-882d-4388-abb1-130923832c9f-etc-swift") pod "swift-proxy-7d8cf99555-lvwdq" (UID: "8cb5e63b-882d-4388-abb1-130923832c9f") : configmap "swift-ring-files" not found Jan 31 09:19:58 crc kubenswrapper[4732]: I0131 09:19:58.172357 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ea3117d7-0038-4ca5-bee5-ae76db9a12eb-etc-swift\") pod \"swift-storage-0\" (UID: \"ea3117d7-0038-4ca5-bee5-ae76db9a12eb\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:19:58 crc kubenswrapper[4732]: E0131 09:19:58.172616 4732 projected.go:288] Couldn't get configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 31 09:19:58 crc kubenswrapper[4732]: E0131 09:19:58.172681 4732 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 31 09:19:58 crc kubenswrapper[4732]: E0131 09:19:58.172756 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ea3117d7-0038-4ca5-bee5-ae76db9a12eb-etc-swift podName:ea3117d7-0038-4ca5-bee5-ae76db9a12eb nodeName:}" failed. No retries permitted until 2026-01-31 09:19:59.172733475 +0000 UTC m=+1137.478609749 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/ea3117d7-0038-4ca5-bee5-ae76db9a12eb-etc-swift") pod "swift-storage-0" (UID: "ea3117d7-0038-4ca5-bee5-ae76db9a12eb") : configmap "swift-ring-files" not found Jan 31 09:19:58 crc kubenswrapper[4732]: I0131 09:19:58.274456 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/18b68f5e-a1b4-4f52-9a4e-5967735ec105-etc-swift\") pod \"swift-storage-2\" (UID: \"18b68f5e-a1b4-4f52-9a4e-5967735ec105\") " pod="swift-kuttl-tests/swift-storage-2" Jan 31 09:19:58 crc kubenswrapper[4732]: I0131 09:19:58.274610 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/eb04e24b-fc92-4f2e-abcb-fa46706f699a-etc-swift\") pod \"swift-storage-1\" (UID: \"eb04e24b-fc92-4f2e-abcb-fa46706f699a\") " pod="swift-kuttl-tests/swift-storage-1" Jan 31 09:19:58 crc kubenswrapper[4732]: E0131 09:19:58.274876 4732 projected.go:288] Couldn't get configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 31 09:19:58 crc kubenswrapper[4732]: E0131 09:19:58.274901 4732 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-storage-1: configmap "swift-ring-files" not found Jan 31 09:19:58 crc kubenswrapper[4732]: E0131 09:19:58.274966 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/eb04e24b-fc92-4f2e-abcb-fa46706f699a-etc-swift podName:eb04e24b-fc92-4f2e-abcb-fa46706f699a nodeName:}" failed. No retries permitted until 2026-01-31 09:19:59.274945081 +0000 UTC m=+1137.580821315 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/eb04e24b-fc92-4f2e-abcb-fa46706f699a-etc-swift") pod "swift-storage-1" (UID: "eb04e24b-fc92-4f2e-abcb-fa46706f699a") : configmap "swift-ring-files" not found Jan 31 09:19:58 crc kubenswrapper[4732]: E0131 09:19:58.275132 4732 projected.go:288] Couldn't get configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 31 09:19:58 crc kubenswrapper[4732]: E0131 09:19:58.275160 4732 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-storage-2: configmap "swift-ring-files" not found Jan 31 09:19:58 crc kubenswrapper[4732]: E0131 09:19:58.275220 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/18b68f5e-a1b4-4f52-9a4e-5967735ec105-etc-swift podName:18b68f5e-a1b4-4f52-9a4e-5967735ec105 nodeName:}" failed. No retries permitted until 2026-01-31 09:19:59.275202149 +0000 UTC m=+1137.581078353 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/18b68f5e-a1b4-4f52-9a4e-5967735ec105-etc-swift") pod "swift-storage-2" (UID: "18b68f5e-a1b4-4f52-9a4e-5967735ec105") : configmap "swift-ring-files" not found Jan 31 09:19:59 crc kubenswrapper[4732]: I0131 09:19:59.087765 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8cb5e63b-882d-4388-abb1-130923832c9f-etc-swift\") pod \"swift-proxy-7d8cf99555-lvwdq\" (UID: \"8cb5e63b-882d-4388-abb1-130923832c9f\") " pod="swift-kuttl-tests/swift-proxy-7d8cf99555-lvwdq" Jan 31 09:19:59 crc kubenswrapper[4732]: E0131 09:19:59.088046 4732 projected.go:288] Couldn't get configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 31 09:19:59 crc kubenswrapper[4732]: E0131 09:19:59.088097 4732 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-proxy-7d8cf99555-lvwdq: configmap "swift-ring-files" not found Jan 31 09:19:59 crc kubenswrapper[4732]: E0131 09:19:59.088235 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8cb5e63b-882d-4388-abb1-130923832c9f-etc-swift podName:8cb5e63b-882d-4388-abb1-130923832c9f nodeName:}" failed. No retries permitted until 2026-01-31 09:20:01.088196184 +0000 UTC m=+1139.394072438 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/8cb5e63b-882d-4388-abb1-130923832c9f-etc-swift") pod "swift-proxy-7d8cf99555-lvwdq" (UID: "8cb5e63b-882d-4388-abb1-130923832c9f") : configmap "swift-ring-files" not found Jan 31 09:19:59 crc kubenswrapper[4732]: I0131 09:19:59.189909 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ea3117d7-0038-4ca5-bee5-ae76db9a12eb-etc-swift\") pod \"swift-storage-0\" (UID: \"ea3117d7-0038-4ca5-bee5-ae76db9a12eb\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:19:59 crc kubenswrapper[4732]: E0131 09:19:59.190141 4732 projected.go:288] Couldn't get configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 31 09:19:59 crc kubenswrapper[4732]: E0131 09:19:59.190200 4732 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 31 09:19:59 crc kubenswrapper[4732]: E0131 09:19:59.190264 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ea3117d7-0038-4ca5-bee5-ae76db9a12eb-etc-swift podName:ea3117d7-0038-4ca5-bee5-ae76db9a12eb nodeName:}" failed. No retries permitted until 2026-01-31 09:20:01.190240535 +0000 UTC m=+1139.496116739 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/ea3117d7-0038-4ca5-bee5-ae76db9a12eb-etc-swift") pod "swift-storage-0" (UID: "ea3117d7-0038-4ca5-bee5-ae76db9a12eb") : configmap "swift-ring-files" not found Jan 31 09:19:59 crc kubenswrapper[4732]: I0131 09:19:59.292048 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/eb04e24b-fc92-4f2e-abcb-fa46706f699a-etc-swift\") pod \"swift-storage-1\" (UID: \"eb04e24b-fc92-4f2e-abcb-fa46706f699a\") " pod="swift-kuttl-tests/swift-storage-1" Jan 31 09:19:59 crc kubenswrapper[4732]: I0131 09:19:59.292191 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/18b68f5e-a1b4-4f52-9a4e-5967735ec105-etc-swift\") pod \"swift-storage-2\" (UID: \"18b68f5e-a1b4-4f52-9a4e-5967735ec105\") " pod="swift-kuttl-tests/swift-storage-2" Jan 31 09:19:59 crc kubenswrapper[4732]: E0131 09:19:59.292328 4732 projected.go:288] Couldn't get configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 31 09:19:59 crc kubenswrapper[4732]: E0131 09:19:59.292360 4732 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-storage-1: configmap "swift-ring-files" not found Jan 31 09:19:59 crc kubenswrapper[4732]: E0131 09:19:59.292420 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/eb04e24b-fc92-4f2e-abcb-fa46706f699a-etc-swift podName:eb04e24b-fc92-4f2e-abcb-fa46706f699a nodeName:}" failed. No retries permitted until 2026-01-31 09:20:01.292395728 +0000 UTC m=+1139.598272012 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/eb04e24b-fc92-4f2e-abcb-fa46706f699a-etc-swift") pod "swift-storage-1" (UID: "eb04e24b-fc92-4f2e-abcb-fa46706f699a") : configmap "swift-ring-files" not found Jan 31 09:19:59 crc kubenswrapper[4732]: E0131 09:19:59.294763 4732 projected.go:288] Couldn't get configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 31 09:19:59 crc kubenswrapper[4732]: E0131 09:19:59.294812 4732 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-storage-2: configmap "swift-ring-files" not found Jan 31 09:19:59 crc kubenswrapper[4732]: E0131 09:19:59.300548 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/18b68f5e-a1b4-4f52-9a4e-5967735ec105-etc-swift podName:18b68f5e-a1b4-4f52-9a4e-5967735ec105 nodeName:}" failed. No retries permitted until 2026-01-31 09:20:01.294864494 +0000 UTC m=+1139.600740798 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/18b68f5e-a1b4-4f52-9a4e-5967735ec105-etc-swift") pod "swift-storage-2" (UID: "18b68f5e-a1b4-4f52-9a4e-5967735ec105") : configmap "swift-ring-files" not found Jan 31 09:20:01 crc kubenswrapper[4732]: I0131 09:20:01.121532 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8cb5e63b-882d-4388-abb1-130923832c9f-etc-swift\") pod \"swift-proxy-7d8cf99555-lvwdq\" (UID: \"8cb5e63b-882d-4388-abb1-130923832c9f\") " pod="swift-kuttl-tests/swift-proxy-7d8cf99555-lvwdq" Jan 31 09:20:01 crc kubenswrapper[4732]: E0131 09:20:01.121807 4732 projected.go:288] Couldn't get configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 31 09:20:01 crc kubenswrapper[4732]: E0131 09:20:01.122032 4732 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-proxy-7d8cf99555-lvwdq: configmap "swift-ring-files" not found Jan 31 09:20:01 crc kubenswrapper[4732]: E0131 09:20:01.122139 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8cb5e63b-882d-4388-abb1-130923832c9f-etc-swift podName:8cb5e63b-882d-4388-abb1-130923832c9f nodeName:}" failed. No retries permitted until 2026-01-31 09:20:05.122100688 +0000 UTC m=+1143.427976942 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/8cb5e63b-882d-4388-abb1-130923832c9f-etc-swift") pod "swift-proxy-7d8cf99555-lvwdq" (UID: "8cb5e63b-882d-4388-abb1-130923832c9f") : configmap "swift-ring-files" not found Jan 31 09:20:01 crc kubenswrapper[4732]: I0131 09:20:01.223587 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ea3117d7-0038-4ca5-bee5-ae76db9a12eb-etc-swift\") pod \"swift-storage-0\" (UID: \"ea3117d7-0038-4ca5-bee5-ae76db9a12eb\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:20:01 crc kubenswrapper[4732]: E0131 09:20:01.223891 4732 projected.go:288] Couldn't get configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 31 09:20:01 crc kubenswrapper[4732]: E0131 09:20:01.223944 4732 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 31 09:20:01 crc kubenswrapper[4732]: E0131 09:20:01.224049 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ea3117d7-0038-4ca5-bee5-ae76db9a12eb-etc-swift podName:ea3117d7-0038-4ca5-bee5-ae76db9a12eb nodeName:}" failed. No retries permitted until 2026-01-31 09:20:05.224002643 +0000 UTC m=+1143.529878877 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/ea3117d7-0038-4ca5-bee5-ae76db9a12eb-etc-swift") pod "swift-storage-0" (UID: "ea3117d7-0038-4ca5-bee5-ae76db9a12eb") : configmap "swift-ring-files" not found Jan 31 09:20:01 crc kubenswrapper[4732]: I0131 09:20:01.265659 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-7q9kp"] Jan 31 09:20:01 crc kubenswrapper[4732]: I0131 09:20:01.267763 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-7q9kp" Jan 31 09:20:01 crc kubenswrapper[4732]: I0131 09:20:01.270949 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Jan 31 09:20:01 crc kubenswrapper[4732]: I0131 09:20:01.271153 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Jan 31 09:20:01 crc kubenswrapper[4732]: I0131 09:20:01.280462 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-7q9kp"] Jan 31 09:20:01 crc kubenswrapper[4732]: I0131 09:20:01.325691 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/eb04e24b-fc92-4f2e-abcb-fa46706f699a-etc-swift\") pod \"swift-storage-1\" (UID: \"eb04e24b-fc92-4f2e-abcb-fa46706f699a\") " pod="swift-kuttl-tests/swift-storage-1" Jan 31 09:20:01 crc kubenswrapper[4732]: I0131 09:20:01.325894 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/18b68f5e-a1b4-4f52-9a4e-5967735ec105-etc-swift\") pod \"swift-storage-2\" (UID: \"18b68f5e-a1b4-4f52-9a4e-5967735ec105\") " pod="swift-kuttl-tests/swift-storage-2" Jan 31 09:20:01 crc kubenswrapper[4732]: E0131 09:20:01.325939 4732 projected.go:288] Couldn't get configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 31 09:20:01 crc kubenswrapper[4732]: E0131 09:20:01.326067 4732 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-storage-1: configmap "swift-ring-files" not found Jan 31 09:20:01 crc kubenswrapper[4732]: E0131 09:20:01.326120 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/eb04e24b-fc92-4f2e-abcb-fa46706f699a-etc-swift podName:eb04e24b-fc92-4f2e-abcb-fa46706f699a nodeName:}" failed. No retries permitted until 2026-01-31 09:20:05.326102886 +0000 UTC m=+1143.631979090 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/eb04e24b-fc92-4f2e-abcb-fa46706f699a-etc-swift") pod "swift-storage-1" (UID: "eb04e24b-fc92-4f2e-abcb-fa46706f699a") : configmap "swift-ring-files" not found Jan 31 09:20:01 crc kubenswrapper[4732]: E0131 09:20:01.326022 4732 projected.go:288] Couldn't get configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 31 09:20:01 crc kubenswrapper[4732]: E0131 09:20:01.326140 4732 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-storage-2: configmap "swift-ring-files" not found Jan 31 09:20:01 crc kubenswrapper[4732]: E0131 09:20:01.326155 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/18b68f5e-a1b4-4f52-9a4e-5967735ec105-etc-swift podName:18b68f5e-a1b4-4f52-9a4e-5967735ec105 nodeName:}" failed. No retries permitted until 2026-01-31 09:20:05.326149977 +0000 UTC m=+1143.632026181 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/18b68f5e-a1b4-4f52-9a4e-5967735ec105-etc-swift") pod "swift-storage-2" (UID: "18b68f5e-a1b4-4f52-9a4e-5967735ec105") : configmap "swift-ring-files" not found Jan 31 09:20:01 crc kubenswrapper[4732]: I0131 09:20:01.426956 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e25bc49e-1bbe-4103-b751-fee5d86e7a92-scripts\") pod \"swift-ring-rebalance-7q9kp\" (UID: \"e25bc49e-1bbe-4103-b751-fee5d86e7a92\") " pod="swift-kuttl-tests/swift-ring-rebalance-7q9kp" Jan 31 09:20:01 crc kubenswrapper[4732]: I0131 09:20:01.427014 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e25bc49e-1bbe-4103-b751-fee5d86e7a92-swiftconf\") pod \"swift-ring-rebalance-7q9kp\" (UID: \"e25bc49e-1bbe-4103-b751-fee5d86e7a92\") " pod="swift-kuttl-tests/swift-ring-rebalance-7q9kp" Jan 31 09:20:01 crc kubenswrapper[4732]: I0131 09:20:01.427158 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e25bc49e-1bbe-4103-b751-fee5d86e7a92-etc-swift\") pod \"swift-ring-rebalance-7q9kp\" (UID: \"e25bc49e-1bbe-4103-b751-fee5d86e7a92\") " pod="swift-kuttl-tests/swift-ring-rebalance-7q9kp" Jan 31 09:20:01 crc kubenswrapper[4732]: I0131 09:20:01.427371 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e25bc49e-1bbe-4103-b751-fee5d86e7a92-ring-data-devices\") pod \"swift-ring-rebalance-7q9kp\" (UID: \"e25bc49e-1bbe-4103-b751-fee5d86e7a92\") " pod="swift-kuttl-tests/swift-ring-rebalance-7q9kp" Jan 31 09:20:01 crc kubenswrapper[4732]: I0131 09:20:01.427433 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54c9x\" (UniqueName: \"kubernetes.io/projected/e25bc49e-1bbe-4103-b751-fee5d86e7a92-kube-api-access-54c9x\") pod \"swift-ring-rebalance-7q9kp\" (UID: \"e25bc49e-1bbe-4103-b751-fee5d86e7a92\") " pod="swift-kuttl-tests/swift-ring-rebalance-7q9kp" Jan 31 09:20:01 crc kubenswrapper[4732]: I0131 09:20:01.427585 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e25bc49e-1bbe-4103-b751-fee5d86e7a92-dispersionconf\") pod \"swift-ring-rebalance-7q9kp\" (UID: \"e25bc49e-1bbe-4103-b751-fee5d86e7a92\") " pod="swift-kuttl-tests/swift-ring-rebalance-7q9kp" Jan 31 09:20:01 crc kubenswrapper[4732]: I0131 09:20:01.528808 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e25bc49e-1bbe-4103-b751-fee5d86e7a92-ring-data-devices\") pod \"swift-ring-rebalance-7q9kp\" (UID: \"e25bc49e-1bbe-4103-b751-fee5d86e7a92\") " pod="swift-kuttl-tests/swift-ring-rebalance-7q9kp" Jan 31 09:20:01 crc kubenswrapper[4732]: I0131 09:20:01.528870 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54c9x\" (UniqueName: \"kubernetes.io/projected/e25bc49e-1bbe-4103-b751-fee5d86e7a92-kube-api-access-54c9x\") pod \"swift-ring-rebalance-7q9kp\" (UID: \"e25bc49e-1bbe-4103-b751-fee5d86e7a92\") " pod="swift-kuttl-tests/swift-ring-rebalance-7q9kp" Jan 31 09:20:01 crc kubenswrapper[4732]: 
I0131 09:20:01.528932 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e25bc49e-1bbe-4103-b751-fee5d86e7a92-dispersionconf\") pod \"swift-ring-rebalance-7q9kp\" (UID: \"e25bc49e-1bbe-4103-b751-fee5d86e7a92\") " pod="swift-kuttl-tests/swift-ring-rebalance-7q9kp" Jan 31 09:20:01 crc kubenswrapper[4732]: I0131 09:20:01.529049 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e25bc49e-1bbe-4103-b751-fee5d86e7a92-scripts\") pod \"swift-ring-rebalance-7q9kp\" (UID: \"e25bc49e-1bbe-4103-b751-fee5d86e7a92\") " pod="swift-kuttl-tests/swift-ring-rebalance-7q9kp" Jan 31 09:20:01 crc kubenswrapper[4732]: I0131 09:20:01.529103 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e25bc49e-1bbe-4103-b751-fee5d86e7a92-swiftconf\") pod \"swift-ring-rebalance-7q9kp\" (UID: \"e25bc49e-1bbe-4103-b751-fee5d86e7a92\") " pod="swift-kuttl-tests/swift-ring-rebalance-7q9kp" Jan 31 09:20:01 crc kubenswrapper[4732]: I0131 09:20:01.529142 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e25bc49e-1bbe-4103-b751-fee5d86e7a92-etc-swift\") pod \"swift-ring-rebalance-7q9kp\" (UID: \"e25bc49e-1bbe-4103-b751-fee5d86e7a92\") " pod="swift-kuttl-tests/swift-ring-rebalance-7q9kp" Jan 31 09:20:01 crc kubenswrapper[4732]: I0131 09:20:01.529614 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e25bc49e-1bbe-4103-b751-fee5d86e7a92-etc-swift\") pod \"swift-ring-rebalance-7q9kp\" (UID: \"e25bc49e-1bbe-4103-b751-fee5d86e7a92\") " pod="swift-kuttl-tests/swift-ring-rebalance-7q9kp" Jan 31 09:20:01 crc kubenswrapper[4732]: I0131 09:20:01.530145 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e25bc49e-1bbe-4103-b751-fee5d86e7a92-scripts\") pod \"swift-ring-rebalance-7q9kp\" (UID: \"e25bc49e-1bbe-4103-b751-fee5d86e7a92\") " pod="swift-kuttl-tests/swift-ring-rebalance-7q9kp" Jan 31 09:20:01 crc kubenswrapper[4732]: I0131 09:20:01.530337 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e25bc49e-1bbe-4103-b751-fee5d86e7a92-ring-data-devices\") pod \"swift-ring-rebalance-7q9kp\" (UID: \"e25bc49e-1bbe-4103-b751-fee5d86e7a92\") " pod="swift-kuttl-tests/swift-ring-rebalance-7q9kp" Jan 31 09:20:01 crc kubenswrapper[4732]: I0131 09:20:01.544370 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e25bc49e-1bbe-4103-b751-fee5d86e7a92-dispersionconf\") pod \"swift-ring-rebalance-7q9kp\" (UID: \"e25bc49e-1bbe-4103-b751-fee5d86e7a92\") " pod="swift-kuttl-tests/swift-ring-rebalance-7q9kp" Jan 31 09:20:01 crc kubenswrapper[4732]: I0131 09:20:01.547007 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e25bc49e-1bbe-4103-b751-fee5d86e7a92-swiftconf\") pod \"swift-ring-rebalance-7q9kp\" (UID: \"e25bc49e-1bbe-4103-b751-fee5d86e7a92\") " pod="swift-kuttl-tests/swift-ring-rebalance-7q9kp" Jan 31 09:20:01 crc kubenswrapper[4732]: I0131 09:20:01.558941 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54c9x\" 
(UniqueName: \"kubernetes.io/projected/e25bc49e-1bbe-4103-b751-fee5d86e7a92-kube-api-access-54c9x\") pod \"swift-ring-rebalance-7q9kp\" (UID: \"e25bc49e-1bbe-4103-b751-fee5d86e7a92\") " pod="swift-kuttl-tests/swift-ring-rebalance-7q9kp" Jan 31 09:20:01 crc kubenswrapper[4732]: I0131 09:20:01.589327 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-7q9kp" Jan 31 09:20:02 crc kubenswrapper[4732]: I0131 09:20:02.040498 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-7q9kp"] Jan 31 09:20:02 crc kubenswrapper[4732]: I0131 09:20:02.395122 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-7q9kp" event={"ID":"e25bc49e-1bbe-4103-b751-fee5d86e7a92","Type":"ContainerStarted","Data":"2bf5f760920241f10f11d255f66394a5366cad9af6f5cf262601f44403aa7939"} Jan 31 09:20:02 crc kubenswrapper[4732]: I0131 09:20:02.395186 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-7q9kp" event={"ID":"e25bc49e-1bbe-4103-b751-fee5d86e7a92","Type":"ContainerStarted","Data":"2e9feb5a7bd4dc52e920699e4073a986cd013aa621986002fa31f113c2896401"} Jan 31 09:20:02 crc kubenswrapper[4732]: I0131 09:20:02.414807 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-7q9kp" podStartSLOduration=1.414792119 podStartE2EDuration="1.414792119s" podCreationTimestamp="2026-01-31 09:20:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:20:02.410463885 +0000 UTC m=+1140.716340139" watchObservedRunningTime="2026-01-31 09:20:02.414792119 +0000 UTC m=+1140.720668323" Jan 31 09:20:05 crc kubenswrapper[4732]: I0131 09:20:05.208481 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8cb5e63b-882d-4388-abb1-130923832c9f-etc-swift\") pod \"swift-proxy-7d8cf99555-lvwdq\" (UID: \"8cb5e63b-882d-4388-abb1-130923832c9f\") " pod="swift-kuttl-tests/swift-proxy-7d8cf99555-lvwdq" Jan 31 09:20:05 crc kubenswrapper[4732]: E0131 09:20:05.208753 4732 projected.go:288] Couldn't get configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 31 09:20:05 crc kubenswrapper[4732]: E0131 09:20:05.209129 4732 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-proxy-7d8cf99555-lvwdq: configmap "swift-ring-files" not found Jan 31 09:20:05 crc kubenswrapper[4732]: E0131 09:20:05.209221 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8cb5e63b-882d-4388-abb1-130923832c9f-etc-swift podName:8cb5e63b-882d-4388-abb1-130923832c9f nodeName:}" failed. No retries permitted until 2026-01-31 09:20:13.209194184 +0000 UTC m=+1151.515070388 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/8cb5e63b-882d-4388-abb1-130923832c9f-etc-swift") pod "swift-proxy-7d8cf99555-lvwdq" (UID: "8cb5e63b-882d-4388-abb1-130923832c9f") : configmap "swift-ring-files" not found Jan 31 09:20:05 crc kubenswrapper[4732]: I0131 09:20:05.310477 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ea3117d7-0038-4ca5-bee5-ae76db9a12eb-etc-swift\") pod \"swift-storage-0\" (UID: \"ea3117d7-0038-4ca5-bee5-ae76db9a12eb\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:20:05 crc kubenswrapper[4732]: E0131 09:20:05.310815 4732 projected.go:288] Couldn't get configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 31 09:20:05 crc kubenswrapper[4732]: E0131 09:20:05.310923 4732 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 31 09:20:05 crc kubenswrapper[4732]: E0131 09:20:05.310989 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ea3117d7-0038-4ca5-bee5-ae76db9a12eb-etc-swift podName:ea3117d7-0038-4ca5-bee5-ae76db9a12eb nodeName:}" failed. No retries permitted until 2026-01-31 09:20:13.310970646 +0000 UTC m=+1151.616846850 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/ea3117d7-0038-4ca5-bee5-ae76db9a12eb-etc-swift") pod "swift-storage-0" (UID: "ea3117d7-0038-4ca5-bee5-ae76db9a12eb") : configmap "swift-ring-files" not found Jan 31 09:20:05 crc kubenswrapper[4732]: I0131 09:20:05.413018 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/eb04e24b-fc92-4f2e-abcb-fa46706f699a-etc-swift\") pod \"swift-storage-1\" (UID: \"eb04e24b-fc92-4f2e-abcb-fa46706f699a\") " pod="swift-kuttl-tests/swift-storage-1" Jan 31 09:20:05 crc kubenswrapper[4732]: E0131 09:20:05.413269 4732 projected.go:288] Couldn't get configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 31 09:20:05 crc kubenswrapper[4732]: I0131 09:20:05.413293 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/18b68f5e-a1b4-4f52-9a4e-5967735ec105-etc-swift\") pod \"swift-storage-2\" (UID: \"18b68f5e-a1b4-4f52-9a4e-5967735ec105\") " pod="swift-kuttl-tests/swift-storage-2" Jan 31 09:20:05 crc kubenswrapper[4732]: E0131 09:20:05.413299 4732 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-storage-1: configmap "swift-ring-files" not found Jan 31 09:20:05 crc kubenswrapper[4732]: E0131 09:20:05.413509 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/eb04e24b-fc92-4f2e-abcb-fa46706f699a-etc-swift podName:eb04e24b-fc92-4f2e-abcb-fa46706f699a nodeName:}" failed. No retries permitted until 2026-01-31 09:20:13.41347767 +0000 UTC m=+1151.719353914 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/eb04e24b-fc92-4f2e-abcb-fa46706f699a-etc-swift") pod "swift-storage-1" (UID: "eb04e24b-fc92-4f2e-abcb-fa46706f699a") : configmap "swift-ring-files" not found Jan 31 09:20:05 crc kubenswrapper[4732]: E0131 09:20:05.413372 4732 projected.go:288] Couldn't get configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 31 09:20:05 crc kubenswrapper[4732]: E0131 09:20:05.413547 4732 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-storage-2: configmap "swift-ring-files" not found Jan 31 09:20:05 crc kubenswrapper[4732]: E0131 09:20:05.413617 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/18b68f5e-a1b4-4f52-9a4e-5967735ec105-etc-swift podName:18b68f5e-a1b4-4f52-9a4e-5967735ec105 nodeName:}" failed. No retries permitted until 2026-01-31 09:20:13.413592404 +0000 UTC m=+1151.719468728 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/18b68f5e-a1b4-4f52-9a4e-5967735ec105-etc-swift") pod "swift-storage-2" (UID: "18b68f5e-a1b4-4f52-9a4e-5967735ec105") : configmap "swift-ring-files" not found Jan 31 09:20:12 crc kubenswrapper[4732]: I0131 09:20:12.471356 4732 generic.go:334] "Generic (PLEG): container finished" podID="e25bc49e-1bbe-4103-b751-fee5d86e7a92" containerID="2bf5f760920241f10f11d255f66394a5366cad9af6f5cf262601f44403aa7939" exitCode=0 Jan 31 09:20:12 crc kubenswrapper[4732]: I0131 09:20:12.471408 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-7q9kp" event={"ID":"e25bc49e-1bbe-4103-b751-fee5d86e7a92","Type":"ContainerDied","Data":"2bf5f760920241f10f11d255f66394a5366cad9af6f5cf262601f44403aa7939"} Jan 31 09:20:13 crc kubenswrapper[4732]: I0131 09:20:13.255930 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8cb5e63b-882d-4388-abb1-130923832c9f-etc-swift\") pod \"swift-proxy-7d8cf99555-lvwdq\" (UID: \"8cb5e63b-882d-4388-abb1-130923832c9f\") " pod="swift-kuttl-tests/swift-proxy-7d8cf99555-lvwdq" Jan 31 09:20:13 crc kubenswrapper[4732]: I0131 09:20:13.265798 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8cb5e63b-882d-4388-abb1-130923832c9f-etc-swift\") pod \"swift-proxy-7d8cf99555-lvwdq\" (UID: \"8cb5e63b-882d-4388-abb1-130923832c9f\") " pod="swift-kuttl-tests/swift-proxy-7d8cf99555-lvwdq" Jan 31 09:20:13 crc kubenswrapper[4732]: I0131 09:20:13.292517 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-proxy-7d8cf99555-lvwdq" Jan 31 09:20:13 crc kubenswrapper[4732]: I0131 09:20:13.357353 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ea3117d7-0038-4ca5-bee5-ae76db9a12eb-etc-swift\") pod \"swift-storage-0\" (UID: \"ea3117d7-0038-4ca5-bee5-ae76db9a12eb\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:20:13 crc kubenswrapper[4732]: I0131 09:20:13.362482 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ea3117d7-0038-4ca5-bee5-ae76db9a12eb-etc-swift\") pod \"swift-storage-0\" (UID: \"ea3117d7-0038-4ca5-bee5-ae76db9a12eb\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:20:13 crc kubenswrapper[4732]: I0131 09:20:13.373353 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:20:13 crc kubenswrapper[4732]: I0131 09:20:13.459048 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/eb04e24b-fc92-4f2e-abcb-fa46706f699a-etc-swift\") pod \"swift-storage-1\" (UID: \"eb04e24b-fc92-4f2e-abcb-fa46706f699a\") " pod="swift-kuttl-tests/swift-storage-1" Jan 31 09:20:13 crc kubenswrapper[4732]: I0131 09:20:13.459193 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/18b68f5e-a1b4-4f52-9a4e-5967735ec105-etc-swift\") pod \"swift-storage-2\" (UID: \"18b68f5e-a1b4-4f52-9a4e-5967735ec105\") " pod="swift-kuttl-tests/swift-storage-2" Jan 31 09:20:13 crc kubenswrapper[4732]: I0131 09:20:13.467922 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/18b68f5e-a1b4-4f52-9a4e-5967735ec105-etc-swift\") pod \"swift-storage-2\" (UID: \"18b68f5e-a1b4-4f52-9a4e-5967735ec105\") " pod="swift-kuttl-tests/swift-storage-2" Jan 31 09:20:13 crc kubenswrapper[4732]: I0131 09:20:13.468196 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/eb04e24b-fc92-4f2e-abcb-fa46706f699a-etc-swift\") pod \"swift-storage-1\" (UID: \"eb04e24b-fc92-4f2e-abcb-fa46706f699a\") " pod="swift-kuttl-tests/swift-storage-1" Jan 31 09:20:13 crc kubenswrapper[4732]: I0131 09:20:13.717755 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-storage-2" Jan 31 09:20:13 crc kubenswrapper[4732]: I0131 09:20:13.731209 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-storage-1" Jan 31 09:20:13 crc kubenswrapper[4732]: I0131 09:20:13.742021 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-proxy-7d8cf99555-lvwdq"] Jan 31 09:20:13 crc kubenswrapper[4732]: I0131 09:20:13.754183 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-7q9kp" Jan 31 09:20:13 crc kubenswrapper[4732]: I0131 09:20:13.858868 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-storage-0"] Jan 31 09:20:13 crc kubenswrapper[4732]: I0131 09:20:13.864545 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e25bc49e-1bbe-4103-b751-fee5d86e7a92-swiftconf\") pod \"e25bc49e-1bbe-4103-b751-fee5d86e7a92\" (UID: \"e25bc49e-1bbe-4103-b751-fee5d86e7a92\") " Jan 31 09:20:13 crc kubenswrapper[4732]: I0131 09:20:13.864649 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e25bc49e-1bbe-4103-b751-fee5d86e7a92-ring-data-devices\") pod \"e25bc49e-1bbe-4103-b751-fee5d86e7a92\" (UID: \"e25bc49e-1bbe-4103-b751-fee5d86e7a92\") " Jan 31 09:20:13 crc kubenswrapper[4732]: I0131 09:20:13.864723 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e25bc49e-1bbe-4103-b751-fee5d86e7a92-scripts\") pod \"e25bc49e-1bbe-4103-b751-fee5d86e7a92\" (UID: \"e25bc49e-1bbe-4103-b751-fee5d86e7a92\") " Jan 31 09:20:13 crc kubenswrapper[4732]: I0131 09:20:13.864796 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e25bc49e-1bbe-4103-b751-fee5d86e7a92-etc-swift\") pod \"e25bc49e-1bbe-4103-b751-fee5d86e7a92\" (UID: \"e25bc49e-1bbe-4103-b751-fee5d86e7a92\") " Jan 31 09:20:13 crc kubenswrapper[4732]: I0131 09:20:13.864837 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e25bc49e-1bbe-4103-b751-fee5d86e7a92-dispersionconf\") pod \"e25bc49e-1bbe-4103-b751-fee5d86e7a92\" (UID: \"e25bc49e-1bbe-4103-b751-fee5d86e7a92\") " Jan 31 09:20:13 crc kubenswrapper[4732]: I0131 09:20:13.864866 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54c9x\" (UniqueName: \"kubernetes.io/projected/e25bc49e-1bbe-4103-b751-fee5d86e7a92-kube-api-access-54c9x\") pod \"e25bc49e-1bbe-4103-b751-fee5d86e7a92\" (UID: \"e25bc49e-1bbe-4103-b751-fee5d86e7a92\") " Jan 31 09:20:13 crc kubenswrapper[4732]: I0131 09:20:13.865428 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e25bc49e-1bbe-4103-b751-fee5d86e7a92-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "e25bc49e-1bbe-4103-b751-fee5d86e7a92" (UID: "e25bc49e-1bbe-4103-b751-fee5d86e7a92"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:20:13 crc kubenswrapper[4732]: I0131 09:20:13.865992 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e25bc49e-1bbe-4103-b751-fee5d86e7a92-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "e25bc49e-1bbe-4103-b751-fee5d86e7a92" (UID: "e25bc49e-1bbe-4103-b751-fee5d86e7a92"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:20:13 crc kubenswrapper[4732]: I0131 09:20:13.880517 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e25bc49e-1bbe-4103-b751-fee5d86e7a92-kube-api-access-54c9x" (OuterVolumeSpecName: "kube-api-access-54c9x") pod "e25bc49e-1bbe-4103-b751-fee5d86e7a92" (UID: "e25bc49e-1bbe-4103-b751-fee5d86e7a92"). InnerVolumeSpecName "kube-api-access-54c9x". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:20:13 crc kubenswrapper[4732]: I0131 09:20:13.905449 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e25bc49e-1bbe-4103-b751-fee5d86e7a92-scripts" (OuterVolumeSpecName: "scripts") pod "e25bc49e-1bbe-4103-b751-fee5d86e7a92" (UID: "e25bc49e-1bbe-4103-b751-fee5d86e7a92"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:20:13 crc kubenswrapper[4732]: I0131 09:20:13.915892 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e25bc49e-1bbe-4103-b751-fee5d86e7a92-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "e25bc49e-1bbe-4103-b751-fee5d86e7a92" (UID: "e25bc49e-1bbe-4103-b751-fee5d86e7a92"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:20:13 crc kubenswrapper[4732]: I0131 09:20:13.917166 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e25bc49e-1bbe-4103-b751-fee5d86e7a92-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "e25bc49e-1bbe-4103-b751-fee5d86e7a92" (UID: "e25bc49e-1bbe-4103-b751-fee5d86e7a92"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:20:13 crc kubenswrapper[4732]: I0131 09:20:13.966622 4732 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e25bc49e-1bbe-4103-b751-fee5d86e7a92-swiftconf\") on node \"crc\" DevicePath \"\"" Jan 31 09:20:13 crc kubenswrapper[4732]: I0131 09:20:13.966652 4732 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e25bc49e-1bbe-4103-b751-fee5d86e7a92-ring-data-devices\") on node \"crc\" DevicePath \"\"" Jan 31 09:20:13 crc kubenswrapper[4732]: I0131 09:20:13.966680 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e25bc49e-1bbe-4103-b751-fee5d86e7a92-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:20:13 crc kubenswrapper[4732]: I0131 09:20:13.966689 4732 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e25bc49e-1bbe-4103-b751-fee5d86e7a92-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 31 09:20:13 crc kubenswrapper[4732]: I0131 09:20:13.966698 4732 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e25bc49e-1bbe-4103-b751-fee5d86e7a92-dispersionconf\") on node \"crc\" DevicePath \"\"" Jan 31 09:20:13 crc kubenswrapper[4732]: I0131 09:20:13.966707 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-54c9x\" (UniqueName: \"kubernetes.io/projected/e25bc49e-1bbe-4103-b751-fee5d86e7a92-kube-api-access-54c9x\") on node \"crc\" DevicePath \"\"" Jan 31 09:20:14 crc kubenswrapper[4732]: I0131 09:20:14.260805 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-storage-2"] Jan 31 
09:20:14 crc kubenswrapper[4732]: W0131 09:20:14.262862 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod18b68f5e_a1b4_4f52_9a4e_5967735ec105.slice/crio-9213fe58caadec775e92ed28d6bba2a8ef3c6d5b541e8e1e305d2da7b48f0916 WatchSource:0}: Error finding container 9213fe58caadec775e92ed28d6bba2a8ef3c6d5b541e8e1e305d2da7b48f0916: Status 404 returned error can't find the container with id 9213fe58caadec775e92ed28d6bba2a8ef3c6d5b541e8e1e305d2da7b48f0916 Jan 31 09:20:14 crc kubenswrapper[4732]: I0131 09:20:14.333075 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-storage-1"] Jan 31 09:20:14 crc kubenswrapper[4732]: W0131 09:20:14.344576 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb04e24b_fc92_4f2e_abcb_fa46706f699a.slice/crio-484b2ea8fc5f3e2a8486b0977aa570fbe958192eb24867f074d00e930ab4b1a3 WatchSource:0}: Error finding container 484b2ea8fc5f3e2a8486b0977aa570fbe958192eb24867f074d00e930ab4b1a3: Status 404 returned error can't find the container with id 484b2ea8fc5f3e2a8486b0977aa570fbe958192eb24867f074d00e930ab4b1a3 Jan 31 09:20:14 crc kubenswrapper[4732]: I0131 09:20:14.503991 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"ea3117d7-0038-4ca5-bee5-ae76db9a12eb","Type":"ContainerStarted","Data":"6e7852a660c0fcf3225ca272b8e2af7fc11735fb24d2d14460cde6438a7beb2d"} Jan 31 09:20:14 crc kubenswrapper[4732]: I0131 09:20:14.504032 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"ea3117d7-0038-4ca5-bee5-ae76db9a12eb","Type":"ContainerStarted","Data":"8f37816afb654d1e776d0d4eec1d440d5af5c0af6bb7163bf1822d00c7129008"} Jan 31 09:20:14 crc kubenswrapper[4732]: I0131 09:20:14.504041 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"ea3117d7-0038-4ca5-bee5-ae76db9a12eb","Type":"ContainerStarted","Data":"d31d074c83b389c8db51d7fd48db465e4e9e513ed6084f0c730eaa69444cb6c5"} Jan 31 09:20:14 crc kubenswrapper[4732]: I0131 09:20:14.504049 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"ea3117d7-0038-4ca5-bee5-ae76db9a12eb","Type":"ContainerStarted","Data":"6568c920e3e41f0ca77451bc255f08043b793e9b23cf5a92a349e3e4e75234e1"} Jan 31 09:20:14 crc kubenswrapper[4732]: I0131 09:20:14.505644 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-proxy-7d8cf99555-lvwdq" event={"ID":"8cb5e63b-882d-4388-abb1-130923832c9f","Type":"ContainerStarted","Data":"4835492a2361cd167a0d1bc3cdf3c2b8f82043e40e4867b9666ee8ab6a6362e6"} Jan 31 09:20:14 crc kubenswrapper[4732]: I0131 09:20:14.505693 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-proxy-7d8cf99555-lvwdq" event={"ID":"8cb5e63b-882d-4388-abb1-130923832c9f","Type":"ContainerStarted","Data":"783febaf4bf86ebe7b225017d95af46ce165aac17683180fbdfbb6b8367cdec5"} Jan 31 09:20:14 crc kubenswrapper[4732]: I0131 09:20:14.505707 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-proxy-7d8cf99555-lvwdq" event={"ID":"8cb5e63b-882d-4388-abb1-130923832c9f","Type":"ContainerStarted","Data":"32df1a9318cd9e4682742a9570f27db68b7c7b206dbaec4cd560f06f827fb57e"} Jan 31 09:20:14 crc kubenswrapper[4732]: I0131 09:20:14.505795 4732 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="swift-kuttl-tests/swift-proxy-7d8cf99555-lvwdq" Jan 31 09:20:14 crc kubenswrapper[4732]: I0131 09:20:14.505826 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="swift-kuttl-tests/swift-proxy-7d8cf99555-lvwdq" Jan 31 09:20:14 crc kubenswrapper[4732]: I0131 09:20:14.507016 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"eb04e24b-fc92-4f2e-abcb-fa46706f699a","Type":"ContainerStarted","Data":"484b2ea8fc5f3e2a8486b0977aa570fbe958192eb24867f074d00e930ab4b1a3"} Jan 31 09:20:14 crc kubenswrapper[4732]: I0131 09:20:14.508438 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"18b68f5e-a1b4-4f52-9a4e-5967735ec105","Type":"ContainerStarted","Data":"9213fe58caadec775e92ed28d6bba2a8ef3c6d5b541e8e1e305d2da7b48f0916"} Jan 31 09:20:14 crc kubenswrapper[4732]: I0131 09:20:14.514221 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-7q9kp" event={"ID":"e25bc49e-1bbe-4103-b751-fee5d86e7a92","Type":"ContainerDied","Data":"2e9feb5a7bd4dc52e920699e4073a986cd013aa621986002fa31f113c2896401"} Jan 31 09:20:14 crc kubenswrapper[4732]: I0131 09:20:14.514253 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e9feb5a7bd4dc52e920699e4073a986cd013aa621986002fa31f113c2896401" Jan 31 09:20:14 crc kubenswrapper[4732]: I0131 09:20:14.514330 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-7q9kp" Jan 31 09:20:14 crc kubenswrapper[4732]: I0131 09:20:14.537675 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-proxy-7d8cf99555-lvwdq" podStartSLOduration=17.537636681 podStartE2EDuration="17.537636681s" podCreationTimestamp="2026-01-31 09:19:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:20:14.53181473 +0000 UTC m=+1152.837690934" watchObservedRunningTime="2026-01-31 09:20:14.537636681 +0000 UTC m=+1152.843512885" Jan 31 09:20:15 crc kubenswrapper[4732]: I0131 09:20:15.540265 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"eb04e24b-fc92-4f2e-abcb-fa46706f699a","Type":"ContainerStarted","Data":"498871caf245c1aa003224c568a9512c564fb67d15571cad0c2fa636e92058ab"} Jan 31 09:20:15 crc kubenswrapper[4732]: I0131 09:20:15.540326 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"eb04e24b-fc92-4f2e-abcb-fa46706f699a","Type":"ContainerStarted","Data":"caafb470bfca795c3c7f1107e8a9e97700d33c459a35140d551bfc62e596d5a6"} Jan 31 09:20:15 crc kubenswrapper[4732]: I0131 09:20:15.540342 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"eb04e24b-fc92-4f2e-abcb-fa46706f699a","Type":"ContainerStarted","Data":"647e5cb59a1fa496ac972ece6ee175c218e028f20f65b8bd02f6b5c3d914e53c"} Jan 31 09:20:15 crc kubenswrapper[4732]: I0131 09:20:15.540355 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"eb04e24b-fc92-4f2e-abcb-fa46706f699a","Type":"ContainerStarted","Data":"1a43f64c141cb300a8ec9eb5a477bb5c1933446a32abc193cb5173f92ba21178"} Jan 31 09:20:15 crc kubenswrapper[4732]: I0131 09:20:15.540370 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="swift-kuttl-tests/swift-storage-1" event={"ID":"eb04e24b-fc92-4f2e-abcb-fa46706f699a","Type":"ContainerStarted","Data":"a00d79844fd0811e549149a48676d54976a695ed7a1497b1125e383f358fdc3c"} Jan 31 09:20:15 crc kubenswrapper[4732]: I0131 09:20:15.546166 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"18b68f5e-a1b4-4f52-9a4e-5967735ec105","Type":"ContainerStarted","Data":"672f5feadfa214c6a0a943e614b3f6be2205caf0cfad789bc48eecd5126064c9"} Jan 31 09:20:15 crc kubenswrapper[4732]: I0131 09:20:15.546268 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"18b68f5e-a1b4-4f52-9a4e-5967735ec105","Type":"ContainerStarted","Data":"8fa21222e4cd9c26bf7a8699721788dc5f3bcbad179ebfca9e1ff5dd4f7eca9f"} Jan 31 09:20:15 crc kubenswrapper[4732]: I0131 09:20:15.546290 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"18b68f5e-a1b4-4f52-9a4e-5967735ec105","Type":"ContainerStarted","Data":"72926ae802ba11d13ab77c79539bfcc94151936d207f632232dd7d1f41349d6d"} Jan 31 09:20:15 crc kubenswrapper[4732]: I0131 09:20:15.546304 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"18b68f5e-a1b4-4f52-9a4e-5967735ec105","Type":"ContainerStarted","Data":"fa17ab717c5754facbf3da31cb4afd3477236bd426c65c9122642f27b5886fb9"} Jan 31 09:20:15 crc kubenswrapper[4732]: I0131 09:20:15.546361 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"18b68f5e-a1b4-4f52-9a4e-5967735ec105","Type":"ContainerStarted","Data":"caac1f6da96d02d4cab51830f080cd652323af90c1836570aef955dde0207cff"} Jan 31 09:20:15 crc kubenswrapper[4732]: I0131 09:20:15.552629 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"ea3117d7-0038-4ca5-bee5-ae76db9a12eb","Type":"ContainerStarted","Data":"655a856811f2b949bc2c94495ba00a853818330f4ec09b9aeffd1751288aff79"} Jan 31 09:20:15 crc kubenswrapper[4732]: I0131 09:20:15.552693 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"ea3117d7-0038-4ca5-bee5-ae76db9a12eb","Type":"ContainerStarted","Data":"61d667d399369be98c22e8ccc9f3b4cb18a2a2b489b34f036b2fc88c0e71e429"} Jan 31 09:20:15 crc kubenswrapper[4732]: I0131 09:20:15.552706 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"ea3117d7-0038-4ca5-bee5-ae76db9a12eb","Type":"ContainerStarted","Data":"039961fef86b21e927d5ebdf1e2eb67c58775d812f1269411f3f0411c895a43f"} Jan 31 09:20:15 crc kubenswrapper[4732]: I0131 09:20:15.552718 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"ea3117d7-0038-4ca5-bee5-ae76db9a12eb","Type":"ContainerStarted","Data":"ada6f8ad69317a784118495c96beb36d701ce7b36209a86b9a936d7cc110e3c2"} Jan 31 09:20:15 crc kubenswrapper[4732]: I0131 09:20:15.552730 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"ea3117d7-0038-4ca5-bee5-ae76db9a12eb","Type":"ContainerStarted","Data":"8d252efa39cfce425e89f4670019c9177c3b0c93daee21298b31748b98d341f0"} Jan 31 09:20:16 crc kubenswrapper[4732]: I0131 09:20:16.563417 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" 
event={"ID":"ea3117d7-0038-4ca5-bee5-ae76db9a12eb","Type":"ContainerStarted","Data":"b50b39b84c6087e4699184efc06a407c325204a64a826bce844fec4c5164858e"} Jan 31 09:20:16 crc kubenswrapper[4732]: I0131 09:20:16.564732 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"ea3117d7-0038-4ca5-bee5-ae76db9a12eb","Type":"ContainerStarted","Data":"3a8ebcfcb039bb39617fcee0e053ba202cf3bccf0891372d90f2647abbe5c1ea"} Jan 31 09:20:16 crc kubenswrapper[4732]: I0131 09:20:16.564826 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"ea3117d7-0038-4ca5-bee5-ae76db9a12eb","Type":"ContainerStarted","Data":"420022f2e68e290b989b3165de84484bcc24a80cd43dfb94fa6e26433aff9a55"} Jan 31 09:20:16 crc kubenswrapper[4732]: I0131 09:20:16.564887 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"ea3117d7-0038-4ca5-bee5-ae76db9a12eb","Type":"ContainerStarted","Data":"fd00da0aa47cdd0410c40f7add08c30b0950cfafc21a202b2619d006f368871b"} Jan 31 09:20:16 crc kubenswrapper[4732]: I0131 09:20:16.564941 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"ea3117d7-0038-4ca5-bee5-ae76db9a12eb","Type":"ContainerStarted","Data":"78f74f08eb824105edcd5f1f9501b3fa915219d39b763fb391f67d7c1c054292"} Jan 31 09:20:16 crc kubenswrapper[4732]: I0131 09:20:16.587982 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"eb04e24b-fc92-4f2e-abcb-fa46706f699a","Type":"ContainerStarted","Data":"61eb8b513a113fbabe0d9a8fc9c197f8afac97222ddfaaa712ed87f070443521"} Jan 31 09:20:16 crc kubenswrapper[4732]: I0131 09:20:16.588113 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"eb04e24b-fc92-4f2e-abcb-fa46706f699a","Type":"ContainerStarted","Data":"c9a37bdf8b63464e6bd6756605e3b1a2408e5416d00cda661d4d274b38bd7be7"} Jan 31 09:20:16 crc kubenswrapper[4732]: I0131 09:20:16.588171 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"eb04e24b-fc92-4f2e-abcb-fa46706f699a","Type":"ContainerStarted","Data":"bde9eaf63bb5ab51e11bacddaba56989bae7d6031fa19e2c1b77f92866dadea8"} Jan 31 09:20:16 crc kubenswrapper[4732]: I0131 09:20:16.588225 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"eb04e24b-fc92-4f2e-abcb-fa46706f699a","Type":"ContainerStarted","Data":"c9736add9e20e2e6d821daa6edc69baaef51fa7d36762e678ee4128cf4df9dbd"} Jan 31 09:20:16 crc kubenswrapper[4732]: I0131 09:20:16.615536 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"18b68f5e-a1b4-4f52-9a4e-5967735ec105","Type":"ContainerStarted","Data":"463912ed98b787656c483ddcf4b73e963647c6d5df261cfe3fad78758d6d1b9d"} Jan 31 09:20:16 crc kubenswrapper[4732]: I0131 09:20:16.615577 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"18b68f5e-a1b4-4f52-9a4e-5967735ec105","Type":"ContainerStarted","Data":"7ce880a4ac662620fb475dd18995992d32b8be418497d54beba8a09c1335c1aa"} Jan 31 09:20:16 crc kubenswrapper[4732]: I0131 09:20:16.615587 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" 
event={"ID":"18b68f5e-a1b4-4f52-9a4e-5967735ec105","Type":"ContainerStarted","Data":"54fada92647ba12febbb920093f9bc8e7464da78204240a5a00d06baf380ab69"} Jan 31 09:20:16 crc kubenswrapper[4732]: I0131 09:20:16.615595 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"18b68f5e-a1b4-4f52-9a4e-5967735ec105","Type":"ContainerStarted","Data":"fd1347f500390657be8ce2ee2c537ab8800d5072a87c360ae87e57f1fb6a1882"} Jan 31 09:20:17 crc kubenswrapper[4732]: I0131 09:20:17.498081 4732 patch_prober.go:28] interesting pod/machine-config-daemon-jnbt8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 09:20:17 crc kubenswrapper[4732]: I0131 09:20:17.498392 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" podUID="7d790207-d357-4b47-87bf-5b505e061820" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 09:20:17 crc kubenswrapper[4732]: I0131 09:20:17.498434 4732 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" Jan 31 09:20:17 crc kubenswrapper[4732]: I0131 09:20:17.499023 4732 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"99271a603de3a603b9be8d8f0bb791de0f202646de27403a5e3efc59790f637f"} pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 31 09:20:17 crc kubenswrapper[4732]: I0131 09:20:17.499078 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" podUID="7d790207-d357-4b47-87bf-5b505e061820" containerName="machine-config-daemon" containerID="cri-o://99271a603de3a603b9be8d8f0bb791de0f202646de27403a5e3efc59790f637f" gracePeriod=600 Jan 31 09:20:17 crc kubenswrapper[4732]: I0131 09:20:17.627043 4732 generic.go:334] "Generic (PLEG): container finished" podID="7d790207-d357-4b47-87bf-5b505e061820" containerID="99271a603de3a603b9be8d8f0bb791de0f202646de27403a5e3efc59790f637f" exitCode=0 Jan 31 09:20:17 crc kubenswrapper[4732]: I0131 09:20:17.627111 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" event={"ID":"7d790207-d357-4b47-87bf-5b505e061820","Type":"ContainerDied","Data":"99271a603de3a603b9be8d8f0bb791de0f202646de27403a5e3efc59790f637f"} Jan 31 09:20:17 crc kubenswrapper[4732]: I0131 09:20:17.627163 4732 scope.go:117] "RemoveContainer" containerID="7aa4af54c816ede2288939b5eacffccc23edb9afc7e2a36ef42fe01d52b4ae91" Jan 31 09:20:17 crc kubenswrapper[4732]: I0131 09:20:17.636524 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"eb04e24b-fc92-4f2e-abcb-fa46706f699a","Type":"ContainerStarted","Data":"4591c4a657bf0f19e528acc7cb5586ad0a5dffe6072302f778c6b3daec4b890e"} Jan 31 09:20:17 crc kubenswrapper[4732]: I0131 09:20:17.636584 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" 
event={"ID":"eb04e24b-fc92-4f2e-abcb-fa46706f699a","Type":"ContainerStarted","Data":"9bd8439c812d4db4031eaf201411d9f03f68a7e4fb620cd284c1316ed57ea8e9"} Jan 31 09:20:17 crc kubenswrapper[4732]: I0131 09:20:17.636599 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"eb04e24b-fc92-4f2e-abcb-fa46706f699a","Type":"ContainerStarted","Data":"f7740f3734b2a6d8e15826537feeb63febadea03ccc4bc0d6f70700f6a75626f"} Jan 31 09:20:17 crc kubenswrapper[4732]: I0131 09:20:17.636612 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"eb04e24b-fc92-4f2e-abcb-fa46706f699a","Type":"ContainerStarted","Data":"22026d373e04df27564ec7eb67c62bdb36d5f926aefc6470f05958ca3bfafb6d"} Jan 31 09:20:17 crc kubenswrapper[4732]: I0131 09:20:17.636624 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"eb04e24b-fc92-4f2e-abcb-fa46706f699a","Type":"ContainerStarted","Data":"36c3a43f98d1be3fa4067ddd849e1f8adb09f6830fb90e1e103b4fee901b5ddb"} Jan 31 09:20:17 crc kubenswrapper[4732]: I0131 09:20:17.636636 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"eb04e24b-fc92-4f2e-abcb-fa46706f699a","Type":"ContainerStarted","Data":"3bedc7b7578ebe9948019c2fe597f44f31339a72681e217b76908ee8888b902f"} Jan 31 09:20:17 crc kubenswrapper[4732]: I0131 09:20:17.649580 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"18b68f5e-a1b4-4f52-9a4e-5967735ec105","Type":"ContainerStarted","Data":"261d59a3c1b5662c1245d9e0a26d6fdf5a756126f782cf23384b915262209828"} Jan 31 09:20:17 crc kubenswrapper[4732]: I0131 09:20:17.649649 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"18b68f5e-a1b4-4f52-9a4e-5967735ec105","Type":"ContainerStarted","Data":"01d34a0e2eb5b105a83439a771538234555a443b0c13ee4cb087d83ae6e5a172"} Jan 31 09:20:17 crc kubenswrapper[4732]: I0131 09:20:17.649679 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"18b68f5e-a1b4-4f52-9a4e-5967735ec105","Type":"ContainerStarted","Data":"caaaa31e0459d9ecc2e8474a50fcbdb26784a11af19cecd7546e523bc30b1bb1"} Jan 31 09:20:17 crc kubenswrapper[4732]: I0131 09:20:17.649692 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"18b68f5e-a1b4-4f52-9a4e-5967735ec105","Type":"ContainerStarted","Data":"3c487459ef32558f2157dbf1c508505bc4dd091b7590c38ba37b19a62a7f80b1"} Jan 31 09:20:17 crc kubenswrapper[4732]: I0131 09:20:17.649707 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"18b68f5e-a1b4-4f52-9a4e-5967735ec105","Type":"ContainerStarted","Data":"a8e090887d88b45d44e7f952ad72621d08fce059d8e6698d9ad0a38d3082acc9"} Jan 31 09:20:17 crc kubenswrapper[4732]: I0131 09:20:17.649717 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"18b68f5e-a1b4-4f52-9a4e-5967735ec105","Type":"ContainerStarted","Data":"3b885dbb98c02c11a228e346c436a702bd2867b63763347fde6a9f9094c1d016"} Jan 31 09:20:17 crc kubenswrapper[4732]: I0131 09:20:17.657440 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" 
event={"ID":"ea3117d7-0038-4ca5-bee5-ae76db9a12eb","Type":"ContainerStarted","Data":"186ff48385c2286a698108d59672234436dfc7dbb5bac5e777070affa544b217"} Jan 31 09:20:17 crc kubenswrapper[4732]: I0131 09:20:17.657487 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"ea3117d7-0038-4ca5-bee5-ae76db9a12eb","Type":"ContainerStarted","Data":"f7bb7853880d63c47b634d75defb148b4ed41cf77de6d39f02020df2ff5b03d3"} Jan 31 09:20:17 crc kubenswrapper[4732]: I0131 09:20:17.674781 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-storage-1" podStartSLOduration=21.674763849 podStartE2EDuration="21.674763849s" podCreationTimestamp="2026-01-31 09:19:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:20:17.671355673 +0000 UTC m=+1155.977231877" watchObservedRunningTime="2026-01-31 09:20:17.674763849 +0000 UTC m=+1155.980640053" Jan 31 09:20:17 crc kubenswrapper[4732]: I0131 09:20:17.714480 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-storage-2" podStartSLOduration=21.714465738 podStartE2EDuration="21.714465738s" podCreationTimestamp="2026-01-31 09:19:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:20:17.710283059 +0000 UTC m=+1156.016159263" watchObservedRunningTime="2026-01-31 09:20:17.714465738 +0000 UTC m=+1156.020341942" Jan 31 09:20:17 crc kubenswrapper[4732]: I0131 09:20:17.759931 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-storage-0" podStartSLOduration=21.759910175999998 podStartE2EDuration="21.759910176s" podCreationTimestamp="2026-01-31 09:19:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:20:17.753118505 +0000 UTC m=+1156.058994719" watchObservedRunningTime="2026-01-31 09:20:17.759910176 +0000 UTC m=+1156.065786380" Jan 31 09:20:18 crc kubenswrapper[4732]: I0131 09:20:18.299699 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="swift-kuttl-tests/swift-proxy-7d8cf99555-lvwdq" Jan 31 09:20:18 crc kubenswrapper[4732]: I0131 09:20:18.667077 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" event={"ID":"7d790207-d357-4b47-87bf-5b505e061820","Type":"ContainerStarted","Data":"777b6bb11b5556f90e1c2a08822928a50217112bcd9efce47de0d5e1a98e3392"} Jan 31 09:20:23 crc kubenswrapper[4732]: I0131 09:20:23.296480 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="swift-kuttl-tests/swift-proxy-7d8cf99555-lvwdq" Jan 31 09:20:24 crc kubenswrapper[4732]: I0131 09:20:24.693840 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-95zn5"] Jan 31 09:20:24 crc kubenswrapper[4732]: E0131 09:20:24.694235 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e25bc49e-1bbe-4103-b751-fee5d86e7a92" containerName="swift-ring-rebalance" Jan 31 09:20:24 crc kubenswrapper[4732]: I0131 09:20:24.694252 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="e25bc49e-1bbe-4103-b751-fee5d86e7a92" containerName="swift-ring-rebalance" Jan 31 09:20:24 crc kubenswrapper[4732]: I0131 09:20:24.694429 4732 
Jan 31 09:20:24 crc kubenswrapper[4732]: I0131 09:20:24.695004 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-95zn5"
Jan 31 09:20:24 crc kubenswrapper[4732]: I0131 09:20:24.697649 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts"
Jan 31 09:20:24 crc kubenswrapper[4732]: I0131 09:20:24.697752 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data"
Jan 31 09:20:24 crc kubenswrapper[4732]: I0131 09:20:24.709240 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-95zn5"]
Jan 31 09:20:24 crc kubenswrapper[4732]: I0131 09:20:24.754007 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/957b624e-aeb2-4942-bf7d-9ee57f9d8462-dispersionconf\") pod \"swift-ring-rebalance-debug-95zn5\" (UID: \"957b624e-aeb2-4942-bf7d-9ee57f9d8462\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-95zn5"
Jan 31 09:20:24 crc kubenswrapper[4732]: I0131 09:20:24.754060 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/957b624e-aeb2-4942-bf7d-9ee57f9d8462-scripts\") pod \"swift-ring-rebalance-debug-95zn5\" (UID: \"957b624e-aeb2-4942-bf7d-9ee57f9d8462\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-95zn5"
Jan 31 09:20:24 crc kubenswrapper[4732]: I0131 09:20:24.754076 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/957b624e-aeb2-4942-bf7d-9ee57f9d8462-etc-swift\") pod \"swift-ring-rebalance-debug-95zn5\" (UID: \"957b624e-aeb2-4942-bf7d-9ee57f9d8462\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-95zn5"
Jan 31 09:20:24 crc kubenswrapper[4732]: I0131 09:20:24.754325 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/957b624e-aeb2-4942-bf7d-9ee57f9d8462-swiftconf\") pod \"swift-ring-rebalance-debug-95zn5\" (UID: \"957b624e-aeb2-4942-bf7d-9ee57f9d8462\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-95zn5"
Jan 31 09:20:24 crc kubenswrapper[4732]: I0131 09:20:24.754524 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/957b624e-aeb2-4942-bf7d-9ee57f9d8462-ring-data-devices\") pod \"swift-ring-rebalance-debug-95zn5\" (UID: \"957b624e-aeb2-4942-bf7d-9ee57f9d8462\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-95zn5"
Jan 31 09:20:24 crc kubenswrapper[4732]: I0131 09:20:24.754576 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzwl8\" (UniqueName: \"kubernetes.io/projected/957b624e-aeb2-4942-bf7d-9ee57f9d8462-kube-api-access-lzwl8\") pod \"swift-ring-rebalance-debug-95zn5\" (UID: \"957b624e-aeb2-4942-bf7d-9ee57f9d8462\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-95zn5"
Jan 31 09:20:24 crc kubenswrapper[4732]: I0131 09:20:24.856237 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/957b624e-aeb2-4942-bf7d-9ee57f9d8462-scripts\") pod \"swift-ring-rebalance-debug-95zn5\" (UID: \"957b624e-aeb2-4942-bf7d-9ee57f9d8462\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-95zn5"
(UniqueName: \"kubernetes.io/configmap/957b624e-aeb2-4942-bf7d-9ee57f9d8462-scripts\") pod \"swift-ring-rebalance-debug-95zn5\" (UID: \"957b624e-aeb2-4942-bf7d-9ee57f9d8462\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-95zn5" Jan 31 09:20:24 crc kubenswrapper[4732]: I0131 09:20:24.856276 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/957b624e-aeb2-4942-bf7d-9ee57f9d8462-etc-swift\") pod \"swift-ring-rebalance-debug-95zn5\" (UID: \"957b624e-aeb2-4942-bf7d-9ee57f9d8462\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-95zn5" Jan 31 09:20:24 crc kubenswrapper[4732]: I0131 09:20:24.856320 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/957b624e-aeb2-4942-bf7d-9ee57f9d8462-swiftconf\") pod \"swift-ring-rebalance-debug-95zn5\" (UID: \"957b624e-aeb2-4942-bf7d-9ee57f9d8462\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-95zn5" Jan 31 09:20:24 crc kubenswrapper[4732]: I0131 09:20:24.856386 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/957b624e-aeb2-4942-bf7d-9ee57f9d8462-ring-data-devices\") pod \"swift-ring-rebalance-debug-95zn5\" (UID: \"957b624e-aeb2-4942-bf7d-9ee57f9d8462\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-95zn5" Jan 31 09:20:24 crc kubenswrapper[4732]: I0131 09:20:24.856405 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzwl8\" (UniqueName: \"kubernetes.io/projected/957b624e-aeb2-4942-bf7d-9ee57f9d8462-kube-api-access-lzwl8\") pod \"swift-ring-rebalance-debug-95zn5\" (UID: \"957b624e-aeb2-4942-bf7d-9ee57f9d8462\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-95zn5" Jan 31 09:20:24 crc kubenswrapper[4732]: I0131 09:20:24.856423 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/957b624e-aeb2-4942-bf7d-9ee57f9d8462-dispersionconf\") pod \"swift-ring-rebalance-debug-95zn5\" (UID: \"957b624e-aeb2-4942-bf7d-9ee57f9d8462\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-95zn5" Jan 31 09:20:24 crc kubenswrapper[4732]: I0131 09:20:24.857128 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/957b624e-aeb2-4942-bf7d-9ee57f9d8462-ring-data-devices\") pod \"swift-ring-rebalance-debug-95zn5\" (UID: \"957b624e-aeb2-4942-bf7d-9ee57f9d8462\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-95zn5" Jan 31 09:20:24 crc kubenswrapper[4732]: I0131 09:20:24.857196 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/957b624e-aeb2-4942-bf7d-9ee57f9d8462-scripts\") pod \"swift-ring-rebalance-debug-95zn5\" (UID: \"957b624e-aeb2-4942-bf7d-9ee57f9d8462\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-95zn5" Jan 31 09:20:24 crc kubenswrapper[4732]: I0131 09:20:24.857352 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/957b624e-aeb2-4942-bf7d-9ee57f9d8462-etc-swift\") pod \"swift-ring-rebalance-debug-95zn5\" (UID: \"957b624e-aeb2-4942-bf7d-9ee57f9d8462\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-95zn5" Jan 31 09:20:24 crc kubenswrapper[4732]: I0131 09:20:24.861992 4732 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/957b624e-aeb2-4942-bf7d-9ee57f9d8462-swiftconf\") pod \"swift-ring-rebalance-debug-95zn5\" (UID: \"957b624e-aeb2-4942-bf7d-9ee57f9d8462\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-95zn5" Jan 31 09:20:24 crc kubenswrapper[4732]: I0131 09:20:24.862314 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/957b624e-aeb2-4942-bf7d-9ee57f9d8462-dispersionconf\") pod \"swift-ring-rebalance-debug-95zn5\" (UID: \"957b624e-aeb2-4942-bf7d-9ee57f9d8462\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-95zn5" Jan 31 09:20:24 crc kubenswrapper[4732]: I0131 09:20:24.880860 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzwl8\" (UniqueName: \"kubernetes.io/projected/957b624e-aeb2-4942-bf7d-9ee57f9d8462-kube-api-access-lzwl8\") pod \"swift-ring-rebalance-debug-95zn5\" (UID: \"957b624e-aeb2-4942-bf7d-9ee57f9d8462\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-95zn5" Jan 31 09:20:25 crc kubenswrapper[4732]: I0131 09:20:25.023026 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-95zn5" Jan 31 09:20:25 crc kubenswrapper[4732]: I0131 09:20:25.476351 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-95zn5"] Jan 31 09:20:25 crc kubenswrapper[4732]: W0131 09:20:25.477207 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod957b624e_aeb2_4942_bf7d_9ee57f9d8462.slice/crio-bb4258966ea064b1f1ecc473c6e6e539698b32e5021c92e78612f7004423ee2f WatchSource:0}: Error finding container bb4258966ea064b1f1ecc473c6e6e539698b32e5021c92e78612f7004423ee2f: Status 404 returned error can't find the container with id bb4258966ea064b1f1ecc473c6e6e539698b32e5021c92e78612f7004423ee2f Jan 31 09:20:25 crc kubenswrapper[4732]: I0131 09:20:25.741707 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-95zn5" event={"ID":"957b624e-aeb2-4942-bf7d-9ee57f9d8462","Type":"ContainerStarted","Data":"5094c199168505dd687c51c34f8737c8e1d1a70fa2bf148748212d9f385b755e"} Jan 31 09:20:25 crc kubenswrapper[4732]: I0131 09:20:25.741953 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-95zn5" event={"ID":"957b624e-aeb2-4942-bf7d-9ee57f9d8462","Type":"ContainerStarted","Data":"bb4258966ea064b1f1ecc473c6e6e539698b32e5021c92e78612f7004423ee2f"} Jan 31 09:20:25 crc kubenswrapper[4732]: I0131 09:20:25.762470 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-95zn5" podStartSLOduration=1.762443922 podStartE2EDuration="1.762443922s" podCreationTimestamp="2026-01-31 09:20:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:20:25.755099384 +0000 UTC m=+1164.060975588" watchObservedRunningTime="2026-01-31 09:20:25.762443922 +0000 UTC m=+1164.068320166" Jan 31 09:20:28 crc kubenswrapper[4732]: I0131 09:20:28.768792 4732 generic.go:334] "Generic (PLEG): container finished" podID="957b624e-aeb2-4942-bf7d-9ee57f9d8462" containerID="5094c199168505dd687c51c34f8737c8e1d1a70fa2bf148748212d9f385b755e" exitCode=0 Jan 31 09:20:28 crc kubenswrapper[4732]: I0131 09:20:28.768842 4732 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-95zn5" event={"ID":"957b624e-aeb2-4942-bf7d-9ee57f9d8462","Type":"ContainerDied","Data":"5094c199168505dd687c51c34f8737c8e1d1a70fa2bf148748212d9f385b755e"} Jan 31 09:20:30 crc kubenswrapper[4732]: I0131 09:20:30.035506 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-95zn5" Jan 31 09:20:30 crc kubenswrapper[4732]: I0131 09:20:30.067816 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-95zn5"] Jan 31 09:20:30 crc kubenswrapper[4732]: I0131 09:20:30.075059 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-95zn5"] Jan 31 09:20:30 crc kubenswrapper[4732]: I0131 09:20:30.139633 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/957b624e-aeb2-4942-bf7d-9ee57f9d8462-dispersionconf\") pod \"957b624e-aeb2-4942-bf7d-9ee57f9d8462\" (UID: \"957b624e-aeb2-4942-bf7d-9ee57f9d8462\") " Jan 31 09:20:30 crc kubenswrapper[4732]: I0131 09:20:30.139766 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/957b624e-aeb2-4942-bf7d-9ee57f9d8462-swiftconf\") pod \"957b624e-aeb2-4942-bf7d-9ee57f9d8462\" (UID: \"957b624e-aeb2-4942-bf7d-9ee57f9d8462\") " Jan 31 09:20:30 crc kubenswrapper[4732]: I0131 09:20:30.139798 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/957b624e-aeb2-4942-bf7d-9ee57f9d8462-ring-data-devices\") pod \"957b624e-aeb2-4942-bf7d-9ee57f9d8462\" (UID: \"957b624e-aeb2-4942-bf7d-9ee57f9d8462\") " Jan 31 09:20:30 crc kubenswrapper[4732]: I0131 09:20:30.139864 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/957b624e-aeb2-4942-bf7d-9ee57f9d8462-etc-swift\") pod \"957b624e-aeb2-4942-bf7d-9ee57f9d8462\" (UID: \"957b624e-aeb2-4942-bf7d-9ee57f9d8462\") " Jan 31 09:20:30 crc kubenswrapper[4732]: I0131 09:20:30.139905 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzwl8\" (UniqueName: \"kubernetes.io/projected/957b624e-aeb2-4942-bf7d-9ee57f9d8462-kube-api-access-lzwl8\") pod \"957b624e-aeb2-4942-bf7d-9ee57f9d8462\" (UID: \"957b624e-aeb2-4942-bf7d-9ee57f9d8462\") " Jan 31 09:20:30 crc kubenswrapper[4732]: I0131 09:20:30.140452 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/957b624e-aeb2-4942-bf7d-9ee57f9d8462-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "957b624e-aeb2-4942-bf7d-9ee57f9d8462" (UID: "957b624e-aeb2-4942-bf7d-9ee57f9d8462"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:20:30 crc kubenswrapper[4732]: I0131 09:20:30.140545 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/957b624e-aeb2-4942-bf7d-9ee57f9d8462-scripts\") pod \"957b624e-aeb2-4942-bf7d-9ee57f9d8462\" (UID: \"957b624e-aeb2-4942-bf7d-9ee57f9d8462\") " Jan 31 09:20:30 crc kubenswrapper[4732]: I0131 09:20:30.140680 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/957b624e-aeb2-4942-bf7d-9ee57f9d8462-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "957b624e-aeb2-4942-bf7d-9ee57f9d8462" (UID: "957b624e-aeb2-4942-bf7d-9ee57f9d8462"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:20:30 crc kubenswrapper[4732]: I0131 09:20:30.140909 4732 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/957b624e-aeb2-4942-bf7d-9ee57f9d8462-ring-data-devices\") on node \"crc\" DevicePath \"\"" Jan 31 09:20:30 crc kubenswrapper[4732]: I0131 09:20:30.140926 4732 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/957b624e-aeb2-4942-bf7d-9ee57f9d8462-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 31 09:20:30 crc kubenswrapper[4732]: I0131 09:20:30.145281 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/957b624e-aeb2-4942-bf7d-9ee57f9d8462-kube-api-access-lzwl8" (OuterVolumeSpecName: "kube-api-access-lzwl8") pod "957b624e-aeb2-4942-bf7d-9ee57f9d8462" (UID: "957b624e-aeb2-4942-bf7d-9ee57f9d8462"). InnerVolumeSpecName "kube-api-access-lzwl8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:20:30 crc kubenswrapper[4732]: I0131 09:20:30.160388 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/957b624e-aeb2-4942-bf7d-9ee57f9d8462-scripts" (OuterVolumeSpecName: "scripts") pod "957b624e-aeb2-4942-bf7d-9ee57f9d8462" (UID: "957b624e-aeb2-4942-bf7d-9ee57f9d8462"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:20:30 crc kubenswrapper[4732]: I0131 09:20:30.165114 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/957b624e-aeb2-4942-bf7d-9ee57f9d8462-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "957b624e-aeb2-4942-bf7d-9ee57f9d8462" (UID: "957b624e-aeb2-4942-bf7d-9ee57f9d8462"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:20:30 crc kubenswrapper[4732]: I0131 09:20:30.171315 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/957b624e-aeb2-4942-bf7d-9ee57f9d8462-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "957b624e-aeb2-4942-bf7d-9ee57f9d8462" (UID: "957b624e-aeb2-4942-bf7d-9ee57f9d8462"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:20:30 crc kubenswrapper[4732]: I0131 09:20:30.226684 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-7rl5h"] Jan 31 09:20:30 crc kubenswrapper[4732]: E0131 09:20:30.227003 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="957b624e-aeb2-4942-bf7d-9ee57f9d8462" containerName="swift-ring-rebalance" Jan 31 09:20:30 crc kubenswrapper[4732]: I0131 09:20:30.227024 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="957b624e-aeb2-4942-bf7d-9ee57f9d8462" containerName="swift-ring-rebalance" Jan 31 09:20:30 crc kubenswrapper[4732]: I0131 09:20:30.227310 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="957b624e-aeb2-4942-bf7d-9ee57f9d8462" containerName="swift-ring-rebalance" Jan 31 09:20:30 crc kubenswrapper[4732]: I0131 09:20:30.228119 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7rl5h" Jan 31 09:20:30 crc kubenswrapper[4732]: I0131 09:20:30.236288 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-7rl5h"] Jan 31 09:20:30 crc kubenswrapper[4732]: I0131 09:20:30.241956 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/957b624e-aeb2-4942-bf7d-9ee57f9d8462-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:20:30 crc kubenswrapper[4732]: I0131 09:20:30.241992 4732 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/957b624e-aeb2-4942-bf7d-9ee57f9d8462-dispersionconf\") on node \"crc\" DevicePath \"\"" Jan 31 09:20:30 crc kubenswrapper[4732]: I0131 09:20:30.242004 4732 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/957b624e-aeb2-4942-bf7d-9ee57f9d8462-swiftconf\") on node \"crc\" DevicePath \"\"" Jan 31 09:20:30 crc kubenswrapper[4732]: I0131 09:20:30.242016 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzwl8\" (UniqueName: \"kubernetes.io/projected/957b624e-aeb2-4942-bf7d-9ee57f9d8462-kube-api-access-lzwl8\") on node \"crc\" DevicePath \"\"" Jan 31 09:20:30 crc kubenswrapper[4732]: I0131 09:20:30.343527 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1add18f8-8147-4b68-ba76-f331c3e04734-ring-data-devices\") pod \"swift-ring-rebalance-debug-7rl5h\" (UID: \"1add18f8-8147-4b68-ba76-f331c3e04734\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7rl5h" Jan 31 09:20:30 crc kubenswrapper[4732]: I0131 09:20:30.343595 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1add18f8-8147-4b68-ba76-f331c3e04734-dispersionconf\") pod \"swift-ring-rebalance-debug-7rl5h\" (UID: \"1add18f8-8147-4b68-ba76-f331c3e04734\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7rl5h" Jan 31 09:20:30 crc kubenswrapper[4732]: I0131 09:20:30.343630 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zrzf\" (UniqueName: \"kubernetes.io/projected/1add18f8-8147-4b68-ba76-f331c3e04734-kube-api-access-6zrzf\") pod \"swift-ring-rebalance-debug-7rl5h\" (UID: \"1add18f8-8147-4b68-ba76-f331c3e04734\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-7rl5h" Jan 31 09:20:30 crc kubenswrapper[4732]: I0131 09:20:30.343702 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1add18f8-8147-4b68-ba76-f331c3e04734-swiftconf\") pod \"swift-ring-rebalance-debug-7rl5h\" (UID: \"1add18f8-8147-4b68-ba76-f331c3e04734\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7rl5h" Jan 31 09:20:30 crc kubenswrapper[4732]: I0131 09:20:30.343722 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1add18f8-8147-4b68-ba76-f331c3e04734-scripts\") pod \"swift-ring-rebalance-debug-7rl5h\" (UID: \"1add18f8-8147-4b68-ba76-f331c3e04734\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7rl5h" Jan 31 09:20:30 crc kubenswrapper[4732]: I0131 09:20:30.343749 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1add18f8-8147-4b68-ba76-f331c3e04734-etc-swift\") pod \"swift-ring-rebalance-debug-7rl5h\" (UID: \"1add18f8-8147-4b68-ba76-f331c3e04734\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7rl5h" Jan 31 09:20:30 crc kubenswrapper[4732]: I0131 09:20:30.444468 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1add18f8-8147-4b68-ba76-f331c3e04734-ring-data-devices\") pod \"swift-ring-rebalance-debug-7rl5h\" (UID: \"1add18f8-8147-4b68-ba76-f331c3e04734\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7rl5h" Jan 31 09:20:30 crc kubenswrapper[4732]: I0131 09:20:30.444545 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1add18f8-8147-4b68-ba76-f331c3e04734-dispersionconf\") pod \"swift-ring-rebalance-debug-7rl5h\" (UID: \"1add18f8-8147-4b68-ba76-f331c3e04734\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7rl5h" Jan 31 09:20:30 crc kubenswrapper[4732]: I0131 09:20:30.444579 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zrzf\" (UniqueName: \"kubernetes.io/projected/1add18f8-8147-4b68-ba76-f331c3e04734-kube-api-access-6zrzf\") pod \"swift-ring-rebalance-debug-7rl5h\" (UID: \"1add18f8-8147-4b68-ba76-f331c3e04734\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7rl5h" Jan 31 09:20:30 crc kubenswrapper[4732]: I0131 09:20:30.444624 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1add18f8-8147-4b68-ba76-f331c3e04734-swiftconf\") pod \"swift-ring-rebalance-debug-7rl5h\" (UID: \"1add18f8-8147-4b68-ba76-f331c3e04734\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7rl5h" Jan 31 09:20:30 crc kubenswrapper[4732]: I0131 09:20:30.444640 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1add18f8-8147-4b68-ba76-f331c3e04734-scripts\") pod \"swift-ring-rebalance-debug-7rl5h\" (UID: \"1add18f8-8147-4b68-ba76-f331c3e04734\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7rl5h" Jan 31 09:20:30 crc kubenswrapper[4732]: I0131 09:20:30.444687 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1add18f8-8147-4b68-ba76-f331c3e04734-etc-swift\") 
pod \"swift-ring-rebalance-debug-7rl5h\" (UID: \"1add18f8-8147-4b68-ba76-f331c3e04734\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7rl5h" Jan 31 09:20:30 crc kubenswrapper[4732]: I0131 09:20:30.445289 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1add18f8-8147-4b68-ba76-f331c3e04734-etc-swift\") pod \"swift-ring-rebalance-debug-7rl5h\" (UID: \"1add18f8-8147-4b68-ba76-f331c3e04734\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7rl5h" Jan 31 09:20:30 crc kubenswrapper[4732]: I0131 09:20:30.445344 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1add18f8-8147-4b68-ba76-f331c3e04734-scripts\") pod \"swift-ring-rebalance-debug-7rl5h\" (UID: \"1add18f8-8147-4b68-ba76-f331c3e04734\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7rl5h" Jan 31 09:20:30 crc kubenswrapper[4732]: I0131 09:20:30.445857 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1add18f8-8147-4b68-ba76-f331c3e04734-ring-data-devices\") pod \"swift-ring-rebalance-debug-7rl5h\" (UID: \"1add18f8-8147-4b68-ba76-f331c3e04734\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7rl5h" Jan 31 09:20:30 crc kubenswrapper[4732]: I0131 09:20:30.448298 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1add18f8-8147-4b68-ba76-f331c3e04734-dispersionconf\") pod \"swift-ring-rebalance-debug-7rl5h\" (UID: \"1add18f8-8147-4b68-ba76-f331c3e04734\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7rl5h" Jan 31 09:20:30 crc kubenswrapper[4732]: I0131 09:20:30.456194 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1add18f8-8147-4b68-ba76-f331c3e04734-swiftconf\") pod \"swift-ring-rebalance-debug-7rl5h\" (UID: \"1add18f8-8147-4b68-ba76-f331c3e04734\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7rl5h" Jan 31 09:20:30 crc kubenswrapper[4732]: I0131 09:20:30.460013 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zrzf\" (UniqueName: \"kubernetes.io/projected/1add18f8-8147-4b68-ba76-f331c3e04734-kube-api-access-6zrzf\") pod \"swift-ring-rebalance-debug-7rl5h\" (UID: \"1add18f8-8147-4b68-ba76-f331c3e04734\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7rl5h" Jan 31 09:20:30 crc kubenswrapper[4732]: I0131 09:20:30.554377 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="957b624e-aeb2-4942-bf7d-9ee57f9d8462" path="/var/lib/kubelet/pods/957b624e-aeb2-4942-bf7d-9ee57f9d8462/volumes" Jan 31 09:20:30 crc kubenswrapper[4732]: I0131 09:20:30.557765 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7rl5h" Jan 31 09:20:30 crc kubenswrapper[4732]: I0131 09:20:30.784756 4732 scope.go:117] "RemoveContainer" containerID="5094c199168505dd687c51c34f8737c8e1d1a70fa2bf148748212d9f385b755e" Jan 31 09:20:30 crc kubenswrapper[4732]: I0131 09:20:30.784812 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-95zn5" Jan 31 09:20:30 crc kubenswrapper[4732]: I0131 09:20:30.964965 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-7rl5h"] Jan 31 09:20:31 crc kubenswrapper[4732]: I0131 09:20:31.792980 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7rl5h" event={"ID":"1add18f8-8147-4b68-ba76-f331c3e04734","Type":"ContainerStarted","Data":"1afc72dc67226efea88d83cce600c96a720e1c283b758b093373dafe9a1d70f3"} Jan 31 09:20:31 crc kubenswrapper[4732]: I0131 09:20:31.793287 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7rl5h" event={"ID":"1add18f8-8147-4b68-ba76-f331c3e04734","Type":"ContainerStarted","Data":"c32eab1c06cf9ce26317b383c741c8bb9495ea22d9e847ac0336329acc318f8a"} Jan 31 09:20:31 crc kubenswrapper[4732]: I0131 09:20:31.811011 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7rl5h" podStartSLOduration=1.810993319 podStartE2EDuration="1.810993319s" podCreationTimestamp="2026-01-31 09:20:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:20:31.809225634 +0000 UTC m=+1170.115101838" watchObservedRunningTime="2026-01-31 09:20:31.810993319 +0000 UTC m=+1170.116869523" Jan 31 09:20:33 crc kubenswrapper[4732]: I0131 09:20:33.809645 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7rl5h" event={"ID":"1add18f8-8147-4b68-ba76-f331c3e04734","Type":"ContainerDied","Data":"1afc72dc67226efea88d83cce600c96a720e1c283b758b093373dafe9a1d70f3"} Jan 31 09:20:33 crc kubenswrapper[4732]: I0131 09:20:33.809654 4732 generic.go:334] "Generic (PLEG): container finished" podID="1add18f8-8147-4b68-ba76-f331c3e04734" containerID="1afc72dc67226efea88d83cce600c96a720e1c283b758b093373dafe9a1d70f3" exitCode=0 Jan 31 09:20:35 crc kubenswrapper[4732]: I0131 09:20:35.132121 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7rl5h" Jan 31 09:20:35 crc kubenswrapper[4732]: I0131 09:20:35.161592 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-7rl5h"] Jan 31 09:20:35 crc kubenswrapper[4732]: I0131 09:20:35.168340 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-7rl5h"] Jan 31 09:20:35 crc kubenswrapper[4732]: I0131 09:20:35.211213 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1add18f8-8147-4b68-ba76-f331c3e04734-ring-data-devices\") pod \"1add18f8-8147-4b68-ba76-f331c3e04734\" (UID: \"1add18f8-8147-4b68-ba76-f331c3e04734\") " Jan 31 09:20:35 crc kubenswrapper[4732]: I0131 09:20:35.211298 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1add18f8-8147-4b68-ba76-f331c3e04734-scripts\") pod \"1add18f8-8147-4b68-ba76-f331c3e04734\" (UID: \"1add18f8-8147-4b68-ba76-f331c3e04734\") " Jan 31 09:20:35 crc kubenswrapper[4732]: I0131 09:20:35.211327 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6zrzf\" (UniqueName: \"kubernetes.io/projected/1add18f8-8147-4b68-ba76-f331c3e04734-kube-api-access-6zrzf\") pod \"1add18f8-8147-4b68-ba76-f331c3e04734\" (UID: \"1add18f8-8147-4b68-ba76-f331c3e04734\") " Jan 31 09:20:35 crc kubenswrapper[4732]: I0131 09:20:35.211360 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1add18f8-8147-4b68-ba76-f331c3e04734-swiftconf\") pod \"1add18f8-8147-4b68-ba76-f331c3e04734\" (UID: \"1add18f8-8147-4b68-ba76-f331c3e04734\") " Jan 31 09:20:35 crc kubenswrapper[4732]: I0131 09:20:35.211395 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1add18f8-8147-4b68-ba76-f331c3e04734-etc-swift\") pod \"1add18f8-8147-4b68-ba76-f331c3e04734\" (UID: \"1add18f8-8147-4b68-ba76-f331c3e04734\") " Jan 31 09:20:35 crc kubenswrapper[4732]: I0131 09:20:35.211423 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1add18f8-8147-4b68-ba76-f331c3e04734-dispersionconf\") pod \"1add18f8-8147-4b68-ba76-f331c3e04734\" (UID: \"1add18f8-8147-4b68-ba76-f331c3e04734\") " Jan 31 09:20:35 crc kubenswrapper[4732]: I0131 09:20:35.212945 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1add18f8-8147-4b68-ba76-f331c3e04734-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "1add18f8-8147-4b68-ba76-f331c3e04734" (UID: "1add18f8-8147-4b68-ba76-f331c3e04734"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:20:35 crc kubenswrapper[4732]: I0131 09:20:35.213352 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1add18f8-8147-4b68-ba76-f331c3e04734-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "1add18f8-8147-4b68-ba76-f331c3e04734" (UID: "1add18f8-8147-4b68-ba76-f331c3e04734"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:20:35 crc kubenswrapper[4732]: I0131 09:20:35.217071 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1add18f8-8147-4b68-ba76-f331c3e04734-kube-api-access-6zrzf" (OuterVolumeSpecName: "kube-api-access-6zrzf") pod "1add18f8-8147-4b68-ba76-f331c3e04734" (UID: "1add18f8-8147-4b68-ba76-f331c3e04734"). InnerVolumeSpecName "kube-api-access-6zrzf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:20:35 crc kubenswrapper[4732]: I0131 09:20:35.229965 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1add18f8-8147-4b68-ba76-f331c3e04734-scripts" (OuterVolumeSpecName: "scripts") pod "1add18f8-8147-4b68-ba76-f331c3e04734" (UID: "1add18f8-8147-4b68-ba76-f331c3e04734"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:20:35 crc kubenswrapper[4732]: I0131 09:20:35.232858 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1add18f8-8147-4b68-ba76-f331c3e04734-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "1add18f8-8147-4b68-ba76-f331c3e04734" (UID: "1add18f8-8147-4b68-ba76-f331c3e04734"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:20:35 crc kubenswrapper[4732]: I0131 09:20:35.237927 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1add18f8-8147-4b68-ba76-f331c3e04734-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "1add18f8-8147-4b68-ba76-f331c3e04734" (UID: "1add18f8-8147-4b68-ba76-f331c3e04734"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:20:35 crc kubenswrapper[4732]: I0131 09:20:35.312730 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1add18f8-8147-4b68-ba76-f331c3e04734-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:20:35 crc kubenswrapper[4732]: I0131 09:20:35.312764 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6zrzf\" (UniqueName: \"kubernetes.io/projected/1add18f8-8147-4b68-ba76-f331c3e04734-kube-api-access-6zrzf\") on node \"crc\" DevicePath \"\"" Jan 31 09:20:35 crc kubenswrapper[4732]: I0131 09:20:35.312779 4732 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1add18f8-8147-4b68-ba76-f331c3e04734-swiftconf\") on node \"crc\" DevicePath \"\"" Jan 31 09:20:35 crc kubenswrapper[4732]: I0131 09:20:35.312793 4732 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1add18f8-8147-4b68-ba76-f331c3e04734-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 31 09:20:35 crc kubenswrapper[4732]: I0131 09:20:35.312804 4732 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1add18f8-8147-4b68-ba76-f331c3e04734-dispersionconf\") on node \"crc\" DevicePath \"\"" Jan 31 09:20:35 crc kubenswrapper[4732]: I0131 09:20:35.312817 4732 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1add18f8-8147-4b68-ba76-f331c3e04734-ring-data-devices\") on node \"crc\" DevicePath \"\"" Jan 31 09:20:35 crc kubenswrapper[4732]: I0131 09:20:35.506228 4732 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["swift-kuttl-tests/swift-ring-rebalance-debug-fcl52"] Jan 31 09:20:35 crc kubenswrapper[4732]: E0131 09:20:35.506732 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1add18f8-8147-4b68-ba76-f331c3e04734" containerName="swift-ring-rebalance" Jan 31 09:20:35 crc kubenswrapper[4732]: I0131 09:20:35.506748 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="1add18f8-8147-4b68-ba76-f331c3e04734" containerName="swift-ring-rebalance" Jan 31 09:20:35 crc kubenswrapper[4732]: I0131 09:20:35.506930 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="1add18f8-8147-4b68-ba76-f331c3e04734" containerName="swift-ring-rebalance" Jan 31 09:20:35 crc kubenswrapper[4732]: I0131 09:20:35.507459 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-fcl52" Jan 31 09:20:35 crc kubenswrapper[4732]: I0131 09:20:35.535276 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-fcl52"] Jan 31 09:20:35 crc kubenswrapper[4732]: I0131 09:20:35.618408 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwjhc\" (UniqueName: \"kubernetes.io/projected/f60c0633-b625-41ee-9547-276007d47773-kube-api-access-pwjhc\") pod \"swift-ring-rebalance-debug-fcl52\" (UID: \"f60c0633-b625-41ee-9547-276007d47773\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fcl52" Jan 31 09:20:35 crc kubenswrapper[4732]: I0131 09:20:35.618486 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f60c0633-b625-41ee-9547-276007d47773-scripts\") pod \"swift-ring-rebalance-debug-fcl52\" (UID: \"f60c0633-b625-41ee-9547-276007d47773\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fcl52" Jan 31 09:20:35 crc kubenswrapper[4732]: I0131 09:20:35.618621 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f60c0633-b625-41ee-9547-276007d47773-etc-swift\") pod \"swift-ring-rebalance-debug-fcl52\" (UID: \"f60c0633-b625-41ee-9547-276007d47773\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fcl52" Jan 31 09:20:35 crc kubenswrapper[4732]: I0131 09:20:35.618703 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f60c0633-b625-41ee-9547-276007d47773-swiftconf\") pod \"swift-ring-rebalance-debug-fcl52\" (UID: \"f60c0633-b625-41ee-9547-276007d47773\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fcl52" Jan 31 09:20:35 crc kubenswrapper[4732]: I0131 09:20:35.618732 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f60c0633-b625-41ee-9547-276007d47773-dispersionconf\") pod \"swift-ring-rebalance-debug-fcl52\" (UID: \"f60c0633-b625-41ee-9547-276007d47773\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fcl52" Jan 31 09:20:35 crc kubenswrapper[4732]: I0131 09:20:35.618776 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f60c0633-b625-41ee-9547-276007d47773-ring-data-devices\") pod \"swift-ring-rebalance-debug-fcl52\" (UID: \"f60c0633-b625-41ee-9547-276007d47773\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-fcl52" Jan 31 09:20:35 crc kubenswrapper[4732]: I0131 09:20:35.720109 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f60c0633-b625-41ee-9547-276007d47773-swiftconf\") pod \"swift-ring-rebalance-debug-fcl52\" (UID: \"f60c0633-b625-41ee-9547-276007d47773\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fcl52" Jan 31 09:20:35 crc kubenswrapper[4732]: I0131 09:20:35.720365 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f60c0633-b625-41ee-9547-276007d47773-dispersionconf\") pod \"swift-ring-rebalance-debug-fcl52\" (UID: \"f60c0633-b625-41ee-9547-276007d47773\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fcl52" Jan 31 09:20:35 crc kubenswrapper[4732]: I0131 09:20:35.720466 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f60c0633-b625-41ee-9547-276007d47773-ring-data-devices\") pod \"swift-ring-rebalance-debug-fcl52\" (UID: \"f60c0633-b625-41ee-9547-276007d47773\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fcl52" Jan 31 09:20:35 crc kubenswrapper[4732]: I0131 09:20:35.720614 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwjhc\" (UniqueName: \"kubernetes.io/projected/f60c0633-b625-41ee-9547-276007d47773-kube-api-access-pwjhc\") pod \"swift-ring-rebalance-debug-fcl52\" (UID: \"f60c0633-b625-41ee-9547-276007d47773\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fcl52" Jan 31 09:20:35 crc kubenswrapper[4732]: I0131 09:20:35.720740 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f60c0633-b625-41ee-9547-276007d47773-scripts\") pod \"swift-ring-rebalance-debug-fcl52\" (UID: \"f60c0633-b625-41ee-9547-276007d47773\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fcl52" Jan 31 09:20:35 crc kubenswrapper[4732]: I0131 09:20:35.720871 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f60c0633-b625-41ee-9547-276007d47773-etc-swift\") pod \"swift-ring-rebalance-debug-fcl52\" (UID: \"f60c0633-b625-41ee-9547-276007d47773\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fcl52" Jan 31 09:20:35 crc kubenswrapper[4732]: I0131 09:20:35.721307 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f60c0633-b625-41ee-9547-276007d47773-etc-swift\") pod \"swift-ring-rebalance-debug-fcl52\" (UID: \"f60c0633-b625-41ee-9547-276007d47773\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fcl52" Jan 31 09:20:35 crc kubenswrapper[4732]: I0131 09:20:35.721518 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f60c0633-b625-41ee-9547-276007d47773-ring-data-devices\") pod \"swift-ring-rebalance-debug-fcl52\" (UID: \"f60c0633-b625-41ee-9547-276007d47773\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fcl52" Jan 31 09:20:35 crc kubenswrapper[4732]: I0131 09:20:35.721524 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f60c0633-b625-41ee-9547-276007d47773-scripts\") pod \"swift-ring-rebalance-debug-fcl52\" (UID: 
\"f60c0633-b625-41ee-9547-276007d47773\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fcl52" Jan 31 09:20:35 crc kubenswrapper[4732]: I0131 09:20:35.724119 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f60c0633-b625-41ee-9547-276007d47773-swiftconf\") pod \"swift-ring-rebalance-debug-fcl52\" (UID: \"f60c0633-b625-41ee-9547-276007d47773\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fcl52" Jan 31 09:20:35 crc kubenswrapper[4732]: I0131 09:20:35.724641 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f60c0633-b625-41ee-9547-276007d47773-dispersionconf\") pod \"swift-ring-rebalance-debug-fcl52\" (UID: \"f60c0633-b625-41ee-9547-276007d47773\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fcl52" Jan 31 09:20:35 crc kubenswrapper[4732]: I0131 09:20:35.738739 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwjhc\" (UniqueName: \"kubernetes.io/projected/f60c0633-b625-41ee-9547-276007d47773-kube-api-access-pwjhc\") pod \"swift-ring-rebalance-debug-fcl52\" (UID: \"f60c0633-b625-41ee-9547-276007d47773\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fcl52" Jan 31 09:20:35 crc kubenswrapper[4732]: I0131 09:20:35.825265 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c32eab1c06cf9ce26317b383c741c8bb9495ea22d9e847ac0336329acc318f8a" Jan 31 09:20:35 crc kubenswrapper[4732]: I0131 09:20:35.825347 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7rl5h" Jan 31 09:20:35 crc kubenswrapper[4732]: I0131 09:20:35.841275 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-fcl52" Jan 31 09:20:36 crc kubenswrapper[4732]: I0131 09:20:36.045169 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-fcl52"] Jan 31 09:20:36 crc kubenswrapper[4732]: W0131 09:20:36.045713 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf60c0633_b625_41ee_9547_276007d47773.slice/crio-cd90a06411a178d46fc5b2088e3445ee49d4f2f23efb76c48e04889c786b81de WatchSource:0}: Error finding container cd90a06411a178d46fc5b2088e3445ee49d4f2f23efb76c48e04889c786b81de: Status 404 returned error can't find the container with id cd90a06411a178d46fc5b2088e3445ee49d4f2f23efb76c48e04889c786b81de Jan 31 09:20:36 crc kubenswrapper[4732]: I0131 09:20:36.561740 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1add18f8-8147-4b68-ba76-f331c3e04734" path="/var/lib/kubelet/pods/1add18f8-8147-4b68-ba76-f331c3e04734/volumes" Jan 31 09:20:36 crc kubenswrapper[4732]: I0131 09:20:36.835357 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-fcl52" event={"ID":"f60c0633-b625-41ee-9547-276007d47773","Type":"ContainerStarted","Data":"29c732c7d142dfca9f679c8ac3af3f06cbb08a8c2b215575b2a3b5e1d907c9bb"} Jan 31 09:20:36 crc kubenswrapper[4732]: I0131 09:20:36.835409 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-fcl52" event={"ID":"f60c0633-b625-41ee-9547-276007d47773","Type":"ContainerStarted","Data":"cd90a06411a178d46fc5b2088e3445ee49d4f2f23efb76c48e04889c786b81de"} Jan 31 09:20:36 crc kubenswrapper[4732]: I0131 09:20:36.865097 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-fcl52" podStartSLOduration=1.8650802579999999 podStartE2EDuration="1.865080258s" podCreationTimestamp="2026-01-31 09:20:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:20:36.861964743 +0000 UTC m=+1175.167840967" watchObservedRunningTime="2026-01-31 09:20:36.865080258 +0000 UTC m=+1175.170956462" Jan 31 09:20:37 crc kubenswrapper[4732]: I0131 09:20:37.844824 4732 generic.go:334] "Generic (PLEG): container finished" podID="f60c0633-b625-41ee-9547-276007d47773" containerID="29c732c7d142dfca9f679c8ac3af3f06cbb08a8c2b215575b2a3b5e1d907c9bb" exitCode=0 Jan 31 09:20:37 crc kubenswrapper[4732]: I0131 09:20:37.844960 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-fcl52" event={"ID":"f60c0633-b625-41ee-9547-276007d47773","Type":"ContainerDied","Data":"29c732c7d142dfca9f679c8ac3af3f06cbb08a8c2b215575b2a3b5e1d907c9bb"} Jan 31 09:20:39 crc kubenswrapper[4732]: I0131 09:20:39.239980 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-fcl52" Jan 31 09:20:39 crc kubenswrapper[4732]: I0131 09:20:39.278699 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f60c0633-b625-41ee-9547-276007d47773-etc-swift\") pod \"f60c0633-b625-41ee-9547-276007d47773\" (UID: \"f60c0633-b625-41ee-9547-276007d47773\") " Jan 31 09:20:39 crc kubenswrapper[4732]: I0131 09:20:39.278822 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f60c0633-b625-41ee-9547-276007d47773-swiftconf\") pod \"f60c0633-b625-41ee-9547-276007d47773\" (UID: \"f60c0633-b625-41ee-9547-276007d47773\") " Jan 31 09:20:39 crc kubenswrapper[4732]: I0131 09:20:39.278910 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f60c0633-b625-41ee-9547-276007d47773-dispersionconf\") pod \"f60c0633-b625-41ee-9547-276007d47773\" (UID: \"f60c0633-b625-41ee-9547-276007d47773\") " Jan 31 09:20:39 crc kubenswrapper[4732]: I0131 09:20:39.278947 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f60c0633-b625-41ee-9547-276007d47773-scripts\") pod \"f60c0633-b625-41ee-9547-276007d47773\" (UID: \"f60c0633-b625-41ee-9547-276007d47773\") " Jan 31 09:20:39 crc kubenswrapper[4732]: I0131 09:20:39.278968 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwjhc\" (UniqueName: \"kubernetes.io/projected/f60c0633-b625-41ee-9547-276007d47773-kube-api-access-pwjhc\") pod \"f60c0633-b625-41ee-9547-276007d47773\" (UID: \"f60c0633-b625-41ee-9547-276007d47773\") " Jan 31 09:20:39 crc kubenswrapper[4732]: I0131 09:20:39.279002 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f60c0633-b625-41ee-9547-276007d47773-ring-data-devices\") pod \"f60c0633-b625-41ee-9547-276007d47773\" (UID: \"f60c0633-b625-41ee-9547-276007d47773\") " Jan 31 09:20:39 crc kubenswrapper[4732]: I0131 09:20:39.279510 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f60c0633-b625-41ee-9547-276007d47773-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "f60c0633-b625-41ee-9547-276007d47773" (UID: "f60c0633-b625-41ee-9547-276007d47773"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:20:39 crc kubenswrapper[4732]: I0131 09:20:39.279749 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f60c0633-b625-41ee-9547-276007d47773-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "f60c0633-b625-41ee-9547-276007d47773" (UID: "f60c0633-b625-41ee-9547-276007d47773"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:20:39 crc kubenswrapper[4732]: I0131 09:20:39.300132 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-fcl52"] Jan 31 09:20:39 crc kubenswrapper[4732]: I0131 09:20:39.303862 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f60c0633-b625-41ee-9547-276007d47773-kube-api-access-pwjhc" (OuterVolumeSpecName: "kube-api-access-pwjhc") pod "f60c0633-b625-41ee-9547-276007d47773" (UID: "f60c0633-b625-41ee-9547-276007d47773"). InnerVolumeSpecName "kube-api-access-pwjhc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:20:39 crc kubenswrapper[4732]: I0131 09:20:39.305631 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f60c0633-b625-41ee-9547-276007d47773-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "f60c0633-b625-41ee-9547-276007d47773" (UID: "f60c0633-b625-41ee-9547-276007d47773"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:20:39 crc kubenswrapper[4732]: I0131 09:20:39.307834 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-fcl52"] Jan 31 09:20:39 crc kubenswrapper[4732]: I0131 09:20:39.318306 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f60c0633-b625-41ee-9547-276007d47773-scripts" (OuterVolumeSpecName: "scripts") pod "f60c0633-b625-41ee-9547-276007d47773" (UID: "f60c0633-b625-41ee-9547-276007d47773"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:20:39 crc kubenswrapper[4732]: I0131 09:20:39.326501 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f60c0633-b625-41ee-9547-276007d47773-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "f60c0633-b625-41ee-9547-276007d47773" (UID: "f60c0633-b625-41ee-9547-276007d47773"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:20:39 crc kubenswrapper[4732]: I0131 09:20:39.380603 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f60c0633-b625-41ee-9547-276007d47773-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:20:39 crc kubenswrapper[4732]: I0131 09:20:39.380635 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pwjhc\" (UniqueName: \"kubernetes.io/projected/f60c0633-b625-41ee-9547-276007d47773-kube-api-access-pwjhc\") on node \"crc\" DevicePath \"\"" Jan 31 09:20:39 crc kubenswrapper[4732]: I0131 09:20:39.380646 4732 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f60c0633-b625-41ee-9547-276007d47773-ring-data-devices\") on node \"crc\" DevicePath \"\"" Jan 31 09:20:39 crc kubenswrapper[4732]: I0131 09:20:39.380654 4732 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f60c0633-b625-41ee-9547-276007d47773-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 31 09:20:39 crc kubenswrapper[4732]: I0131 09:20:39.380680 4732 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f60c0633-b625-41ee-9547-276007d47773-swiftconf\") on node \"crc\" DevicePath \"\"" Jan 31 09:20:39 crc kubenswrapper[4732]: I0131 09:20:39.380689 4732 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f60c0633-b625-41ee-9547-276007d47773-dispersionconf\") on node \"crc\" DevicePath \"\"" Jan 31 09:20:39 crc kubenswrapper[4732]: I0131 09:20:39.865004 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cd90a06411a178d46fc5b2088e3445ee49d4f2f23efb76c48e04889c786b81de" Jan 31 09:20:39 crc kubenswrapper[4732]: I0131 09:20:39.865094 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-fcl52" Jan 31 09:20:40 crc kubenswrapper[4732]: I0131 09:20:40.447931 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-6f8g6"] Jan 31 09:20:40 crc kubenswrapper[4732]: E0131 09:20:40.448617 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f60c0633-b625-41ee-9547-276007d47773" containerName="swift-ring-rebalance" Jan 31 09:20:40 crc kubenswrapper[4732]: I0131 09:20:40.448634 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="f60c0633-b625-41ee-9547-276007d47773" containerName="swift-ring-rebalance" Jan 31 09:20:40 crc kubenswrapper[4732]: I0131 09:20:40.448833 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="f60c0633-b625-41ee-9547-276007d47773" containerName="swift-ring-rebalance" Jan 31 09:20:40 crc kubenswrapper[4732]: I0131 09:20:40.449424 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-6f8g6" Jan 31 09:20:40 crc kubenswrapper[4732]: I0131 09:20:40.451753 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Jan 31 09:20:40 crc kubenswrapper[4732]: I0131 09:20:40.454510 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Jan 31 09:20:40 crc kubenswrapper[4732]: I0131 09:20:40.470059 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-6f8g6"] Jan 31 09:20:40 crc kubenswrapper[4732]: I0131 09:20:40.496647 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c7f454bc-bfe4-4d0f-b300-0a3b2f12f623-etc-swift\") pod \"swift-ring-rebalance-debug-6f8g6\" (UID: \"c7f454bc-bfe4-4d0f-b300-0a3b2f12f623\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6f8g6" Jan 31 09:20:40 crc kubenswrapper[4732]: I0131 09:20:40.496730 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c7f454bc-bfe4-4d0f-b300-0a3b2f12f623-scripts\") pod \"swift-ring-rebalance-debug-6f8g6\" (UID: \"c7f454bc-bfe4-4d0f-b300-0a3b2f12f623\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6f8g6" Jan 31 09:20:40 crc kubenswrapper[4732]: I0131 09:20:40.496937 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c7f454bc-bfe4-4d0f-b300-0a3b2f12f623-ring-data-devices\") pod \"swift-ring-rebalance-debug-6f8g6\" (UID: \"c7f454bc-bfe4-4d0f-b300-0a3b2f12f623\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6f8g6" Jan 31 09:20:40 crc kubenswrapper[4732]: I0131 09:20:40.496994 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c7f454bc-bfe4-4d0f-b300-0a3b2f12f623-swiftconf\") pod \"swift-ring-rebalance-debug-6f8g6\" (UID: \"c7f454bc-bfe4-4d0f-b300-0a3b2f12f623\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6f8g6" Jan 31 09:20:40 crc kubenswrapper[4732]: I0131 09:20:40.497034 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gbpc\" (UniqueName: \"kubernetes.io/projected/c7f454bc-bfe4-4d0f-b300-0a3b2f12f623-kube-api-access-5gbpc\") pod \"swift-ring-rebalance-debug-6f8g6\" (UID: \"c7f454bc-bfe4-4d0f-b300-0a3b2f12f623\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6f8g6" Jan 31 09:20:40 crc kubenswrapper[4732]: I0131 09:20:40.497074 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c7f454bc-bfe4-4d0f-b300-0a3b2f12f623-dispersionconf\") pod \"swift-ring-rebalance-debug-6f8g6\" (UID: \"c7f454bc-bfe4-4d0f-b300-0a3b2f12f623\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6f8g6" Jan 31 09:20:40 crc kubenswrapper[4732]: I0131 09:20:40.554538 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f60c0633-b625-41ee-9547-276007d47773" path="/var/lib/kubelet/pods/f60c0633-b625-41ee-9547-276007d47773/volumes" Jan 31 09:20:40 crc kubenswrapper[4732]: I0131 09:20:40.598387 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/c7f454bc-bfe4-4d0f-b300-0a3b2f12f623-etc-swift\") pod \"swift-ring-rebalance-debug-6f8g6\" (UID: \"c7f454bc-bfe4-4d0f-b300-0a3b2f12f623\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6f8g6" Jan 31 09:20:40 crc kubenswrapper[4732]: I0131 09:20:40.598455 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c7f454bc-bfe4-4d0f-b300-0a3b2f12f623-scripts\") pod \"swift-ring-rebalance-debug-6f8g6\" (UID: \"c7f454bc-bfe4-4d0f-b300-0a3b2f12f623\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6f8g6" Jan 31 09:20:40 crc kubenswrapper[4732]: I0131 09:20:40.598528 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c7f454bc-bfe4-4d0f-b300-0a3b2f12f623-ring-data-devices\") pod \"swift-ring-rebalance-debug-6f8g6\" (UID: \"c7f454bc-bfe4-4d0f-b300-0a3b2f12f623\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6f8g6" Jan 31 09:20:40 crc kubenswrapper[4732]: I0131 09:20:40.598553 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c7f454bc-bfe4-4d0f-b300-0a3b2f12f623-swiftconf\") pod \"swift-ring-rebalance-debug-6f8g6\" (UID: \"c7f454bc-bfe4-4d0f-b300-0a3b2f12f623\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6f8g6" Jan 31 09:20:40 crc kubenswrapper[4732]: I0131 09:20:40.598588 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gbpc\" (UniqueName: \"kubernetes.io/projected/c7f454bc-bfe4-4d0f-b300-0a3b2f12f623-kube-api-access-5gbpc\") pod \"swift-ring-rebalance-debug-6f8g6\" (UID: \"c7f454bc-bfe4-4d0f-b300-0a3b2f12f623\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6f8g6" Jan 31 09:20:40 crc kubenswrapper[4732]: I0131 09:20:40.598621 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c7f454bc-bfe4-4d0f-b300-0a3b2f12f623-dispersionconf\") pod \"swift-ring-rebalance-debug-6f8g6\" (UID: \"c7f454bc-bfe4-4d0f-b300-0a3b2f12f623\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6f8g6" Jan 31 09:20:40 crc kubenswrapper[4732]: I0131 09:20:40.598936 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c7f454bc-bfe4-4d0f-b300-0a3b2f12f623-etc-swift\") pod \"swift-ring-rebalance-debug-6f8g6\" (UID: \"c7f454bc-bfe4-4d0f-b300-0a3b2f12f623\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6f8g6" Jan 31 09:20:40 crc kubenswrapper[4732]: I0131 09:20:40.599333 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c7f454bc-bfe4-4d0f-b300-0a3b2f12f623-ring-data-devices\") pod \"swift-ring-rebalance-debug-6f8g6\" (UID: \"c7f454bc-bfe4-4d0f-b300-0a3b2f12f623\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6f8g6" Jan 31 09:20:40 crc kubenswrapper[4732]: I0131 09:20:40.599391 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c7f454bc-bfe4-4d0f-b300-0a3b2f12f623-scripts\") pod \"swift-ring-rebalance-debug-6f8g6\" (UID: \"c7f454bc-bfe4-4d0f-b300-0a3b2f12f623\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6f8g6" Jan 31 09:20:40 crc kubenswrapper[4732]: I0131 09:20:40.602236 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c7f454bc-bfe4-4d0f-b300-0a3b2f12f623-dispersionconf\") pod \"swift-ring-rebalance-debug-6f8g6\" (UID: \"c7f454bc-bfe4-4d0f-b300-0a3b2f12f623\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6f8g6" Jan 31 09:20:40 crc kubenswrapper[4732]: I0131 09:20:40.603590 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c7f454bc-bfe4-4d0f-b300-0a3b2f12f623-swiftconf\") pod \"swift-ring-rebalance-debug-6f8g6\" (UID: \"c7f454bc-bfe4-4d0f-b300-0a3b2f12f623\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6f8g6" Jan 31 09:20:40 crc kubenswrapper[4732]: I0131 09:20:40.615991 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gbpc\" (UniqueName: \"kubernetes.io/projected/c7f454bc-bfe4-4d0f-b300-0a3b2f12f623-kube-api-access-5gbpc\") pod \"swift-ring-rebalance-debug-6f8g6\" (UID: \"c7f454bc-bfe4-4d0f-b300-0a3b2f12f623\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6f8g6" Jan 31 09:20:40 crc kubenswrapper[4732]: I0131 09:20:40.764415 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-6f8g6" Jan 31 09:20:41 crc kubenswrapper[4732]: I0131 09:20:41.106070 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-6f8g6"] Jan 31 09:20:41 crc kubenswrapper[4732]: W0131 09:20:41.109673 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc7f454bc_bfe4_4d0f_b300_0a3b2f12f623.slice/crio-2f3db159d081f9cd8fc05d03a5d964ef0ea033b67d2d74b2f94372a6f829841e WatchSource:0}: Error finding container 2f3db159d081f9cd8fc05d03a5d964ef0ea033b67d2d74b2f94372a6f829841e: Status 404 returned error can't find the container with id 2f3db159d081f9cd8fc05d03a5d964ef0ea033b67d2d74b2f94372a6f829841e Jan 31 09:20:41 crc kubenswrapper[4732]: I0131 09:20:41.930392 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-6f8g6" event={"ID":"c7f454bc-bfe4-4d0f-b300-0a3b2f12f623","Type":"ContainerStarted","Data":"b32e75e7ececb21d07a3a37d4719e75555ba099ce802819a06e6d312b37e5f13"} Jan 31 09:20:41 crc kubenswrapper[4732]: I0131 09:20:41.930809 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-6f8g6" event={"ID":"c7f454bc-bfe4-4d0f-b300-0a3b2f12f623","Type":"ContainerStarted","Data":"2f3db159d081f9cd8fc05d03a5d964ef0ea033b67d2d74b2f94372a6f829841e"} Jan 31 09:20:41 crc kubenswrapper[4732]: I0131 09:20:41.950390 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-6f8g6" podStartSLOduration=1.950367516 podStartE2EDuration="1.950367516s" podCreationTimestamp="2026-01-31 09:20:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:20:41.947471627 +0000 UTC m=+1180.253347831" watchObservedRunningTime="2026-01-31 09:20:41.950367516 +0000 UTC m=+1180.256243730" Jan 31 09:20:42 crc kubenswrapper[4732]: I0131 09:20:42.943247 4732 generic.go:334] "Generic (PLEG): container finished" podID="c7f454bc-bfe4-4d0f-b300-0a3b2f12f623" containerID="b32e75e7ececb21d07a3a37d4719e75555ba099ce802819a06e6d312b37e5f13" exitCode=0 Jan 31 09:20:42 crc kubenswrapper[4732]: I0131 09:20:42.943357 4732 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-6f8g6" event={"ID":"c7f454bc-bfe4-4d0f-b300-0a3b2f12f623","Type":"ContainerDied","Data":"b32e75e7ececb21d07a3a37d4719e75555ba099ce802819a06e6d312b37e5f13"} Jan 31 09:20:44 crc kubenswrapper[4732]: I0131 09:20:44.226102 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-6f8g6" Jan 31 09:20:44 crc kubenswrapper[4732]: I0131 09:20:44.264397 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c7f454bc-bfe4-4d0f-b300-0a3b2f12f623-etc-swift\") pod \"c7f454bc-bfe4-4d0f-b300-0a3b2f12f623\" (UID: \"c7f454bc-bfe4-4d0f-b300-0a3b2f12f623\") " Jan 31 09:20:44 crc kubenswrapper[4732]: I0131 09:20:44.264880 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c7f454bc-bfe4-4d0f-b300-0a3b2f12f623-swiftconf\") pod \"c7f454bc-bfe4-4d0f-b300-0a3b2f12f623\" (UID: \"c7f454bc-bfe4-4d0f-b300-0a3b2f12f623\") " Jan 31 09:20:44 crc kubenswrapper[4732]: I0131 09:20:44.264943 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c7f454bc-bfe4-4d0f-b300-0a3b2f12f623-ring-data-devices\") pod \"c7f454bc-bfe4-4d0f-b300-0a3b2f12f623\" (UID: \"c7f454bc-bfe4-4d0f-b300-0a3b2f12f623\") " Jan 31 09:20:44 crc kubenswrapper[4732]: I0131 09:20:44.265012 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c7f454bc-bfe4-4d0f-b300-0a3b2f12f623-scripts\") pod \"c7f454bc-bfe4-4d0f-b300-0a3b2f12f623\" (UID: \"c7f454bc-bfe4-4d0f-b300-0a3b2f12f623\") " Jan 31 09:20:44 crc kubenswrapper[4732]: I0131 09:20:44.265052 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c7f454bc-bfe4-4d0f-b300-0a3b2f12f623-dispersionconf\") pod \"c7f454bc-bfe4-4d0f-b300-0a3b2f12f623\" (UID: \"c7f454bc-bfe4-4d0f-b300-0a3b2f12f623\") " Jan 31 09:20:44 crc kubenswrapper[4732]: I0131 09:20:44.265140 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7f454bc-bfe4-4d0f-b300-0a3b2f12f623-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "c7f454bc-bfe4-4d0f-b300-0a3b2f12f623" (UID: "c7f454bc-bfe4-4d0f-b300-0a3b2f12f623"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:20:44 crc kubenswrapper[4732]: I0131 09:20:44.265243 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5gbpc\" (UniqueName: \"kubernetes.io/projected/c7f454bc-bfe4-4d0f-b300-0a3b2f12f623-kube-api-access-5gbpc\") pod \"c7f454bc-bfe4-4d0f-b300-0a3b2f12f623\" (UID: \"c7f454bc-bfe4-4d0f-b300-0a3b2f12f623\") " Jan 31 09:20:44 crc kubenswrapper[4732]: I0131 09:20:44.265400 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7f454bc-bfe4-4d0f-b300-0a3b2f12f623-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "c7f454bc-bfe4-4d0f-b300-0a3b2f12f623" (UID: "c7f454bc-bfe4-4d0f-b300-0a3b2f12f623"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:20:44 crc kubenswrapper[4732]: I0131 09:20:44.265686 4732 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c7f454bc-bfe4-4d0f-b300-0a3b2f12f623-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 31 09:20:44 crc kubenswrapper[4732]: I0131 09:20:44.265722 4732 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c7f454bc-bfe4-4d0f-b300-0a3b2f12f623-ring-data-devices\") on node \"crc\" DevicePath \"\"" Jan 31 09:20:44 crc kubenswrapper[4732]: I0131 09:20:44.269718 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-6f8g6"] Jan 31 09:20:44 crc kubenswrapper[4732]: I0131 09:20:44.274782 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-6f8g6"] Jan 31 09:20:44 crc kubenswrapper[4732]: I0131 09:20:44.291989 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7f454bc-bfe4-4d0f-b300-0a3b2f12f623-kube-api-access-5gbpc" (OuterVolumeSpecName: "kube-api-access-5gbpc") pod "c7f454bc-bfe4-4d0f-b300-0a3b2f12f623" (UID: "c7f454bc-bfe4-4d0f-b300-0a3b2f12f623"). InnerVolumeSpecName "kube-api-access-5gbpc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:20:44 crc kubenswrapper[4732]: I0131 09:20:44.295558 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7f454bc-bfe4-4d0f-b300-0a3b2f12f623-scripts" (OuterVolumeSpecName: "scripts") pod "c7f454bc-bfe4-4d0f-b300-0a3b2f12f623" (UID: "c7f454bc-bfe4-4d0f-b300-0a3b2f12f623"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:20:44 crc kubenswrapper[4732]: I0131 09:20:44.297501 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7f454bc-bfe4-4d0f-b300-0a3b2f12f623-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "c7f454bc-bfe4-4d0f-b300-0a3b2f12f623" (UID: "c7f454bc-bfe4-4d0f-b300-0a3b2f12f623"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:20:44 crc kubenswrapper[4732]: I0131 09:20:44.302283 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7f454bc-bfe4-4d0f-b300-0a3b2f12f623-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "c7f454bc-bfe4-4d0f-b300-0a3b2f12f623" (UID: "c7f454bc-bfe4-4d0f-b300-0a3b2f12f623"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:20:44 crc kubenswrapper[4732]: I0131 09:20:44.366958 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c7f454bc-bfe4-4d0f-b300-0a3b2f12f623-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:20:44 crc kubenswrapper[4732]: I0131 09:20:44.366999 4732 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c7f454bc-bfe4-4d0f-b300-0a3b2f12f623-dispersionconf\") on node \"crc\" DevicePath \"\"" Jan 31 09:20:44 crc kubenswrapper[4732]: I0131 09:20:44.367014 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5gbpc\" (UniqueName: \"kubernetes.io/projected/c7f454bc-bfe4-4d0f-b300-0a3b2f12f623-kube-api-access-5gbpc\") on node \"crc\" DevicePath \"\"" Jan 31 09:20:44 crc kubenswrapper[4732]: I0131 09:20:44.367029 4732 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c7f454bc-bfe4-4d0f-b300-0a3b2f12f623-swiftconf\") on node \"crc\" DevicePath \"\"" Jan 31 09:20:44 crc kubenswrapper[4732]: I0131 09:20:44.558489 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7f454bc-bfe4-4d0f-b300-0a3b2f12f623" path="/var/lib/kubelet/pods/c7f454bc-bfe4-4d0f-b300-0a3b2f12f623/volumes" Jan 31 09:20:44 crc kubenswrapper[4732]: I0131 09:20:44.961730 4732 scope.go:117] "RemoveContainer" containerID="b32e75e7ececb21d07a3a37d4719e75555ba099ce802819a06e6d312b37e5f13" Jan 31 09:20:44 crc kubenswrapper[4732]: I0131 09:20:44.961750 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-6f8g6" Jan 31 09:20:45 crc kubenswrapper[4732]: I0131 09:20:45.397281 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-xdhsh"] Jan 31 09:20:45 crc kubenswrapper[4732]: E0131 09:20:45.397644 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7f454bc-bfe4-4d0f-b300-0a3b2f12f623" containerName="swift-ring-rebalance" Jan 31 09:20:45 crc kubenswrapper[4732]: I0131 09:20:45.397673 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7f454bc-bfe4-4d0f-b300-0a3b2f12f623" containerName="swift-ring-rebalance" Jan 31 09:20:45 crc kubenswrapper[4732]: I0131 09:20:45.397891 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7f454bc-bfe4-4d0f-b300-0a3b2f12f623" containerName="swift-ring-rebalance" Jan 31 09:20:45 crc kubenswrapper[4732]: I0131 09:20:45.398484 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xdhsh" Jan 31 09:20:45 crc kubenswrapper[4732]: I0131 09:20:45.404050 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Jan 31 09:20:45 crc kubenswrapper[4732]: I0131 09:20:45.404066 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Jan 31 09:20:45 crc kubenswrapper[4732]: I0131 09:20:45.413084 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-xdhsh"] Jan 31 09:20:45 crc kubenswrapper[4732]: I0131 09:20:45.482110 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8fb6eab1-4604-4066-ab41-f102cf79889e-dispersionconf\") pod \"swift-ring-rebalance-debug-xdhsh\" (UID: \"8fb6eab1-4604-4066-ab41-f102cf79889e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xdhsh" Jan 31 09:20:45 crc kubenswrapper[4732]: I0131 09:20:45.482365 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8fb6eab1-4604-4066-ab41-f102cf79889e-scripts\") pod \"swift-ring-rebalance-debug-xdhsh\" (UID: \"8fb6eab1-4604-4066-ab41-f102cf79889e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xdhsh" Jan 31 09:20:45 crc kubenswrapper[4732]: I0131 09:20:45.482421 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8fb6eab1-4604-4066-ab41-f102cf79889e-ring-data-devices\") pod \"swift-ring-rebalance-debug-xdhsh\" (UID: \"8fb6eab1-4604-4066-ab41-f102cf79889e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xdhsh" Jan 31 09:20:45 crc kubenswrapper[4732]: I0131 09:20:45.482445 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8fb6eab1-4604-4066-ab41-f102cf79889e-etc-swift\") pod \"swift-ring-rebalance-debug-xdhsh\" (UID: \"8fb6eab1-4604-4066-ab41-f102cf79889e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xdhsh" Jan 31 09:20:45 crc kubenswrapper[4732]: I0131 09:20:45.482460 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mc4q\" (UniqueName: \"kubernetes.io/projected/8fb6eab1-4604-4066-ab41-f102cf79889e-kube-api-access-6mc4q\") pod \"swift-ring-rebalance-debug-xdhsh\" (UID: \"8fb6eab1-4604-4066-ab41-f102cf79889e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xdhsh" Jan 31 09:20:45 crc kubenswrapper[4732]: I0131 09:20:45.482522 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8fb6eab1-4604-4066-ab41-f102cf79889e-swiftconf\") pod \"swift-ring-rebalance-debug-xdhsh\" (UID: \"8fb6eab1-4604-4066-ab41-f102cf79889e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xdhsh" Jan 31 09:20:45 crc kubenswrapper[4732]: I0131 09:20:45.583978 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8fb6eab1-4604-4066-ab41-f102cf79889e-etc-swift\") pod \"swift-ring-rebalance-debug-xdhsh\" (UID: \"8fb6eab1-4604-4066-ab41-f102cf79889e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xdhsh" Jan 31 09:20:45 crc 
kubenswrapper[4732]: I0131 09:20:45.584015 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mc4q\" (UniqueName: \"kubernetes.io/projected/8fb6eab1-4604-4066-ab41-f102cf79889e-kube-api-access-6mc4q\") pod \"swift-ring-rebalance-debug-xdhsh\" (UID: \"8fb6eab1-4604-4066-ab41-f102cf79889e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xdhsh" Jan 31 09:20:45 crc kubenswrapper[4732]: I0131 09:20:45.584062 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8fb6eab1-4604-4066-ab41-f102cf79889e-swiftconf\") pod \"swift-ring-rebalance-debug-xdhsh\" (UID: \"8fb6eab1-4604-4066-ab41-f102cf79889e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xdhsh" Jan 31 09:20:45 crc kubenswrapper[4732]: I0131 09:20:45.584128 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8fb6eab1-4604-4066-ab41-f102cf79889e-dispersionconf\") pod \"swift-ring-rebalance-debug-xdhsh\" (UID: \"8fb6eab1-4604-4066-ab41-f102cf79889e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xdhsh" Jan 31 09:20:45 crc kubenswrapper[4732]: I0131 09:20:45.584166 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8fb6eab1-4604-4066-ab41-f102cf79889e-scripts\") pod \"swift-ring-rebalance-debug-xdhsh\" (UID: \"8fb6eab1-4604-4066-ab41-f102cf79889e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xdhsh" Jan 31 09:20:45 crc kubenswrapper[4732]: I0131 09:20:45.584217 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8fb6eab1-4604-4066-ab41-f102cf79889e-ring-data-devices\") pod \"swift-ring-rebalance-debug-xdhsh\" (UID: \"8fb6eab1-4604-4066-ab41-f102cf79889e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xdhsh" Jan 31 09:20:45 crc kubenswrapper[4732]: I0131 09:20:45.585099 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8fb6eab1-4604-4066-ab41-f102cf79889e-ring-data-devices\") pod \"swift-ring-rebalance-debug-xdhsh\" (UID: \"8fb6eab1-4604-4066-ab41-f102cf79889e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xdhsh" Jan 31 09:20:45 crc kubenswrapper[4732]: I0131 09:20:45.585117 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8fb6eab1-4604-4066-ab41-f102cf79889e-scripts\") pod \"swift-ring-rebalance-debug-xdhsh\" (UID: \"8fb6eab1-4604-4066-ab41-f102cf79889e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xdhsh" Jan 31 09:20:45 crc kubenswrapper[4732]: I0131 09:20:45.585649 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8fb6eab1-4604-4066-ab41-f102cf79889e-etc-swift\") pod \"swift-ring-rebalance-debug-xdhsh\" (UID: \"8fb6eab1-4604-4066-ab41-f102cf79889e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xdhsh" Jan 31 09:20:45 crc kubenswrapper[4732]: I0131 09:20:45.588790 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8fb6eab1-4604-4066-ab41-f102cf79889e-swiftconf\") pod \"swift-ring-rebalance-debug-xdhsh\" (UID: \"8fb6eab1-4604-4066-ab41-f102cf79889e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xdhsh" 
Jan 31 09:20:45 crc kubenswrapper[4732]: I0131 09:20:45.589325 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8fb6eab1-4604-4066-ab41-f102cf79889e-dispersionconf\") pod \"swift-ring-rebalance-debug-xdhsh\" (UID: \"8fb6eab1-4604-4066-ab41-f102cf79889e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xdhsh" Jan 31 09:20:45 crc kubenswrapper[4732]: I0131 09:20:45.601830 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mc4q\" (UniqueName: \"kubernetes.io/projected/8fb6eab1-4604-4066-ab41-f102cf79889e-kube-api-access-6mc4q\") pod \"swift-ring-rebalance-debug-xdhsh\" (UID: \"8fb6eab1-4604-4066-ab41-f102cf79889e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xdhsh" Jan 31 09:20:45 crc kubenswrapper[4732]: I0131 09:20:45.719444 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xdhsh" Jan 31 09:20:46 crc kubenswrapper[4732]: I0131 09:20:46.134453 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-xdhsh"] Jan 31 09:20:46 crc kubenswrapper[4732]: W0131 09:20:46.148369 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8fb6eab1_4604_4066_ab41_f102cf79889e.slice/crio-8bbe4227d09313bee386c292dc466bffc18322a4585e9a2d48b5996659f4326c WatchSource:0}: Error finding container 8bbe4227d09313bee386c292dc466bffc18322a4585e9a2d48b5996659f4326c: Status 404 returned error can't find the container with id 8bbe4227d09313bee386c292dc466bffc18322a4585e9a2d48b5996659f4326c Jan 31 09:20:46 crc kubenswrapper[4732]: I0131 09:20:46.986654 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xdhsh" event={"ID":"8fb6eab1-4604-4066-ab41-f102cf79889e","Type":"ContainerStarted","Data":"5e80e6297ed4cb6407583fc9a3cefb0d406579b8e762ae1e494844cf4e75d919"} Jan 31 09:20:46 crc kubenswrapper[4732]: I0131 09:20:46.987189 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xdhsh" event={"ID":"8fb6eab1-4604-4066-ab41-f102cf79889e","Type":"ContainerStarted","Data":"8bbe4227d09313bee386c292dc466bffc18322a4585e9a2d48b5996659f4326c"} Jan 31 09:20:47 crc kubenswrapper[4732]: I0131 09:20:47.012699 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xdhsh" podStartSLOduration=2.012644021 podStartE2EDuration="2.012644021s" podCreationTimestamp="2026-01-31 09:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:20:47.01034006 +0000 UTC m=+1185.316216264" watchObservedRunningTime="2026-01-31 09:20:47.012644021 +0000 UTC m=+1185.318520225" Jan 31 09:20:48 crc kubenswrapper[4732]: I0131 09:20:47.999174 4732 generic.go:334] "Generic (PLEG): container finished" podID="8fb6eab1-4604-4066-ab41-f102cf79889e" containerID="5e80e6297ed4cb6407583fc9a3cefb0d406579b8e762ae1e494844cf4e75d919" exitCode=0 Jan 31 09:20:48 crc kubenswrapper[4732]: I0131 09:20:47.999426 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xdhsh" event={"ID":"8fb6eab1-4604-4066-ab41-f102cf79889e","Type":"ContainerDied","Data":"5e80e6297ed4cb6407583fc9a3cefb0d406579b8e762ae1e494844cf4e75d919"} Jan 31 09:20:49 crc 
kubenswrapper[4732]: I0131 09:20:49.424877 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xdhsh" Jan 31 09:20:49 crc kubenswrapper[4732]: I0131 09:20:49.458510 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-xdhsh"] Jan 31 09:20:49 crc kubenswrapper[4732]: I0131 09:20:49.482462 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-xdhsh"] Jan 31 09:20:49 crc kubenswrapper[4732]: I0131 09:20:49.552515 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mc4q\" (UniqueName: \"kubernetes.io/projected/8fb6eab1-4604-4066-ab41-f102cf79889e-kube-api-access-6mc4q\") pod \"8fb6eab1-4604-4066-ab41-f102cf79889e\" (UID: \"8fb6eab1-4604-4066-ab41-f102cf79889e\") " Jan 31 09:20:49 crc kubenswrapper[4732]: I0131 09:20:49.552595 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8fb6eab1-4604-4066-ab41-f102cf79889e-swiftconf\") pod \"8fb6eab1-4604-4066-ab41-f102cf79889e\" (UID: \"8fb6eab1-4604-4066-ab41-f102cf79889e\") " Jan 31 09:20:49 crc kubenswrapper[4732]: I0131 09:20:49.552685 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8fb6eab1-4604-4066-ab41-f102cf79889e-ring-data-devices\") pod \"8fb6eab1-4604-4066-ab41-f102cf79889e\" (UID: \"8fb6eab1-4604-4066-ab41-f102cf79889e\") " Jan 31 09:20:49 crc kubenswrapper[4732]: I0131 09:20:49.552706 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8fb6eab1-4604-4066-ab41-f102cf79889e-dispersionconf\") pod \"8fb6eab1-4604-4066-ab41-f102cf79889e\" (UID: \"8fb6eab1-4604-4066-ab41-f102cf79889e\") " Jan 31 09:20:49 crc kubenswrapper[4732]: I0131 09:20:49.552745 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8fb6eab1-4604-4066-ab41-f102cf79889e-scripts\") pod \"8fb6eab1-4604-4066-ab41-f102cf79889e\" (UID: \"8fb6eab1-4604-4066-ab41-f102cf79889e\") " Jan 31 09:20:49 crc kubenswrapper[4732]: I0131 09:20:49.552792 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8fb6eab1-4604-4066-ab41-f102cf79889e-etc-swift\") pod \"8fb6eab1-4604-4066-ab41-f102cf79889e\" (UID: \"8fb6eab1-4604-4066-ab41-f102cf79889e\") " Jan 31 09:20:49 crc kubenswrapper[4732]: I0131 09:20:49.553531 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8fb6eab1-4604-4066-ab41-f102cf79889e-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "8fb6eab1-4604-4066-ab41-f102cf79889e" (UID: "8fb6eab1-4604-4066-ab41-f102cf79889e"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:20:49 crc kubenswrapper[4732]: I0131 09:20:49.553722 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fb6eab1-4604-4066-ab41-f102cf79889e-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "8fb6eab1-4604-4066-ab41-f102cf79889e" (UID: "8fb6eab1-4604-4066-ab41-f102cf79889e"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:20:49 crc kubenswrapper[4732]: I0131 09:20:49.574648 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fb6eab1-4604-4066-ab41-f102cf79889e-scripts" (OuterVolumeSpecName: "scripts") pod "8fb6eab1-4604-4066-ab41-f102cf79889e" (UID: "8fb6eab1-4604-4066-ab41-f102cf79889e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:20:49 crc kubenswrapper[4732]: I0131 09:20:49.574939 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fb6eab1-4604-4066-ab41-f102cf79889e-kube-api-access-6mc4q" (OuterVolumeSpecName: "kube-api-access-6mc4q") pod "8fb6eab1-4604-4066-ab41-f102cf79889e" (UID: "8fb6eab1-4604-4066-ab41-f102cf79889e"). InnerVolumeSpecName "kube-api-access-6mc4q". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:20:49 crc kubenswrapper[4732]: I0131 09:20:49.600972 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fb6eab1-4604-4066-ab41-f102cf79889e-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "8fb6eab1-4604-4066-ab41-f102cf79889e" (UID: "8fb6eab1-4604-4066-ab41-f102cf79889e"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:20:49 crc kubenswrapper[4732]: I0131 09:20:49.607786 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fb6eab1-4604-4066-ab41-f102cf79889e-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "8fb6eab1-4604-4066-ab41-f102cf79889e" (UID: "8fb6eab1-4604-4066-ab41-f102cf79889e"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:20:49 crc kubenswrapper[4732]: I0131 09:20:49.654349 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8fb6eab1-4604-4066-ab41-f102cf79889e-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:20:49 crc kubenswrapper[4732]: I0131 09:20:49.654398 4732 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8fb6eab1-4604-4066-ab41-f102cf79889e-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 31 09:20:49 crc kubenswrapper[4732]: I0131 09:20:49.654414 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6mc4q\" (UniqueName: \"kubernetes.io/projected/8fb6eab1-4604-4066-ab41-f102cf79889e-kube-api-access-6mc4q\") on node \"crc\" DevicePath \"\"" Jan 31 09:20:49 crc kubenswrapper[4732]: I0131 09:20:49.654429 4732 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8fb6eab1-4604-4066-ab41-f102cf79889e-swiftconf\") on node \"crc\" DevicePath \"\"" Jan 31 09:20:49 crc kubenswrapper[4732]: I0131 09:20:49.654440 4732 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8fb6eab1-4604-4066-ab41-f102cf79889e-ring-data-devices\") on node \"crc\" DevicePath \"\"" Jan 31 09:20:49 crc kubenswrapper[4732]: I0131 09:20:49.654452 4732 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8fb6eab1-4604-4066-ab41-f102cf79889e-dispersionconf\") on node \"crc\" DevicePath \"\"" Jan 31 09:20:50 crc kubenswrapper[4732]: I0131 09:20:50.023815 4732 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="8bbe4227d09313bee386c292dc466bffc18322a4585e9a2d48b5996659f4326c" Jan 31 09:20:50 crc kubenswrapper[4732]: I0131 09:20:50.024213 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xdhsh" Jan 31 09:20:50 crc kubenswrapper[4732]: I0131 09:20:50.558050 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fb6eab1-4604-4066-ab41-f102cf79889e" path="/var/lib/kubelet/pods/8fb6eab1-4604-4066-ab41-f102cf79889e/volumes" Jan 31 09:20:50 crc kubenswrapper[4732]: I0131 09:20:50.600475 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-hwbrk"] Jan 31 09:20:50 crc kubenswrapper[4732]: E0131 09:20:50.600854 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fb6eab1-4604-4066-ab41-f102cf79889e" containerName="swift-ring-rebalance" Jan 31 09:20:50 crc kubenswrapper[4732]: I0131 09:20:50.600878 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fb6eab1-4604-4066-ab41-f102cf79889e" containerName="swift-ring-rebalance" Jan 31 09:20:50 crc kubenswrapper[4732]: I0131 09:20:50.601046 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fb6eab1-4604-4066-ab41-f102cf79889e" containerName="swift-ring-rebalance" Jan 31 09:20:50 crc kubenswrapper[4732]: I0131 09:20:50.601606 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hwbrk" Jan 31 09:20:50 crc kubenswrapper[4732]: I0131 09:20:50.603942 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Jan 31 09:20:50 crc kubenswrapper[4732]: I0131 09:20:50.604283 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Jan 31 09:20:50 crc kubenswrapper[4732]: I0131 09:20:50.620511 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-hwbrk"] Jan 31 09:20:50 crc kubenswrapper[4732]: I0131 09:20:50.669703 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzddp\" (UniqueName: \"kubernetes.io/projected/fa92affd-0106-4b01-b96c-8f2b0459ee3a-kube-api-access-nzddp\") pod \"swift-ring-rebalance-debug-hwbrk\" (UID: \"fa92affd-0106-4b01-b96c-8f2b0459ee3a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hwbrk" Jan 31 09:20:50 crc kubenswrapper[4732]: I0131 09:20:50.669831 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/fa92affd-0106-4b01-b96c-8f2b0459ee3a-etc-swift\") pod \"swift-ring-rebalance-debug-hwbrk\" (UID: \"fa92affd-0106-4b01-b96c-8f2b0459ee3a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hwbrk" Jan 31 09:20:50 crc kubenswrapper[4732]: I0131 09:20:50.669914 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/fa92affd-0106-4b01-b96c-8f2b0459ee3a-ring-data-devices\") pod \"swift-ring-rebalance-debug-hwbrk\" (UID: \"fa92affd-0106-4b01-b96c-8f2b0459ee3a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hwbrk" Jan 31 09:20:50 crc kubenswrapper[4732]: I0131 09:20:50.669943 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/fa92affd-0106-4b01-b96c-8f2b0459ee3a-swiftconf\") pod \"swift-ring-rebalance-debug-hwbrk\" (UID: \"fa92affd-0106-4b01-b96c-8f2b0459ee3a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hwbrk" Jan 31 09:20:50 crc kubenswrapper[4732]: I0131 09:20:50.669996 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fa92affd-0106-4b01-b96c-8f2b0459ee3a-scripts\") pod \"swift-ring-rebalance-debug-hwbrk\" (UID: \"fa92affd-0106-4b01-b96c-8f2b0459ee3a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hwbrk" Jan 31 09:20:50 crc kubenswrapper[4732]: I0131 09:20:50.670028 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/fa92affd-0106-4b01-b96c-8f2b0459ee3a-dispersionconf\") pod \"swift-ring-rebalance-debug-hwbrk\" (UID: \"fa92affd-0106-4b01-b96c-8f2b0459ee3a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hwbrk" Jan 31 09:20:50 crc kubenswrapper[4732]: I0131 09:20:50.771867 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/fa92affd-0106-4b01-b96c-8f2b0459ee3a-swiftconf\") pod \"swift-ring-rebalance-debug-hwbrk\" (UID: \"fa92affd-0106-4b01-b96c-8f2b0459ee3a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hwbrk" Jan 31 09:20:50 crc kubenswrapper[4732]: I0131 09:20:50.772187 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fa92affd-0106-4b01-b96c-8f2b0459ee3a-scripts\") pod \"swift-ring-rebalance-debug-hwbrk\" (UID: \"fa92affd-0106-4b01-b96c-8f2b0459ee3a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hwbrk" Jan 31 09:20:50 crc kubenswrapper[4732]: I0131 09:20:50.772215 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/fa92affd-0106-4b01-b96c-8f2b0459ee3a-dispersionconf\") pod \"swift-ring-rebalance-debug-hwbrk\" (UID: \"fa92affd-0106-4b01-b96c-8f2b0459ee3a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hwbrk" Jan 31 09:20:50 crc kubenswrapper[4732]: I0131 09:20:50.772266 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzddp\" (UniqueName: \"kubernetes.io/projected/fa92affd-0106-4b01-b96c-8f2b0459ee3a-kube-api-access-nzddp\") pod \"swift-ring-rebalance-debug-hwbrk\" (UID: \"fa92affd-0106-4b01-b96c-8f2b0459ee3a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hwbrk" Jan 31 09:20:50 crc kubenswrapper[4732]: I0131 09:20:50.772316 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/fa92affd-0106-4b01-b96c-8f2b0459ee3a-etc-swift\") pod \"swift-ring-rebalance-debug-hwbrk\" (UID: \"fa92affd-0106-4b01-b96c-8f2b0459ee3a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hwbrk" Jan 31 09:20:50 crc kubenswrapper[4732]: I0131 09:20:50.772345 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/fa92affd-0106-4b01-b96c-8f2b0459ee3a-ring-data-devices\") pod \"swift-ring-rebalance-debug-hwbrk\" (UID: \"fa92affd-0106-4b01-b96c-8f2b0459ee3a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hwbrk" Jan 31 09:20:50 crc kubenswrapper[4732]: I0131 09:20:50.772853 4732 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/fa92affd-0106-4b01-b96c-8f2b0459ee3a-etc-swift\") pod \"swift-ring-rebalance-debug-hwbrk\" (UID: \"fa92affd-0106-4b01-b96c-8f2b0459ee3a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hwbrk" Jan 31 09:20:50 crc kubenswrapper[4732]: I0131 09:20:50.773192 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/fa92affd-0106-4b01-b96c-8f2b0459ee3a-ring-data-devices\") pod \"swift-ring-rebalance-debug-hwbrk\" (UID: \"fa92affd-0106-4b01-b96c-8f2b0459ee3a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hwbrk" Jan 31 09:20:50 crc kubenswrapper[4732]: I0131 09:20:50.773438 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fa92affd-0106-4b01-b96c-8f2b0459ee3a-scripts\") pod \"swift-ring-rebalance-debug-hwbrk\" (UID: \"fa92affd-0106-4b01-b96c-8f2b0459ee3a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hwbrk" Jan 31 09:20:50 crc kubenswrapper[4732]: I0131 09:20:50.779888 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/fa92affd-0106-4b01-b96c-8f2b0459ee3a-dispersionconf\") pod \"swift-ring-rebalance-debug-hwbrk\" (UID: \"fa92affd-0106-4b01-b96c-8f2b0459ee3a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hwbrk" Jan 31 09:20:50 crc kubenswrapper[4732]: I0131 09:20:50.784533 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/fa92affd-0106-4b01-b96c-8f2b0459ee3a-swiftconf\") pod \"swift-ring-rebalance-debug-hwbrk\" (UID: \"fa92affd-0106-4b01-b96c-8f2b0459ee3a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hwbrk" Jan 31 09:20:50 crc kubenswrapper[4732]: I0131 09:20:50.794078 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzddp\" (UniqueName: \"kubernetes.io/projected/fa92affd-0106-4b01-b96c-8f2b0459ee3a-kube-api-access-nzddp\") pod \"swift-ring-rebalance-debug-hwbrk\" (UID: \"fa92affd-0106-4b01-b96c-8f2b0459ee3a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hwbrk" Jan 31 09:20:50 crc kubenswrapper[4732]: I0131 09:20:50.931837 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hwbrk" Jan 31 09:20:51 crc kubenswrapper[4732]: I0131 09:20:51.382473 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-hwbrk"] Jan 31 09:20:51 crc kubenswrapper[4732]: W0131 09:20:51.396195 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfa92affd_0106_4b01_b96c_8f2b0459ee3a.slice/crio-bb4c230a2e8424b4f24a0318146a3797047e6098acdddebc6bb41b881f5f6f2c WatchSource:0}: Error finding container bb4c230a2e8424b4f24a0318146a3797047e6098acdddebc6bb41b881f5f6f2c: Status 404 returned error can't find the container with id bb4c230a2e8424b4f24a0318146a3797047e6098acdddebc6bb41b881f5f6f2c Jan 31 09:20:52 crc kubenswrapper[4732]: I0131 09:20:52.040273 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hwbrk" event={"ID":"fa92affd-0106-4b01-b96c-8f2b0459ee3a","Type":"ContainerStarted","Data":"63aa4ab687f026724734c9f9bad3c34ed0dcd405198734688082bd1c838517d3"} Jan 31 09:20:52 crc kubenswrapper[4732]: I0131 09:20:52.040827 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hwbrk" event={"ID":"fa92affd-0106-4b01-b96c-8f2b0459ee3a","Type":"ContainerStarted","Data":"bb4c230a2e8424b4f24a0318146a3797047e6098acdddebc6bb41b881f5f6f2c"} Jan 31 09:20:52 crc kubenswrapper[4732]: I0131 09:20:52.058467 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hwbrk" podStartSLOduration=2.058450885 podStartE2EDuration="2.058450885s" podCreationTimestamp="2026-01-31 09:20:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:20:52.055956728 +0000 UTC m=+1190.361832942" watchObservedRunningTime="2026-01-31 09:20:52.058450885 +0000 UTC m=+1190.364327089" Jan 31 09:20:53 crc kubenswrapper[4732]: I0131 09:20:53.048344 4732 generic.go:334] "Generic (PLEG): container finished" podID="fa92affd-0106-4b01-b96c-8f2b0459ee3a" containerID="63aa4ab687f026724734c9f9bad3c34ed0dcd405198734688082bd1c838517d3" exitCode=0 Jan 31 09:20:53 crc kubenswrapper[4732]: I0131 09:20:53.048381 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hwbrk" event={"ID":"fa92affd-0106-4b01-b96c-8f2b0459ee3a","Type":"ContainerDied","Data":"63aa4ab687f026724734c9f9bad3c34ed0dcd405198734688082bd1c838517d3"} Jan 31 09:20:54 crc kubenswrapper[4732]: I0131 09:20:54.423435 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hwbrk" Jan 31 09:20:54 crc kubenswrapper[4732]: I0131 09:20:54.458834 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-hwbrk"] Jan 31 09:20:54 crc kubenswrapper[4732]: I0131 09:20:54.469296 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-hwbrk"] Jan 31 09:20:54 crc kubenswrapper[4732]: I0131 09:20:54.534394 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/fa92affd-0106-4b01-b96c-8f2b0459ee3a-swiftconf\") pod \"fa92affd-0106-4b01-b96c-8f2b0459ee3a\" (UID: \"fa92affd-0106-4b01-b96c-8f2b0459ee3a\") " Jan 31 09:20:54 crc kubenswrapper[4732]: I0131 09:20:54.534481 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzddp\" (UniqueName: \"kubernetes.io/projected/fa92affd-0106-4b01-b96c-8f2b0459ee3a-kube-api-access-nzddp\") pod \"fa92affd-0106-4b01-b96c-8f2b0459ee3a\" (UID: \"fa92affd-0106-4b01-b96c-8f2b0459ee3a\") " Jan 31 09:20:54 crc kubenswrapper[4732]: I0131 09:20:54.534519 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/fa92affd-0106-4b01-b96c-8f2b0459ee3a-ring-data-devices\") pod \"fa92affd-0106-4b01-b96c-8f2b0459ee3a\" (UID: \"fa92affd-0106-4b01-b96c-8f2b0459ee3a\") " Jan 31 09:20:54 crc kubenswrapper[4732]: I0131 09:20:54.534622 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/fa92affd-0106-4b01-b96c-8f2b0459ee3a-etc-swift\") pod \"fa92affd-0106-4b01-b96c-8f2b0459ee3a\" (UID: \"fa92affd-0106-4b01-b96c-8f2b0459ee3a\") " Jan 31 09:20:54 crc kubenswrapper[4732]: I0131 09:20:54.534753 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fa92affd-0106-4b01-b96c-8f2b0459ee3a-scripts\") pod \"fa92affd-0106-4b01-b96c-8f2b0459ee3a\" (UID: \"fa92affd-0106-4b01-b96c-8f2b0459ee3a\") " Jan 31 09:20:54 crc kubenswrapper[4732]: I0131 09:20:54.534876 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/fa92affd-0106-4b01-b96c-8f2b0459ee3a-dispersionconf\") pod \"fa92affd-0106-4b01-b96c-8f2b0459ee3a\" (UID: \"fa92affd-0106-4b01-b96c-8f2b0459ee3a\") " Jan 31 09:20:54 crc kubenswrapper[4732]: I0131 09:20:54.537135 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa92affd-0106-4b01-b96c-8f2b0459ee3a-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "fa92affd-0106-4b01-b96c-8f2b0459ee3a" (UID: "fa92affd-0106-4b01-b96c-8f2b0459ee3a"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:20:54 crc kubenswrapper[4732]: I0131 09:20:54.537692 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa92affd-0106-4b01-b96c-8f2b0459ee3a-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "fa92affd-0106-4b01-b96c-8f2b0459ee3a" (UID: "fa92affd-0106-4b01-b96c-8f2b0459ee3a"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:20:54 crc kubenswrapper[4732]: I0131 09:20:54.541445 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa92affd-0106-4b01-b96c-8f2b0459ee3a-kube-api-access-nzddp" (OuterVolumeSpecName: "kube-api-access-nzddp") pod "fa92affd-0106-4b01-b96c-8f2b0459ee3a" (UID: "fa92affd-0106-4b01-b96c-8f2b0459ee3a"). InnerVolumeSpecName "kube-api-access-nzddp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:20:54 crc kubenswrapper[4732]: I0131 09:20:54.562004 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa92affd-0106-4b01-b96c-8f2b0459ee3a-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "fa92affd-0106-4b01-b96c-8f2b0459ee3a" (UID: "fa92affd-0106-4b01-b96c-8f2b0459ee3a"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:20:54 crc kubenswrapper[4732]: I0131 09:20:54.576391 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa92affd-0106-4b01-b96c-8f2b0459ee3a-scripts" (OuterVolumeSpecName: "scripts") pod "fa92affd-0106-4b01-b96c-8f2b0459ee3a" (UID: "fa92affd-0106-4b01-b96c-8f2b0459ee3a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:20:54 crc kubenswrapper[4732]: I0131 09:20:54.603850 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa92affd-0106-4b01-b96c-8f2b0459ee3a-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "fa92affd-0106-4b01-b96c-8f2b0459ee3a" (UID: "fa92affd-0106-4b01-b96c-8f2b0459ee3a"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:20:54 crc kubenswrapper[4732]: I0131 09:20:54.639018 4732 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/fa92affd-0106-4b01-b96c-8f2b0459ee3a-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 31 09:20:54 crc kubenswrapper[4732]: I0131 09:20:54.639057 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fa92affd-0106-4b01-b96c-8f2b0459ee3a-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:20:54 crc kubenswrapper[4732]: I0131 09:20:54.639072 4732 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/fa92affd-0106-4b01-b96c-8f2b0459ee3a-dispersionconf\") on node \"crc\" DevicePath \"\"" Jan 31 09:20:54 crc kubenswrapper[4732]: I0131 09:20:54.639085 4732 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/fa92affd-0106-4b01-b96c-8f2b0459ee3a-swiftconf\") on node \"crc\" DevicePath \"\"" Jan 31 09:20:54 crc kubenswrapper[4732]: I0131 09:20:54.639099 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzddp\" (UniqueName: \"kubernetes.io/projected/fa92affd-0106-4b01-b96c-8f2b0459ee3a-kube-api-access-nzddp\") on node \"crc\" DevicePath \"\"" Jan 31 09:20:54 crc kubenswrapper[4732]: I0131 09:20:54.639131 4732 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/fa92affd-0106-4b01-b96c-8f2b0459ee3a-ring-data-devices\") on node \"crc\" DevicePath \"\"" Jan 31 09:20:54 crc kubenswrapper[4732]: I0131 09:20:54.642205 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-storage-0"] Jan 31 
09:20:54 crc kubenswrapper[4732]: I0131 09:20:54.642250 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-storage-1"] Jan 31 09:20:54 crc kubenswrapper[4732]: I0131 09:20:54.642263 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-7q9kp"] Jan 31 09:20:54 crc kubenswrapper[4732]: I0131 09:20:54.642634 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-1" podUID="eb04e24b-fc92-4f2e-abcb-fa46706f699a" containerName="account-server" containerID="cri-o://a00d79844fd0811e549149a48676d54976a695ed7a1497b1125e383f358fdc3c" gracePeriod=30 Jan 31 09:20:54 crc kubenswrapper[4732]: I0131 09:20:54.643053 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="ea3117d7-0038-4ca5-bee5-ae76db9a12eb" containerName="account-server" containerID="cri-o://d31d074c83b389c8db51d7fd48db465e4e9e513ed6084f0c730eaa69444cb6c5" gracePeriod=30 Jan 31 09:20:54 crc kubenswrapper[4732]: I0131 09:20:54.643336 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-1" podUID="eb04e24b-fc92-4f2e-abcb-fa46706f699a" containerName="swift-recon-cron" containerID="cri-o://4591c4a657bf0f19e528acc7cb5586ad0a5dffe6072302f778c6b3daec4b890e" gracePeriod=30 Jan 31 09:20:54 crc kubenswrapper[4732]: I0131 09:20:54.643382 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-1" podUID="eb04e24b-fc92-4f2e-abcb-fa46706f699a" containerName="rsync" containerID="cri-o://9bd8439c812d4db4031eaf201411d9f03f68a7e4fb620cd284c1316ed57ea8e9" gracePeriod=30 Jan 31 09:20:54 crc kubenswrapper[4732]: I0131 09:20:54.643420 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-1" podUID="eb04e24b-fc92-4f2e-abcb-fa46706f699a" containerName="object-expirer" containerID="cri-o://f7740f3734b2a6d8e15826537feeb63febadea03ccc4bc0d6f70700f6a75626f" gracePeriod=30 Jan 31 09:20:54 crc kubenswrapper[4732]: I0131 09:20:54.643450 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-1" podUID="eb04e24b-fc92-4f2e-abcb-fa46706f699a" containerName="object-updater" containerID="cri-o://22026d373e04df27564ec7eb67c62bdb36d5f926aefc6470f05958ca3bfafb6d" gracePeriod=30 Jan 31 09:20:54 crc kubenswrapper[4732]: I0131 09:20:54.643482 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-1" podUID="eb04e24b-fc92-4f2e-abcb-fa46706f699a" containerName="object-auditor" containerID="cri-o://36c3a43f98d1be3fa4067ddd849e1f8adb09f6830fb90e1e103b4fee901b5ddb" gracePeriod=30 Jan 31 09:20:54 crc kubenswrapper[4732]: I0131 09:20:54.643510 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-1" podUID="eb04e24b-fc92-4f2e-abcb-fa46706f699a" containerName="object-replicator" containerID="cri-o://3bedc7b7578ebe9948019c2fe597f44f31339a72681e217b76908ee8888b902f" gracePeriod=30 Jan 31 09:20:54 crc kubenswrapper[4732]: I0131 09:20:54.643557 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-1" podUID="eb04e24b-fc92-4f2e-abcb-fa46706f699a" containerName="object-server" containerID="cri-o://61eb8b513a113fbabe0d9a8fc9c197f8afac97222ddfaaa712ed87f070443521" gracePeriod=30 Jan 31 09:20:54 crc 
kubenswrapper[4732]: I0131 09:20:54.643590 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-1" podUID="eb04e24b-fc92-4f2e-abcb-fa46706f699a" containerName="container-updater" containerID="cri-o://c9a37bdf8b63464e6bd6756605e3b1a2408e5416d00cda661d4d274b38bd7be7" gracePeriod=30 Jan 31 09:20:54 crc kubenswrapper[4732]: I0131 09:20:54.643622 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-1" podUID="eb04e24b-fc92-4f2e-abcb-fa46706f699a" containerName="container-auditor" containerID="cri-o://bde9eaf63bb5ab51e11bacddaba56989bae7d6031fa19e2c1b77f92866dadea8" gracePeriod=30 Jan 31 09:20:54 crc kubenswrapper[4732]: I0131 09:20:54.643654 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-1" podUID="eb04e24b-fc92-4f2e-abcb-fa46706f699a" containerName="container-replicator" containerID="cri-o://c9736add9e20e2e6d821daa6edc69baaef51fa7d36762e678ee4128cf4df9dbd" gracePeriod=30 Jan 31 09:20:54 crc kubenswrapper[4732]: I0131 09:20:54.643704 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-1" podUID="eb04e24b-fc92-4f2e-abcb-fa46706f699a" containerName="container-server" containerID="cri-o://498871caf245c1aa003224c568a9512c564fb67d15571cad0c2fa636e92058ab" gracePeriod=30 Jan 31 09:20:54 crc kubenswrapper[4732]: I0131 09:20:54.643734 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-1" podUID="eb04e24b-fc92-4f2e-abcb-fa46706f699a" containerName="account-reaper" containerID="cri-o://caafb470bfca795c3c7f1107e8a9e97700d33c459a35140d551bfc62e596d5a6" gracePeriod=30 Jan 31 09:20:54 crc kubenswrapper[4732]: I0131 09:20:54.643762 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-1" podUID="eb04e24b-fc92-4f2e-abcb-fa46706f699a" containerName="account-auditor" containerID="cri-o://647e5cb59a1fa496ac972ece6ee175c218e028f20f65b8bd02f6b5c3d914e53c" gracePeriod=30 Jan 31 09:20:54 crc kubenswrapper[4732]: I0131 09:20:54.643796 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-1" podUID="eb04e24b-fc92-4f2e-abcb-fa46706f699a" containerName="account-replicator" containerID="cri-o://1a43f64c141cb300a8ec9eb5a477bb5c1933446a32abc193cb5173f92ba21178" gracePeriod=30 Jan 31 09:20:54 crc kubenswrapper[4732]: I0131 09:20:54.644287 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="ea3117d7-0038-4ca5-bee5-ae76db9a12eb" containerName="swift-recon-cron" containerID="cri-o://186ff48385c2286a698108d59672234436dfc7dbb5bac5e777070affa544b217" gracePeriod=30 Jan 31 09:20:54 crc kubenswrapper[4732]: I0131 09:20:54.644343 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="ea3117d7-0038-4ca5-bee5-ae76db9a12eb" containerName="rsync" containerID="cri-o://f7bb7853880d63c47b634d75defb148b4ed41cf77de6d39f02020df2ff5b03d3" gracePeriod=30 Jan 31 09:20:54 crc kubenswrapper[4732]: I0131 09:20:54.644382 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="ea3117d7-0038-4ca5-bee5-ae76db9a12eb" containerName="object-expirer" containerID="cri-o://b50b39b84c6087e4699184efc06a407c325204a64a826bce844fec4c5164858e" 
gracePeriod=30 Jan 31 09:20:54 crc kubenswrapper[4732]: I0131 09:20:54.644427 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="ea3117d7-0038-4ca5-bee5-ae76db9a12eb" containerName="object-updater" containerID="cri-o://3a8ebcfcb039bb39617fcee0e053ba202cf3bccf0891372d90f2647abbe5c1ea" gracePeriod=30 Jan 31 09:20:54 crc kubenswrapper[4732]: I0131 09:20:54.644462 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="ea3117d7-0038-4ca5-bee5-ae76db9a12eb" containerName="object-auditor" containerID="cri-o://420022f2e68e290b989b3165de84484bcc24a80cd43dfb94fa6e26433aff9a55" gracePeriod=30 Jan 31 09:20:54 crc kubenswrapper[4732]: I0131 09:20:54.644495 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="ea3117d7-0038-4ca5-bee5-ae76db9a12eb" containerName="object-replicator" containerID="cri-o://fd00da0aa47cdd0410c40f7add08c30b0950cfafc21a202b2619d006f368871b" gracePeriod=30 Jan 31 09:20:54 crc kubenswrapper[4732]: I0131 09:20:54.644527 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="ea3117d7-0038-4ca5-bee5-ae76db9a12eb" containerName="object-server" containerID="cri-o://78f74f08eb824105edcd5f1f9501b3fa915219d39b763fb391f67d7c1c054292" gracePeriod=30 Jan 31 09:20:54 crc kubenswrapper[4732]: I0131 09:20:54.644561 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="ea3117d7-0038-4ca5-bee5-ae76db9a12eb" containerName="container-updater" containerID="cri-o://655a856811f2b949bc2c94495ba00a853818330f4ec09b9aeffd1751288aff79" gracePeriod=30 Jan 31 09:20:54 crc kubenswrapper[4732]: I0131 09:20:54.644598 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="ea3117d7-0038-4ca5-bee5-ae76db9a12eb" containerName="container-auditor" containerID="cri-o://61d667d399369be98c22e8ccc9f3b4cb18a2a2b489b34f036b2fc88c0e71e429" gracePeriod=30 Jan 31 09:20:54 crc kubenswrapper[4732]: I0131 09:20:54.644632 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="ea3117d7-0038-4ca5-bee5-ae76db9a12eb" containerName="container-replicator" containerID="cri-o://039961fef86b21e927d5ebdf1e2eb67c58775d812f1269411f3f0411c895a43f" gracePeriod=30 Jan 31 09:20:54 crc kubenswrapper[4732]: I0131 09:20:54.644741 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="ea3117d7-0038-4ca5-bee5-ae76db9a12eb" containerName="container-server" containerID="cri-o://ada6f8ad69317a784118495c96beb36d701ce7b36209a86b9a936d7cc110e3c2" gracePeriod=30 Jan 31 09:20:54 crc kubenswrapper[4732]: I0131 09:20:54.644993 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="ea3117d7-0038-4ca5-bee5-ae76db9a12eb" containerName="account-reaper" containerID="cri-o://8d252efa39cfce425e89f4670019c9177c3b0c93daee21298b31748b98d341f0" gracePeriod=30 Jan 31 09:20:54 crc kubenswrapper[4732]: I0131 09:20:54.645071 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="ea3117d7-0038-4ca5-bee5-ae76db9a12eb" containerName="account-auditor" 
containerID="cri-o://6e7852a660c0fcf3225ca272b8e2af7fc11735fb24d2d14460cde6438a7beb2d" gracePeriod=30 Jan 31 09:20:54 crc kubenswrapper[4732]: I0131 09:20:54.645109 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="ea3117d7-0038-4ca5-bee5-ae76db9a12eb" containerName="account-replicator" containerID="cri-o://8f37816afb654d1e776d0d4eec1d440d5af5c0af6bb7163bf1822d00c7129008" gracePeriod=30 Jan 31 09:20:54 crc kubenswrapper[4732]: I0131 09:20:54.646875 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-storage-2"] Jan 31 09:20:54 crc kubenswrapper[4732]: I0131 09:20:54.647241 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-2" podUID="18b68f5e-a1b4-4f52-9a4e-5967735ec105" containerName="account-server" containerID="cri-o://caac1f6da96d02d4cab51830f080cd652323af90c1836570aef955dde0207cff" gracePeriod=30 Jan 31 09:20:54 crc kubenswrapper[4732]: I0131 09:20:54.647302 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-2" podUID="18b68f5e-a1b4-4f52-9a4e-5967735ec105" containerName="swift-recon-cron" containerID="cri-o://261d59a3c1b5662c1245d9e0a26d6fdf5a756126f782cf23384b915262209828" gracePeriod=30 Jan 31 09:20:54 crc kubenswrapper[4732]: I0131 09:20:54.647333 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-2" podUID="18b68f5e-a1b4-4f52-9a4e-5967735ec105" containerName="rsync" containerID="cri-o://01d34a0e2eb5b105a83439a771538234555a443b0c13ee4cb087d83ae6e5a172" gracePeriod=30 Jan 31 09:20:54 crc kubenswrapper[4732]: I0131 09:20:54.647362 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-2" podUID="18b68f5e-a1b4-4f52-9a4e-5967735ec105" containerName="object-expirer" containerID="cri-o://caaaa31e0459d9ecc2e8474a50fcbdb26784a11af19cecd7546e523bc30b1bb1" gracePeriod=30 Jan 31 09:20:54 crc kubenswrapper[4732]: I0131 09:20:54.647392 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-2" podUID="18b68f5e-a1b4-4f52-9a4e-5967735ec105" containerName="object-updater" containerID="cri-o://3c487459ef32558f2157dbf1c508505bc4dd091b7590c38ba37b19a62a7f80b1" gracePeriod=30 Jan 31 09:20:54 crc kubenswrapper[4732]: I0131 09:20:54.647423 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-2" podUID="18b68f5e-a1b4-4f52-9a4e-5967735ec105" containerName="object-auditor" containerID="cri-o://a8e090887d88b45d44e7f952ad72621d08fce059d8e6698d9ad0a38d3082acc9" gracePeriod=30 Jan 31 09:20:54 crc kubenswrapper[4732]: I0131 09:20:54.647453 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-2" podUID="18b68f5e-a1b4-4f52-9a4e-5967735ec105" containerName="object-replicator" containerID="cri-o://3b885dbb98c02c11a228e346c436a702bd2867b63763347fde6a9f9094c1d016" gracePeriod=30 Jan 31 09:20:54 crc kubenswrapper[4732]: I0131 09:20:54.647482 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-2" podUID="18b68f5e-a1b4-4f52-9a4e-5967735ec105" containerName="object-server" containerID="cri-o://463912ed98b787656c483ddcf4b73e963647c6d5df261cfe3fad78758d6d1b9d" gracePeriod=30 Jan 31 09:20:54 crc kubenswrapper[4732]: I0131 09:20:54.647510 4732 
kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-2" podUID="18b68f5e-a1b4-4f52-9a4e-5967735ec105" containerName="container-updater" containerID="cri-o://7ce880a4ac662620fb475dd18995992d32b8be418497d54beba8a09c1335c1aa" gracePeriod=30 Jan 31 09:20:54 crc kubenswrapper[4732]: I0131 09:20:54.647539 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-2" podUID="18b68f5e-a1b4-4f52-9a4e-5967735ec105" containerName="container-auditor" containerID="cri-o://54fada92647ba12febbb920093f9bc8e7464da78204240a5a00d06baf380ab69" gracePeriod=30 Jan 31 09:20:54 crc kubenswrapper[4732]: I0131 09:20:54.647568 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-2" podUID="18b68f5e-a1b4-4f52-9a4e-5967735ec105" containerName="container-replicator" containerID="cri-o://fd1347f500390657be8ce2ee2c537ab8800d5072a87c360ae87e57f1fb6a1882" gracePeriod=30 Jan 31 09:20:54 crc kubenswrapper[4732]: I0131 09:20:54.647598 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-2" podUID="18b68f5e-a1b4-4f52-9a4e-5967735ec105" containerName="container-server" containerID="cri-o://672f5feadfa214c6a0a943e614b3f6be2205caf0cfad789bc48eecd5126064c9" gracePeriod=30 Jan 31 09:20:54 crc kubenswrapper[4732]: I0131 09:20:54.647627 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-2" podUID="18b68f5e-a1b4-4f52-9a4e-5967735ec105" containerName="account-reaper" containerID="cri-o://8fa21222e4cd9c26bf7a8699721788dc5f3bcbad179ebfca9e1ff5dd4f7eca9f" gracePeriod=30 Jan 31 09:20:54 crc kubenswrapper[4732]: I0131 09:20:54.647673 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-2" podUID="18b68f5e-a1b4-4f52-9a4e-5967735ec105" containerName="account-auditor" containerID="cri-o://72926ae802ba11d13ab77c79539bfcc94151936d207f632232dd7d1f41349d6d" gracePeriod=30 Jan 31 09:20:54 crc kubenswrapper[4732]: I0131 09:20:54.647705 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-2" podUID="18b68f5e-a1b4-4f52-9a4e-5967735ec105" containerName="account-replicator" containerID="cri-o://fa17ab717c5754facbf3da31cb4afd3477236bd426c65c9122642f27b5886fb9" gracePeriod=30 Jan 31 09:20:54 crc kubenswrapper[4732]: I0131 09:20:54.651985 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-7q9kp"] Jan 31 09:20:54 crc kubenswrapper[4732]: I0131 09:20:54.671946 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-proxy-7d8cf99555-lvwdq"] Jan 31 09:20:54 crc kubenswrapper[4732]: I0131 09:20:54.672165 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-proxy-7d8cf99555-lvwdq" podUID="8cb5e63b-882d-4388-abb1-130923832c9f" containerName="proxy-httpd" containerID="cri-o://783febaf4bf86ebe7b225017d95af46ce165aac17683180fbdfbb6b8367cdec5" gracePeriod=30 Jan 31 09:20:54 crc kubenswrapper[4732]: I0131 09:20:54.672306 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-proxy-7d8cf99555-lvwdq" podUID="8cb5e63b-882d-4388-abb1-130923832c9f" containerName="proxy-server" containerID="cri-o://4835492a2361cd167a0d1bc3cdf3c2b8f82043e40e4867b9666ee8ab6a6362e6" gracePeriod=30 Jan 31 09:20:55 crc 
kubenswrapper[4732]: I0131 09:20:55.070321 4732 generic.go:334] "Generic (PLEG): container finished" podID="eb04e24b-fc92-4f2e-abcb-fa46706f699a" containerID="f7740f3734b2a6d8e15826537feeb63febadea03ccc4bc0d6f70700f6a75626f" exitCode=0 Jan 31 09:20:55 crc kubenswrapper[4732]: I0131 09:20:55.070356 4732 generic.go:334] "Generic (PLEG): container finished" podID="eb04e24b-fc92-4f2e-abcb-fa46706f699a" containerID="22026d373e04df27564ec7eb67c62bdb36d5f926aefc6470f05958ca3bfafb6d" exitCode=0 Jan 31 09:20:55 crc kubenswrapper[4732]: I0131 09:20:55.070366 4732 generic.go:334] "Generic (PLEG): container finished" podID="eb04e24b-fc92-4f2e-abcb-fa46706f699a" containerID="36c3a43f98d1be3fa4067ddd849e1f8adb09f6830fb90e1e103b4fee901b5ddb" exitCode=0 Jan 31 09:20:55 crc kubenswrapper[4732]: I0131 09:20:55.070374 4732 generic.go:334] "Generic (PLEG): container finished" podID="eb04e24b-fc92-4f2e-abcb-fa46706f699a" containerID="3bedc7b7578ebe9948019c2fe597f44f31339a72681e217b76908ee8888b902f" exitCode=0 Jan 31 09:20:55 crc kubenswrapper[4732]: I0131 09:20:55.070380 4732 generic.go:334] "Generic (PLEG): container finished" podID="eb04e24b-fc92-4f2e-abcb-fa46706f699a" containerID="c9a37bdf8b63464e6bd6756605e3b1a2408e5416d00cda661d4d274b38bd7be7" exitCode=0 Jan 31 09:20:55 crc kubenswrapper[4732]: I0131 09:20:55.070387 4732 generic.go:334] "Generic (PLEG): container finished" podID="eb04e24b-fc92-4f2e-abcb-fa46706f699a" containerID="bde9eaf63bb5ab51e11bacddaba56989bae7d6031fa19e2c1b77f92866dadea8" exitCode=0 Jan 31 09:20:55 crc kubenswrapper[4732]: I0131 09:20:55.070393 4732 generic.go:334] "Generic (PLEG): container finished" podID="eb04e24b-fc92-4f2e-abcb-fa46706f699a" containerID="c9736add9e20e2e6d821daa6edc69baaef51fa7d36762e678ee4128cf4df9dbd" exitCode=0 Jan 31 09:20:55 crc kubenswrapper[4732]: I0131 09:20:55.070400 4732 generic.go:334] "Generic (PLEG): container finished" podID="eb04e24b-fc92-4f2e-abcb-fa46706f699a" containerID="caafb470bfca795c3c7f1107e8a9e97700d33c459a35140d551bfc62e596d5a6" exitCode=0 Jan 31 09:20:55 crc kubenswrapper[4732]: I0131 09:20:55.070407 4732 generic.go:334] "Generic (PLEG): container finished" podID="eb04e24b-fc92-4f2e-abcb-fa46706f699a" containerID="647e5cb59a1fa496ac972ece6ee175c218e028f20f65b8bd02f6b5c3d914e53c" exitCode=0 Jan 31 09:20:55 crc kubenswrapper[4732]: I0131 09:20:55.070413 4732 generic.go:334] "Generic (PLEG): container finished" podID="eb04e24b-fc92-4f2e-abcb-fa46706f699a" containerID="1a43f64c141cb300a8ec9eb5a477bb5c1933446a32abc193cb5173f92ba21178" exitCode=0 Jan 31 09:20:55 crc kubenswrapper[4732]: I0131 09:20:55.070421 4732 generic.go:334] "Generic (PLEG): container finished" podID="eb04e24b-fc92-4f2e-abcb-fa46706f699a" containerID="a00d79844fd0811e549149a48676d54976a695ed7a1497b1125e383f358fdc3c" exitCode=0 Jan 31 09:20:55 crc kubenswrapper[4732]: I0131 09:20:55.070395 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"eb04e24b-fc92-4f2e-abcb-fa46706f699a","Type":"ContainerDied","Data":"f7740f3734b2a6d8e15826537feeb63febadea03ccc4bc0d6f70700f6a75626f"} Jan 31 09:20:55 crc kubenswrapper[4732]: I0131 09:20:55.070480 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"eb04e24b-fc92-4f2e-abcb-fa46706f699a","Type":"ContainerDied","Data":"22026d373e04df27564ec7eb67c62bdb36d5f926aefc6470f05958ca3bfafb6d"} Jan 31 09:20:55 crc kubenswrapper[4732]: I0131 09:20:55.070494 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="swift-kuttl-tests/swift-storage-1" event={"ID":"eb04e24b-fc92-4f2e-abcb-fa46706f699a","Type":"ContainerDied","Data":"36c3a43f98d1be3fa4067ddd849e1f8adb09f6830fb90e1e103b4fee901b5ddb"} Jan 31 09:20:55 crc kubenswrapper[4732]: I0131 09:20:55.070505 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"eb04e24b-fc92-4f2e-abcb-fa46706f699a","Type":"ContainerDied","Data":"3bedc7b7578ebe9948019c2fe597f44f31339a72681e217b76908ee8888b902f"} Jan 31 09:20:55 crc kubenswrapper[4732]: I0131 09:20:55.070514 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"eb04e24b-fc92-4f2e-abcb-fa46706f699a","Type":"ContainerDied","Data":"c9a37bdf8b63464e6bd6756605e3b1a2408e5416d00cda661d4d274b38bd7be7"} Jan 31 09:20:55 crc kubenswrapper[4732]: I0131 09:20:55.070524 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"eb04e24b-fc92-4f2e-abcb-fa46706f699a","Type":"ContainerDied","Data":"bde9eaf63bb5ab51e11bacddaba56989bae7d6031fa19e2c1b77f92866dadea8"} Jan 31 09:20:55 crc kubenswrapper[4732]: I0131 09:20:55.070533 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"eb04e24b-fc92-4f2e-abcb-fa46706f699a","Type":"ContainerDied","Data":"c9736add9e20e2e6d821daa6edc69baaef51fa7d36762e678ee4128cf4df9dbd"} Jan 31 09:20:55 crc kubenswrapper[4732]: I0131 09:20:55.070541 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"eb04e24b-fc92-4f2e-abcb-fa46706f699a","Type":"ContainerDied","Data":"caafb470bfca795c3c7f1107e8a9e97700d33c459a35140d551bfc62e596d5a6"} Jan 31 09:20:55 crc kubenswrapper[4732]: I0131 09:20:55.070550 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"eb04e24b-fc92-4f2e-abcb-fa46706f699a","Type":"ContainerDied","Data":"647e5cb59a1fa496ac972ece6ee175c218e028f20f65b8bd02f6b5c3d914e53c"} Jan 31 09:20:55 crc kubenswrapper[4732]: I0131 09:20:55.070559 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"eb04e24b-fc92-4f2e-abcb-fa46706f699a","Type":"ContainerDied","Data":"1a43f64c141cb300a8ec9eb5a477bb5c1933446a32abc193cb5173f92ba21178"} Jan 31 09:20:55 crc kubenswrapper[4732]: I0131 09:20:55.070567 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"eb04e24b-fc92-4f2e-abcb-fa46706f699a","Type":"ContainerDied","Data":"a00d79844fd0811e549149a48676d54976a695ed7a1497b1125e383f358fdc3c"} Jan 31 09:20:55 crc kubenswrapper[4732]: I0131 09:20:55.075868 4732 generic.go:334] "Generic (PLEG): container finished" podID="18b68f5e-a1b4-4f52-9a4e-5967735ec105" containerID="caaaa31e0459d9ecc2e8474a50fcbdb26784a11af19cecd7546e523bc30b1bb1" exitCode=0 Jan 31 09:20:55 crc kubenswrapper[4732]: I0131 09:20:55.075909 4732 generic.go:334] "Generic (PLEG): container finished" podID="18b68f5e-a1b4-4f52-9a4e-5967735ec105" containerID="3c487459ef32558f2157dbf1c508505bc4dd091b7590c38ba37b19a62a7f80b1" exitCode=0 Jan 31 09:20:55 crc kubenswrapper[4732]: I0131 09:20:55.075921 4732 generic.go:334] "Generic (PLEG): container finished" podID="18b68f5e-a1b4-4f52-9a4e-5967735ec105" containerID="a8e090887d88b45d44e7f952ad72621d08fce059d8e6698d9ad0a38d3082acc9" exitCode=0 Jan 31 09:20:55 crc kubenswrapper[4732]: I0131 09:20:55.075932 4732 generic.go:334] "Generic (PLEG): container finished" 
podID="18b68f5e-a1b4-4f52-9a4e-5967735ec105" containerID="3b885dbb98c02c11a228e346c436a702bd2867b63763347fde6a9f9094c1d016" exitCode=0 Jan 31 09:20:55 crc kubenswrapper[4732]: I0131 09:20:55.075941 4732 generic.go:334] "Generic (PLEG): container finished" podID="18b68f5e-a1b4-4f52-9a4e-5967735ec105" containerID="7ce880a4ac662620fb475dd18995992d32b8be418497d54beba8a09c1335c1aa" exitCode=0 Jan 31 09:20:55 crc kubenswrapper[4732]: I0131 09:20:55.075930 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"18b68f5e-a1b4-4f52-9a4e-5967735ec105","Type":"ContainerDied","Data":"caaaa31e0459d9ecc2e8474a50fcbdb26784a11af19cecd7546e523bc30b1bb1"} Jan 31 09:20:55 crc kubenswrapper[4732]: I0131 09:20:55.075986 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"18b68f5e-a1b4-4f52-9a4e-5967735ec105","Type":"ContainerDied","Data":"3c487459ef32558f2157dbf1c508505bc4dd091b7590c38ba37b19a62a7f80b1"} Jan 31 09:20:55 crc kubenswrapper[4732]: I0131 09:20:55.076000 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"18b68f5e-a1b4-4f52-9a4e-5967735ec105","Type":"ContainerDied","Data":"a8e090887d88b45d44e7f952ad72621d08fce059d8e6698d9ad0a38d3082acc9"} Jan 31 09:20:55 crc kubenswrapper[4732]: I0131 09:20:55.076012 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"18b68f5e-a1b4-4f52-9a4e-5967735ec105","Type":"ContainerDied","Data":"3b885dbb98c02c11a228e346c436a702bd2867b63763347fde6a9f9094c1d016"} Jan 31 09:20:55 crc kubenswrapper[4732]: I0131 09:20:55.076026 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"18b68f5e-a1b4-4f52-9a4e-5967735ec105","Type":"ContainerDied","Data":"7ce880a4ac662620fb475dd18995992d32b8be418497d54beba8a09c1335c1aa"} Jan 31 09:20:55 crc kubenswrapper[4732]: I0131 09:20:55.075950 4732 generic.go:334] "Generic (PLEG): container finished" podID="18b68f5e-a1b4-4f52-9a4e-5967735ec105" containerID="54fada92647ba12febbb920093f9bc8e7464da78204240a5a00d06baf380ab69" exitCode=0 Jan 31 09:20:55 crc kubenswrapper[4732]: I0131 09:20:55.076048 4732 generic.go:334] "Generic (PLEG): container finished" podID="18b68f5e-a1b4-4f52-9a4e-5967735ec105" containerID="fd1347f500390657be8ce2ee2c537ab8800d5072a87c360ae87e57f1fb6a1882" exitCode=0 Jan 31 09:20:55 crc kubenswrapper[4732]: I0131 09:20:55.076061 4732 generic.go:334] "Generic (PLEG): container finished" podID="18b68f5e-a1b4-4f52-9a4e-5967735ec105" containerID="8fa21222e4cd9c26bf7a8699721788dc5f3bcbad179ebfca9e1ff5dd4f7eca9f" exitCode=0 Jan 31 09:20:55 crc kubenswrapper[4732]: I0131 09:20:55.076069 4732 generic.go:334] "Generic (PLEG): container finished" podID="18b68f5e-a1b4-4f52-9a4e-5967735ec105" containerID="72926ae802ba11d13ab77c79539bfcc94151936d207f632232dd7d1f41349d6d" exitCode=0 Jan 31 09:20:55 crc kubenswrapper[4732]: I0131 09:20:55.076077 4732 generic.go:334] "Generic (PLEG): container finished" podID="18b68f5e-a1b4-4f52-9a4e-5967735ec105" containerID="fa17ab717c5754facbf3da31cb4afd3477236bd426c65c9122642f27b5886fb9" exitCode=0 Jan 31 09:20:55 crc kubenswrapper[4732]: I0131 09:20:55.076037 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"18b68f5e-a1b4-4f52-9a4e-5967735ec105","Type":"ContainerDied","Data":"54fada92647ba12febbb920093f9bc8e7464da78204240a5a00d06baf380ab69"} Jan 31 09:20:55 crc kubenswrapper[4732]: 
I0131 09:20:55.076126 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"18b68f5e-a1b4-4f52-9a4e-5967735ec105","Type":"ContainerDied","Data":"fd1347f500390657be8ce2ee2c537ab8800d5072a87c360ae87e57f1fb6a1882"} Jan 31 09:20:55 crc kubenswrapper[4732]: I0131 09:20:55.076138 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"18b68f5e-a1b4-4f52-9a4e-5967735ec105","Type":"ContainerDied","Data":"8fa21222e4cd9c26bf7a8699721788dc5f3bcbad179ebfca9e1ff5dd4f7eca9f"} Jan 31 09:20:55 crc kubenswrapper[4732]: I0131 09:20:55.076148 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"18b68f5e-a1b4-4f52-9a4e-5967735ec105","Type":"ContainerDied","Data":"72926ae802ba11d13ab77c79539bfcc94151936d207f632232dd7d1f41349d6d"} Jan 31 09:20:55 crc kubenswrapper[4732]: I0131 09:20:55.076159 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"18b68f5e-a1b4-4f52-9a4e-5967735ec105","Type":"ContainerDied","Data":"fa17ab717c5754facbf3da31cb4afd3477236bd426c65c9122642f27b5886fb9"} Jan 31 09:20:55 crc kubenswrapper[4732]: I0131 09:20:55.084260 4732 generic.go:334] "Generic (PLEG): container finished" podID="ea3117d7-0038-4ca5-bee5-ae76db9a12eb" containerID="b50b39b84c6087e4699184efc06a407c325204a64a826bce844fec4c5164858e" exitCode=0 Jan 31 09:20:55 crc kubenswrapper[4732]: I0131 09:20:55.084287 4732 generic.go:334] "Generic (PLEG): container finished" podID="ea3117d7-0038-4ca5-bee5-ae76db9a12eb" containerID="3a8ebcfcb039bb39617fcee0e053ba202cf3bccf0891372d90f2647abbe5c1ea" exitCode=0 Jan 31 09:20:55 crc kubenswrapper[4732]: I0131 09:20:55.084294 4732 generic.go:334] "Generic (PLEG): container finished" podID="ea3117d7-0038-4ca5-bee5-ae76db9a12eb" containerID="420022f2e68e290b989b3165de84484bcc24a80cd43dfb94fa6e26433aff9a55" exitCode=0 Jan 31 09:20:55 crc kubenswrapper[4732]: I0131 09:20:55.084302 4732 generic.go:334] "Generic (PLEG): container finished" podID="ea3117d7-0038-4ca5-bee5-ae76db9a12eb" containerID="fd00da0aa47cdd0410c40f7add08c30b0950cfafc21a202b2619d006f368871b" exitCode=0 Jan 31 09:20:55 crc kubenswrapper[4732]: I0131 09:20:55.084313 4732 generic.go:334] "Generic (PLEG): container finished" podID="ea3117d7-0038-4ca5-bee5-ae76db9a12eb" containerID="78f74f08eb824105edcd5f1f9501b3fa915219d39b763fb391f67d7c1c054292" exitCode=0 Jan 31 09:20:55 crc kubenswrapper[4732]: I0131 09:20:55.084319 4732 generic.go:334] "Generic (PLEG): container finished" podID="ea3117d7-0038-4ca5-bee5-ae76db9a12eb" containerID="655a856811f2b949bc2c94495ba00a853818330f4ec09b9aeffd1751288aff79" exitCode=0 Jan 31 09:20:55 crc kubenswrapper[4732]: I0131 09:20:55.084326 4732 generic.go:334] "Generic (PLEG): container finished" podID="ea3117d7-0038-4ca5-bee5-ae76db9a12eb" containerID="61d667d399369be98c22e8ccc9f3b4cb18a2a2b489b34f036b2fc88c0e71e429" exitCode=0 Jan 31 09:20:55 crc kubenswrapper[4732]: I0131 09:20:55.084333 4732 generic.go:334] "Generic (PLEG): container finished" podID="ea3117d7-0038-4ca5-bee5-ae76db9a12eb" containerID="039961fef86b21e927d5ebdf1e2eb67c58775d812f1269411f3f0411c895a43f" exitCode=0 Jan 31 09:20:55 crc kubenswrapper[4732]: I0131 09:20:55.084339 4732 generic.go:334] "Generic (PLEG): container finished" podID="ea3117d7-0038-4ca5-bee5-ae76db9a12eb" containerID="ada6f8ad69317a784118495c96beb36d701ce7b36209a86b9a936d7cc110e3c2" exitCode=0 Jan 31 09:20:55 crc kubenswrapper[4732]: I0131 09:20:55.084345 
4732 generic.go:334] "Generic (PLEG): container finished" podID="ea3117d7-0038-4ca5-bee5-ae76db9a12eb" containerID="8d252efa39cfce425e89f4670019c9177c3b0c93daee21298b31748b98d341f0" exitCode=0 Jan 31 09:20:55 crc kubenswrapper[4732]: I0131 09:20:55.084351 4732 generic.go:334] "Generic (PLEG): container finished" podID="ea3117d7-0038-4ca5-bee5-ae76db9a12eb" containerID="6e7852a660c0fcf3225ca272b8e2af7fc11735fb24d2d14460cde6438a7beb2d" exitCode=0 Jan 31 09:20:55 crc kubenswrapper[4732]: I0131 09:20:55.084357 4732 generic.go:334] "Generic (PLEG): container finished" podID="ea3117d7-0038-4ca5-bee5-ae76db9a12eb" containerID="8f37816afb654d1e776d0d4eec1d440d5af5c0af6bb7163bf1822d00c7129008" exitCode=0 Jan 31 09:20:55 crc kubenswrapper[4732]: I0131 09:20:55.084399 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"ea3117d7-0038-4ca5-bee5-ae76db9a12eb","Type":"ContainerDied","Data":"b50b39b84c6087e4699184efc06a407c325204a64a826bce844fec4c5164858e"} Jan 31 09:20:55 crc kubenswrapper[4732]: I0131 09:20:55.084426 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"ea3117d7-0038-4ca5-bee5-ae76db9a12eb","Type":"ContainerDied","Data":"3a8ebcfcb039bb39617fcee0e053ba202cf3bccf0891372d90f2647abbe5c1ea"} Jan 31 09:20:55 crc kubenswrapper[4732]: I0131 09:20:55.084441 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"ea3117d7-0038-4ca5-bee5-ae76db9a12eb","Type":"ContainerDied","Data":"420022f2e68e290b989b3165de84484bcc24a80cd43dfb94fa6e26433aff9a55"} Jan 31 09:20:55 crc kubenswrapper[4732]: I0131 09:20:55.084453 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"ea3117d7-0038-4ca5-bee5-ae76db9a12eb","Type":"ContainerDied","Data":"fd00da0aa47cdd0410c40f7add08c30b0950cfafc21a202b2619d006f368871b"} Jan 31 09:20:55 crc kubenswrapper[4732]: I0131 09:20:55.084462 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"ea3117d7-0038-4ca5-bee5-ae76db9a12eb","Type":"ContainerDied","Data":"78f74f08eb824105edcd5f1f9501b3fa915219d39b763fb391f67d7c1c054292"} Jan 31 09:20:55 crc kubenswrapper[4732]: I0131 09:20:55.084470 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"ea3117d7-0038-4ca5-bee5-ae76db9a12eb","Type":"ContainerDied","Data":"655a856811f2b949bc2c94495ba00a853818330f4ec09b9aeffd1751288aff79"} Jan 31 09:20:55 crc kubenswrapper[4732]: I0131 09:20:55.084478 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"ea3117d7-0038-4ca5-bee5-ae76db9a12eb","Type":"ContainerDied","Data":"61d667d399369be98c22e8ccc9f3b4cb18a2a2b489b34f036b2fc88c0e71e429"} Jan 31 09:20:55 crc kubenswrapper[4732]: I0131 09:20:55.084486 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"ea3117d7-0038-4ca5-bee5-ae76db9a12eb","Type":"ContainerDied","Data":"039961fef86b21e927d5ebdf1e2eb67c58775d812f1269411f3f0411c895a43f"} Jan 31 09:20:55 crc kubenswrapper[4732]: I0131 09:20:55.084494 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"ea3117d7-0038-4ca5-bee5-ae76db9a12eb","Type":"ContainerDied","Data":"ada6f8ad69317a784118495c96beb36d701ce7b36209a86b9a936d7cc110e3c2"} Jan 31 09:20:55 crc kubenswrapper[4732]: I0131 09:20:55.084502 4732 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"ea3117d7-0038-4ca5-bee5-ae76db9a12eb","Type":"ContainerDied","Data":"8d252efa39cfce425e89f4670019c9177c3b0c93daee21298b31748b98d341f0"} Jan 31 09:20:55 crc kubenswrapper[4732]: I0131 09:20:55.084512 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"ea3117d7-0038-4ca5-bee5-ae76db9a12eb","Type":"ContainerDied","Data":"6e7852a660c0fcf3225ca272b8e2af7fc11735fb24d2d14460cde6438a7beb2d"} Jan 31 09:20:55 crc kubenswrapper[4732]: I0131 09:20:55.084520 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"ea3117d7-0038-4ca5-bee5-ae76db9a12eb","Type":"ContainerDied","Data":"8f37816afb654d1e776d0d4eec1d440d5af5c0af6bb7163bf1822d00c7129008"} Jan 31 09:20:55 crc kubenswrapper[4732]: I0131 09:20:55.091553 4732 generic.go:334] "Generic (PLEG): container finished" podID="8cb5e63b-882d-4388-abb1-130923832c9f" containerID="783febaf4bf86ebe7b225017d95af46ce165aac17683180fbdfbb6b8367cdec5" exitCode=0 Jan 31 09:20:55 crc kubenswrapper[4732]: I0131 09:20:55.091623 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-proxy-7d8cf99555-lvwdq" event={"ID":"8cb5e63b-882d-4388-abb1-130923832c9f","Type":"ContainerDied","Data":"783febaf4bf86ebe7b225017d95af46ce165aac17683180fbdfbb6b8367cdec5"} Jan 31 09:20:55 crc kubenswrapper[4732]: I0131 09:20:55.093536 4732 scope.go:117] "RemoveContainer" containerID="63aa4ab687f026724734c9f9bad3c34ed0dcd405198734688082bd1c838517d3" Jan 31 09:20:55 crc kubenswrapper[4732]: I0131 09:20:55.093582 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hwbrk" Jan 31 09:20:55 crc kubenswrapper[4732]: I0131 09:20:55.461030 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-proxy-7d8cf99555-lvwdq" Jan 31 09:20:55 crc kubenswrapper[4732]: I0131 09:20:55.556055 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8cb5e63b-882d-4388-abb1-130923832c9f-etc-swift\") pod \"8cb5e63b-882d-4388-abb1-130923832c9f\" (UID: \"8cb5e63b-882d-4388-abb1-130923832c9f\") " Jan 31 09:20:55 crc kubenswrapper[4732]: I0131 09:20:55.556113 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8cb5e63b-882d-4388-abb1-130923832c9f-run-httpd\") pod \"8cb5e63b-882d-4388-abb1-130923832c9f\" (UID: \"8cb5e63b-882d-4388-abb1-130923832c9f\") " Jan 31 09:20:55 crc kubenswrapper[4732]: I0131 09:20:55.556140 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8cb5e63b-882d-4388-abb1-130923832c9f-log-httpd\") pod \"8cb5e63b-882d-4388-abb1-130923832c9f\" (UID: \"8cb5e63b-882d-4388-abb1-130923832c9f\") " Jan 31 09:20:55 crc kubenswrapper[4732]: I0131 09:20:55.556169 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cb5e63b-882d-4388-abb1-130923832c9f-config-data\") pod \"8cb5e63b-882d-4388-abb1-130923832c9f\" (UID: \"8cb5e63b-882d-4388-abb1-130923832c9f\") " Jan 31 09:20:55 crc kubenswrapper[4732]: I0131 09:20:55.556207 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5n98z\" (UniqueName: \"kubernetes.io/projected/8cb5e63b-882d-4388-abb1-130923832c9f-kube-api-access-5n98z\") pod \"8cb5e63b-882d-4388-abb1-130923832c9f\" (UID: \"8cb5e63b-882d-4388-abb1-130923832c9f\") " Jan 31 09:20:55 crc kubenswrapper[4732]: I0131 09:20:55.557521 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8cb5e63b-882d-4388-abb1-130923832c9f-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8cb5e63b-882d-4388-abb1-130923832c9f" (UID: "8cb5e63b-882d-4388-abb1-130923832c9f"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:20:55 crc kubenswrapper[4732]: I0131 09:20:55.557764 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8cb5e63b-882d-4388-abb1-130923832c9f-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8cb5e63b-882d-4388-abb1-130923832c9f" (UID: "8cb5e63b-882d-4388-abb1-130923832c9f"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:20:55 crc kubenswrapper[4732]: I0131 09:20:55.560897 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cb5e63b-882d-4388-abb1-130923832c9f-kube-api-access-5n98z" (OuterVolumeSpecName: "kube-api-access-5n98z") pod "8cb5e63b-882d-4388-abb1-130923832c9f" (UID: "8cb5e63b-882d-4388-abb1-130923832c9f"). InnerVolumeSpecName "kube-api-access-5n98z". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:20:55 crc kubenswrapper[4732]: I0131 09:20:55.562276 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cb5e63b-882d-4388-abb1-130923832c9f-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "8cb5e63b-882d-4388-abb1-130923832c9f" (UID: "8cb5e63b-882d-4388-abb1-130923832c9f"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:20:55 crc kubenswrapper[4732]: I0131 09:20:55.605058 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cb5e63b-882d-4388-abb1-130923832c9f-config-data" (OuterVolumeSpecName: "config-data") pod "8cb5e63b-882d-4388-abb1-130923832c9f" (UID: "8cb5e63b-882d-4388-abb1-130923832c9f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:20:55 crc kubenswrapper[4732]: I0131 09:20:55.658262 4732 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8cb5e63b-882d-4388-abb1-130923832c9f-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 31 09:20:55 crc kubenswrapper[4732]: I0131 09:20:55.658300 4732 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8cb5e63b-882d-4388-abb1-130923832c9f-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 31 09:20:55 crc kubenswrapper[4732]: I0131 09:20:55.658312 4732 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8cb5e63b-882d-4388-abb1-130923832c9f-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 31 09:20:55 crc kubenswrapper[4732]: I0131 09:20:55.658325 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cb5e63b-882d-4388-abb1-130923832c9f-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 09:20:55 crc kubenswrapper[4732]: I0131 09:20:55.658336 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5n98z\" (UniqueName: \"kubernetes.io/projected/8cb5e63b-882d-4388-abb1-130923832c9f-kube-api-access-5n98z\") on node \"crc\" DevicePath \"\"" Jan 31 09:20:56 crc kubenswrapper[4732]: I0131 09:20:56.107476 4732 generic.go:334] "Generic (PLEG): container finished" podID="8cb5e63b-882d-4388-abb1-130923832c9f" containerID="4835492a2361cd167a0d1bc3cdf3c2b8f82043e40e4867b9666ee8ab6a6362e6" exitCode=0 Jan 31 09:20:56 crc kubenswrapper[4732]: I0131 09:20:56.107540 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-proxy-7d8cf99555-lvwdq" Jan 31 09:20:56 crc kubenswrapper[4732]: I0131 09:20:56.107575 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-proxy-7d8cf99555-lvwdq" event={"ID":"8cb5e63b-882d-4388-abb1-130923832c9f","Type":"ContainerDied","Data":"4835492a2361cd167a0d1bc3cdf3c2b8f82043e40e4867b9666ee8ab6a6362e6"} Jan 31 09:20:56 crc kubenswrapper[4732]: I0131 09:20:56.108226 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-proxy-7d8cf99555-lvwdq" event={"ID":"8cb5e63b-882d-4388-abb1-130923832c9f","Type":"ContainerDied","Data":"32df1a9318cd9e4682742a9570f27db68b7c7b206dbaec4cd560f06f827fb57e"} Jan 31 09:20:56 crc kubenswrapper[4732]: I0131 09:20:56.108261 4732 scope.go:117] "RemoveContainer" containerID="4835492a2361cd167a0d1bc3cdf3c2b8f82043e40e4867b9666ee8ab6a6362e6" Jan 31 09:20:56 crc kubenswrapper[4732]: I0131 09:20:56.118312 4732 generic.go:334] "Generic (PLEG): container finished" podID="18b68f5e-a1b4-4f52-9a4e-5967735ec105" containerID="01d34a0e2eb5b105a83439a771538234555a443b0c13ee4cb087d83ae6e5a172" exitCode=0 Jan 31 09:20:56 crc kubenswrapper[4732]: I0131 09:20:56.118343 4732 generic.go:334] "Generic (PLEG): container finished" podID="18b68f5e-a1b4-4f52-9a4e-5967735ec105" containerID="463912ed98b787656c483ddcf4b73e963647c6d5df261cfe3fad78758d6d1b9d" exitCode=0 Jan 31 09:20:56 crc kubenswrapper[4732]: I0131 09:20:56.118352 4732 generic.go:334] "Generic (PLEG): container finished" podID="18b68f5e-a1b4-4f52-9a4e-5967735ec105" containerID="672f5feadfa214c6a0a943e614b3f6be2205caf0cfad789bc48eecd5126064c9" exitCode=0 Jan 31 09:20:56 crc kubenswrapper[4732]: I0131 09:20:56.118360 4732 generic.go:334] "Generic (PLEG): container finished" podID="18b68f5e-a1b4-4f52-9a4e-5967735ec105" containerID="caac1f6da96d02d4cab51830f080cd652323af90c1836570aef955dde0207cff" exitCode=0 Jan 31 09:20:56 crc kubenswrapper[4732]: I0131 09:20:56.118357 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"18b68f5e-a1b4-4f52-9a4e-5967735ec105","Type":"ContainerDied","Data":"01d34a0e2eb5b105a83439a771538234555a443b0c13ee4cb087d83ae6e5a172"} Jan 31 09:20:56 crc kubenswrapper[4732]: I0131 09:20:56.118423 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"18b68f5e-a1b4-4f52-9a4e-5967735ec105","Type":"ContainerDied","Data":"463912ed98b787656c483ddcf4b73e963647c6d5df261cfe3fad78758d6d1b9d"} Jan 31 09:20:56 crc kubenswrapper[4732]: I0131 09:20:56.118460 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"18b68f5e-a1b4-4f52-9a4e-5967735ec105","Type":"ContainerDied","Data":"672f5feadfa214c6a0a943e614b3f6be2205caf0cfad789bc48eecd5126064c9"} Jan 31 09:20:56 crc kubenswrapper[4732]: I0131 09:20:56.118481 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"18b68f5e-a1b4-4f52-9a4e-5967735ec105","Type":"ContainerDied","Data":"caac1f6da96d02d4cab51830f080cd652323af90c1836570aef955dde0207cff"} Jan 31 09:20:56 crc kubenswrapper[4732]: I0131 09:20:56.130336 4732 generic.go:334] "Generic (PLEG): container finished" podID="eb04e24b-fc92-4f2e-abcb-fa46706f699a" containerID="9bd8439c812d4db4031eaf201411d9f03f68a7e4fb620cd284c1316ed57ea8e9" exitCode=0 Jan 31 09:20:56 crc kubenswrapper[4732]: I0131 09:20:56.130366 4732 generic.go:334] "Generic (PLEG): container finished" 
podID="eb04e24b-fc92-4f2e-abcb-fa46706f699a" containerID="61eb8b513a113fbabe0d9a8fc9c197f8afac97222ddfaaa712ed87f070443521" exitCode=0 Jan 31 09:20:56 crc kubenswrapper[4732]: I0131 09:20:56.130373 4732 generic.go:334] "Generic (PLEG): container finished" podID="eb04e24b-fc92-4f2e-abcb-fa46706f699a" containerID="498871caf245c1aa003224c568a9512c564fb67d15571cad0c2fa636e92058ab" exitCode=0 Jan 31 09:20:56 crc kubenswrapper[4732]: I0131 09:20:56.130425 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"eb04e24b-fc92-4f2e-abcb-fa46706f699a","Type":"ContainerDied","Data":"9bd8439c812d4db4031eaf201411d9f03f68a7e4fb620cd284c1316ed57ea8e9"} Jan 31 09:20:56 crc kubenswrapper[4732]: I0131 09:20:56.130565 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"eb04e24b-fc92-4f2e-abcb-fa46706f699a","Type":"ContainerDied","Data":"61eb8b513a113fbabe0d9a8fc9c197f8afac97222ddfaaa712ed87f070443521"} Jan 31 09:20:56 crc kubenswrapper[4732]: I0131 09:20:56.130972 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"eb04e24b-fc92-4f2e-abcb-fa46706f699a","Type":"ContainerDied","Data":"498871caf245c1aa003224c568a9512c564fb67d15571cad0c2fa636e92058ab"} Jan 31 09:20:56 crc kubenswrapper[4732]: I0131 09:20:56.135212 4732 scope.go:117] "RemoveContainer" containerID="783febaf4bf86ebe7b225017d95af46ce165aac17683180fbdfbb6b8367cdec5" Jan 31 09:20:56 crc kubenswrapper[4732]: I0131 09:20:56.147153 4732 generic.go:334] "Generic (PLEG): container finished" podID="ea3117d7-0038-4ca5-bee5-ae76db9a12eb" containerID="f7bb7853880d63c47b634d75defb148b4ed41cf77de6d39f02020df2ff5b03d3" exitCode=0 Jan 31 09:20:56 crc kubenswrapper[4732]: I0131 09:20:56.147183 4732 generic.go:334] "Generic (PLEG): container finished" podID="ea3117d7-0038-4ca5-bee5-ae76db9a12eb" containerID="d31d074c83b389c8db51d7fd48db465e4e9e513ed6084f0c730eaa69444cb6c5" exitCode=0 Jan 31 09:20:56 crc kubenswrapper[4732]: I0131 09:20:56.147211 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"ea3117d7-0038-4ca5-bee5-ae76db9a12eb","Type":"ContainerDied","Data":"f7bb7853880d63c47b634d75defb148b4ed41cf77de6d39f02020df2ff5b03d3"} Jan 31 09:20:56 crc kubenswrapper[4732]: I0131 09:20:56.147241 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"ea3117d7-0038-4ca5-bee5-ae76db9a12eb","Type":"ContainerDied","Data":"d31d074c83b389c8db51d7fd48db465e4e9e513ed6084f0c730eaa69444cb6c5"} Jan 31 09:20:56 crc kubenswrapper[4732]: I0131 09:20:56.150545 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-proxy-7d8cf99555-lvwdq"] Jan 31 09:20:56 crc kubenswrapper[4732]: I0131 09:20:56.159111 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-proxy-7d8cf99555-lvwdq"] Jan 31 09:20:56 crc kubenswrapper[4732]: I0131 09:20:56.163813 4732 scope.go:117] "RemoveContainer" containerID="4835492a2361cd167a0d1bc3cdf3c2b8f82043e40e4867b9666ee8ab6a6362e6" Jan 31 09:20:56 crc kubenswrapper[4732]: E0131 09:20:56.164379 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4835492a2361cd167a0d1bc3cdf3c2b8f82043e40e4867b9666ee8ab6a6362e6\": container with ID starting with 4835492a2361cd167a0d1bc3cdf3c2b8f82043e40e4867b9666ee8ab6a6362e6 not found: ID does not exist" 
containerID="4835492a2361cd167a0d1bc3cdf3c2b8f82043e40e4867b9666ee8ab6a6362e6" Jan 31 09:20:56 crc kubenswrapper[4732]: I0131 09:20:56.164437 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4835492a2361cd167a0d1bc3cdf3c2b8f82043e40e4867b9666ee8ab6a6362e6"} err="failed to get container status \"4835492a2361cd167a0d1bc3cdf3c2b8f82043e40e4867b9666ee8ab6a6362e6\": rpc error: code = NotFound desc = could not find container \"4835492a2361cd167a0d1bc3cdf3c2b8f82043e40e4867b9666ee8ab6a6362e6\": container with ID starting with 4835492a2361cd167a0d1bc3cdf3c2b8f82043e40e4867b9666ee8ab6a6362e6 not found: ID does not exist" Jan 31 09:20:56 crc kubenswrapper[4732]: I0131 09:20:56.164470 4732 scope.go:117] "RemoveContainer" containerID="783febaf4bf86ebe7b225017d95af46ce165aac17683180fbdfbb6b8367cdec5" Jan 31 09:20:56 crc kubenswrapper[4732]: E0131 09:20:56.165123 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"783febaf4bf86ebe7b225017d95af46ce165aac17683180fbdfbb6b8367cdec5\": container with ID starting with 783febaf4bf86ebe7b225017d95af46ce165aac17683180fbdfbb6b8367cdec5 not found: ID does not exist" containerID="783febaf4bf86ebe7b225017d95af46ce165aac17683180fbdfbb6b8367cdec5" Jan 31 09:20:56 crc kubenswrapper[4732]: I0131 09:20:56.165151 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"783febaf4bf86ebe7b225017d95af46ce165aac17683180fbdfbb6b8367cdec5"} err="failed to get container status \"783febaf4bf86ebe7b225017d95af46ce165aac17683180fbdfbb6b8367cdec5\": rpc error: code = NotFound desc = could not find container \"783febaf4bf86ebe7b225017d95af46ce165aac17683180fbdfbb6b8367cdec5\": container with ID starting with 783febaf4bf86ebe7b225017d95af46ce165aac17683180fbdfbb6b8367cdec5 not found: ID does not exist" Jan 31 09:20:56 crc kubenswrapper[4732]: I0131 09:20:56.555748 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cb5e63b-882d-4388-abb1-130923832c9f" path="/var/lib/kubelet/pods/8cb5e63b-882d-4388-abb1-130923832c9f/volumes" Jan 31 09:20:56 crc kubenswrapper[4732]: I0131 09:20:56.557993 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e25bc49e-1bbe-4103-b751-fee5d86e7a92" path="/var/lib/kubelet/pods/e25bc49e-1bbe-4103-b751-fee5d86e7a92/volumes" Jan 31 09:20:56 crc kubenswrapper[4732]: I0131 09:20:56.558870 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa92affd-0106-4b01-b96c-8f2b0459ee3a" path="/var/lib/kubelet/pods/fa92affd-0106-4b01-b96c-8f2b0459ee3a/volumes" Jan 31 09:21:24 crc kubenswrapper[4732]: E0131 09:21:24.906988 4732 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podea3117d7_0038_4ca5_bee5_ae76db9a12eb.slice/crio-conmon-186ff48385c2286a698108d59672234436dfc7dbb5bac5e777070affa544b217.scope\": RecentStats: unable to find data in memory cache]" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.128949 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-storage-2" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.143212 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-storage-1" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.145832 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.249201 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"18b68f5e-a1b4-4f52-9a4e-5967735ec105\" (UID: \"18b68f5e-a1b4-4f52-9a4e-5967735ec105\") " Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.249273 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ea3117d7-0038-4ca5-bee5-ae76db9a12eb-etc-swift\") pod \"ea3117d7-0038-4ca5-bee5-ae76db9a12eb\" (UID: \"ea3117d7-0038-4ca5-bee5-ae76db9a12eb\") " Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.249332 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/eb04e24b-fc92-4f2e-abcb-fa46706f699a-cache\") pod \"eb04e24b-fc92-4f2e-abcb-fa46706f699a\" (UID: \"eb04e24b-fc92-4f2e-abcb-fa46706f699a\") " Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.249364 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-txhcm\" (UniqueName: \"kubernetes.io/projected/18b68f5e-a1b4-4f52-9a4e-5967735ec105-kube-api-access-txhcm\") pod \"18b68f5e-a1b4-4f52-9a4e-5967735ec105\" (UID: \"18b68f5e-a1b4-4f52-9a4e-5967735ec105\") " Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.249388 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/18b68f5e-a1b4-4f52-9a4e-5967735ec105-etc-swift\") pod \"18b68f5e-a1b4-4f52-9a4e-5967735ec105\" (UID: \"18b68f5e-a1b4-4f52-9a4e-5967735ec105\") " Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.249414 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9zdfg\" (UniqueName: \"kubernetes.io/projected/ea3117d7-0038-4ca5-bee5-ae76db9a12eb-kube-api-access-9zdfg\") pod \"ea3117d7-0038-4ca5-bee5-ae76db9a12eb\" (UID: \"ea3117d7-0038-4ca5-bee5-ae76db9a12eb\") " Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.249446 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"eb04e24b-fc92-4f2e-abcb-fa46706f699a\" (UID: \"eb04e24b-fc92-4f2e-abcb-fa46706f699a\") " Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.249483 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/18b68f5e-a1b4-4f52-9a4e-5967735ec105-lock\") pod \"18b68f5e-a1b4-4f52-9a4e-5967735ec105\" (UID: \"18b68f5e-a1b4-4f52-9a4e-5967735ec105\") " Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.249510 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/eb04e24b-fc92-4f2e-abcb-fa46706f699a-lock\") pod \"eb04e24b-fc92-4f2e-abcb-fa46706f699a\" (UID: \"eb04e24b-fc92-4f2e-abcb-fa46706f699a\") " Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.249609 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pqbp6\" (UniqueName: 
\"kubernetes.io/projected/eb04e24b-fc92-4f2e-abcb-fa46706f699a-kube-api-access-pqbp6\") pod \"eb04e24b-fc92-4f2e-abcb-fa46706f699a\" (UID: \"eb04e24b-fc92-4f2e-abcb-fa46706f699a\") " Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.249637 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ea3117d7-0038-4ca5-bee5-ae76db9a12eb\" (UID: \"ea3117d7-0038-4ca5-bee5-ae76db9a12eb\") " Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.249681 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/ea3117d7-0038-4ca5-bee5-ae76db9a12eb-lock\") pod \"ea3117d7-0038-4ca5-bee5-ae76db9a12eb\" (UID: \"ea3117d7-0038-4ca5-bee5-ae76db9a12eb\") " Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.249716 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/18b68f5e-a1b4-4f52-9a4e-5967735ec105-cache\") pod \"18b68f5e-a1b4-4f52-9a4e-5967735ec105\" (UID: \"18b68f5e-a1b4-4f52-9a4e-5967735ec105\") " Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.249736 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/eb04e24b-fc92-4f2e-abcb-fa46706f699a-etc-swift\") pod \"eb04e24b-fc92-4f2e-abcb-fa46706f699a\" (UID: \"eb04e24b-fc92-4f2e-abcb-fa46706f699a\") " Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.249777 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/ea3117d7-0038-4ca5-bee5-ae76db9a12eb-cache\") pod \"ea3117d7-0038-4ca5-bee5-ae76db9a12eb\" (UID: \"ea3117d7-0038-4ca5-bee5-ae76db9a12eb\") " Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.249951 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb04e24b-fc92-4f2e-abcb-fa46706f699a-cache" (OuterVolumeSpecName: "cache") pod "eb04e24b-fc92-4f2e-abcb-fa46706f699a" (UID: "eb04e24b-fc92-4f2e-abcb-fa46706f699a"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.250124 4732 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/eb04e24b-fc92-4f2e-abcb-fa46706f699a-cache\") on node \"crc\" DevicePath \"\"" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.250441 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb04e24b-fc92-4f2e-abcb-fa46706f699a-lock" (OuterVolumeSpecName: "lock") pod "eb04e24b-fc92-4f2e-abcb-fa46706f699a" (UID: "eb04e24b-fc92-4f2e-abcb-fa46706f699a"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.250906 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18b68f5e-a1b4-4f52-9a4e-5967735ec105-cache" (OuterVolumeSpecName: "cache") pod "18b68f5e-a1b4-4f52-9a4e-5967735ec105" (UID: "18b68f5e-a1b4-4f52-9a4e-5967735ec105"). InnerVolumeSpecName "cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.250933 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea3117d7-0038-4ca5-bee5-ae76db9a12eb-lock" (OuterVolumeSpecName: "lock") pod "ea3117d7-0038-4ca5-bee5-ae76db9a12eb" (UID: "ea3117d7-0038-4ca5-bee5-ae76db9a12eb"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.262937 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea3117d7-0038-4ca5-bee5-ae76db9a12eb-cache" (OuterVolumeSpecName: "cache") pod "ea3117d7-0038-4ca5-bee5-ae76db9a12eb" (UID: "ea3117d7-0038-4ca5-bee5-ae76db9a12eb"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.263806 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18b68f5e-a1b4-4f52-9a4e-5967735ec105-lock" (OuterVolumeSpecName: "lock") pod "18b68f5e-a1b4-4f52-9a4e-5967735ec105" (UID: "18b68f5e-a1b4-4f52-9a4e-5967735ec105"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.271339 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "swift") pod "eb04e24b-fc92-4f2e-abcb-fa46706f699a" (UID: "eb04e24b-fc92-4f2e-abcb-fa46706f699a"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.271364 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "swift") pod "18b68f5e-a1b4-4f52-9a4e-5967735ec105" (UID: "18b68f5e-a1b4-4f52-9a4e-5967735ec105"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.271509 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb04e24b-fc92-4f2e-abcb-fa46706f699a-kube-api-access-pqbp6" (OuterVolumeSpecName: "kube-api-access-pqbp6") pod "eb04e24b-fc92-4f2e-abcb-fa46706f699a" (UID: "eb04e24b-fc92-4f2e-abcb-fa46706f699a"). InnerVolumeSpecName "kube-api-access-pqbp6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.271946 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18b68f5e-a1b4-4f52-9a4e-5967735ec105-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "18b68f5e-a1b4-4f52-9a4e-5967735ec105" (UID: "18b68f5e-a1b4-4f52-9a4e-5967735ec105"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.271568 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "swift") pod "ea3117d7-0038-4ca5-bee5-ae76db9a12eb" (UID: "ea3117d7-0038-4ca5-bee5-ae76db9a12eb"). InnerVolumeSpecName "local-storage08-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.271800 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18b68f5e-a1b4-4f52-9a4e-5967735ec105-kube-api-access-txhcm" (OuterVolumeSpecName: "kube-api-access-txhcm") pod "18b68f5e-a1b4-4f52-9a4e-5967735ec105" (UID: "18b68f5e-a1b4-4f52-9a4e-5967735ec105"). InnerVolumeSpecName "kube-api-access-txhcm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.272806 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea3117d7-0038-4ca5-bee5-ae76db9a12eb-kube-api-access-9zdfg" (OuterVolumeSpecName: "kube-api-access-9zdfg") pod "ea3117d7-0038-4ca5-bee5-ae76db9a12eb" (UID: "ea3117d7-0038-4ca5-bee5-ae76db9a12eb"). InnerVolumeSpecName "kube-api-access-9zdfg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.276987 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea3117d7-0038-4ca5-bee5-ae76db9a12eb-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "ea3117d7-0038-4ca5-bee5-ae76db9a12eb" (UID: "ea3117d7-0038-4ca5-bee5-ae76db9a12eb"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.279307 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb04e24b-fc92-4f2e-abcb-fa46706f699a-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "eb04e24b-fc92-4f2e-abcb-fa46706f699a" (UID: "eb04e24b-fc92-4f2e-abcb-fa46706f699a"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.350725 4732 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/18b68f5e-a1b4-4f52-9a4e-5967735ec105-lock\") on node \"crc\" DevicePath \"\"" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.350758 4732 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/eb04e24b-fc92-4f2e-abcb-fa46706f699a-lock\") on node \"crc\" DevicePath \"\"" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.350772 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pqbp6\" (UniqueName: \"kubernetes.io/projected/eb04e24b-fc92-4f2e-abcb-fa46706f699a-kube-api-access-pqbp6\") on node \"crc\" DevicePath \"\"" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.350810 4732 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.350823 4732 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/ea3117d7-0038-4ca5-bee5-ae76db9a12eb-lock\") on node \"crc\" DevicePath \"\"" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.350833 4732 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/18b68f5e-a1b4-4f52-9a4e-5967735ec105-cache\") on node \"crc\" DevicePath \"\"" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.350844 4732 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/eb04e24b-fc92-4f2e-abcb-fa46706f699a-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.350854 4732 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/ea3117d7-0038-4ca5-bee5-ae76db9a12eb-cache\") on node \"crc\" DevicePath \"\"" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.350875 4732 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.350885 4732 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ea3117d7-0038-4ca5-bee5-ae76db9a12eb-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.350893 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-txhcm\" (UniqueName: \"kubernetes.io/projected/18b68f5e-a1b4-4f52-9a4e-5967735ec105-kube-api-access-txhcm\") on node \"crc\" DevicePath \"\"" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.350902 4732 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/18b68f5e-a1b4-4f52-9a4e-5967735ec105-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.350910 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9zdfg\" (UniqueName: \"kubernetes.io/projected/ea3117d7-0038-4ca5-bee5-ae76db9a12eb-kube-api-access-9zdfg\") on node \"crc\" DevicePath \"\"" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.350927 4732 reconciler_common.go:286] "operationExecutor.UnmountDevice 
started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.362825 4732 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.366769 4732 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.375231 4732 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.418643 4732 generic.go:334] "Generic (PLEG): container finished" podID="eb04e24b-fc92-4f2e-abcb-fa46706f699a" containerID="4591c4a657bf0f19e528acc7cb5586ad0a5dffe6072302f778c6b3daec4b890e" exitCode=137 Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.418819 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"eb04e24b-fc92-4f2e-abcb-fa46706f699a","Type":"ContainerDied","Data":"4591c4a657bf0f19e528acc7cb5586ad0a5dffe6072302f778c6b3daec4b890e"} Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.418873 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"eb04e24b-fc92-4f2e-abcb-fa46706f699a","Type":"ContainerDied","Data":"484b2ea8fc5f3e2a8486b0977aa570fbe958192eb24867f074d00e930ab4b1a3"} Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.418868 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-storage-1" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.418910 4732 scope.go:117] "RemoveContainer" containerID="4591c4a657bf0f19e528acc7cb5586ad0a5dffe6072302f778c6b3daec4b890e" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.427151 4732 generic.go:334] "Generic (PLEG): container finished" podID="18b68f5e-a1b4-4f52-9a4e-5967735ec105" containerID="261d59a3c1b5662c1245d9e0a26d6fdf5a756126f782cf23384b915262209828" exitCode=137 Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.427209 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"18b68f5e-a1b4-4f52-9a4e-5967735ec105","Type":"ContainerDied","Data":"261d59a3c1b5662c1245d9e0a26d6fdf5a756126f782cf23384b915262209828"} Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.427260 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"18b68f5e-a1b4-4f52-9a4e-5967735ec105","Type":"ContainerDied","Data":"9213fe58caadec775e92ed28d6bba2a8ef3c6d5b541e8e1e305d2da7b48f0916"} Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.427275 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"463912ed98b787656c483ddcf4b73e963647c6d5df261cfe3fad78758d6d1b9d"} Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.427287 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7ce880a4ac662620fb475dd18995992d32b8be418497d54beba8a09c1335c1aa"} Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.427294 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"54fada92647ba12febbb920093f9bc8e7464da78204240a5a00d06baf380ab69"} Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.427302 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fd1347f500390657be8ce2ee2c537ab8800d5072a87c360ae87e57f1fb6a1882"} Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.427308 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"672f5feadfa214c6a0a943e614b3f6be2205caf0cfad789bc48eecd5126064c9"} Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.427315 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8fa21222e4cd9c26bf7a8699721788dc5f3bcbad179ebfca9e1ff5dd4f7eca9f"} Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.427317 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-storage-2" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.427321 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"72926ae802ba11d13ab77c79539bfcc94151936d207f632232dd7d1f41349d6d"} Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.427795 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fa17ab717c5754facbf3da31cb4afd3477236bd426c65c9122642f27b5886fb9"} Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.427811 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"caac1f6da96d02d4cab51830f080cd652323af90c1836570aef955dde0207cff"} Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.439647 4732 generic.go:334] "Generic (PLEG): container finished" podID="ea3117d7-0038-4ca5-bee5-ae76db9a12eb" containerID="186ff48385c2286a698108d59672234436dfc7dbb5bac5e777070affa544b217" exitCode=137 Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.439700 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"ea3117d7-0038-4ca5-bee5-ae76db9a12eb","Type":"ContainerDied","Data":"186ff48385c2286a698108d59672234436dfc7dbb5bac5e777070affa544b217"} Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.439725 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"186ff48385c2286a698108d59672234436dfc7dbb5bac5e777070affa544b217"} Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.439738 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f7bb7853880d63c47b634d75defb148b4ed41cf77de6d39f02020df2ff5b03d3"} Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.439745 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b50b39b84c6087e4699184efc06a407c325204a64a826bce844fec4c5164858e"} Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.439752 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3a8ebcfcb039bb39617fcee0e053ba202cf3bccf0891372d90f2647abbe5c1ea"} Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.439758 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"420022f2e68e290b989b3165de84484bcc24a80cd43dfb94fa6e26433aff9a55"} Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.439765 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fd00da0aa47cdd0410c40f7add08c30b0950cfafc21a202b2619d006f368871b"} Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.439772 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"78f74f08eb824105edcd5f1f9501b3fa915219d39b763fb391f67d7c1c054292"} Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.439779 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"655a856811f2b949bc2c94495ba00a853818330f4ec09b9aeffd1751288aff79"} Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.439786 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"61d667d399369be98c22e8ccc9f3b4cb18a2a2b489b34f036b2fc88c0e71e429"} Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.439794 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"039961fef86b21e927d5ebdf1e2eb67c58775d812f1269411f3f0411c895a43f"} Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.439800 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ada6f8ad69317a784118495c96beb36d701ce7b36209a86b9a936d7cc110e3c2"} Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.439806 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8d252efa39cfce425e89f4670019c9177c3b0c93daee21298b31748b98d341f0"} Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.439877 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6e7852a660c0fcf3225ca272b8e2af7fc11735fb24d2d14460cde6438a7beb2d"} Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.439832 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.439898 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8f37816afb654d1e776d0d4eec1d440d5af5c0af6bb7163bf1822d00c7129008"} Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.439952 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d31d074c83b389c8db51d7fd48db465e4e9e513ed6084f0c730eaa69444cb6c5"} Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.439969 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"ea3117d7-0038-4ca5-bee5-ae76db9a12eb","Type":"ContainerDied","Data":"6568c920e3e41f0ca77451bc255f08043b793e9b23cf5a92a349e3e4e75234e1"} Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.439996 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"186ff48385c2286a698108d59672234436dfc7dbb5bac5e777070affa544b217"} Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.440005 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f7bb7853880d63c47b634d75defb148b4ed41cf77de6d39f02020df2ff5b03d3"} Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.440012 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b50b39b84c6087e4699184efc06a407c325204a64a826bce844fec4c5164858e"} Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.440019 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3a8ebcfcb039bb39617fcee0e053ba202cf3bccf0891372d90f2647abbe5c1ea"} Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.440025 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"420022f2e68e290b989b3165de84484bcc24a80cd43dfb94fa6e26433aff9a55"} Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.440032 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"fd00da0aa47cdd0410c40f7add08c30b0950cfafc21a202b2619d006f368871b"} Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.440037 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"78f74f08eb824105edcd5f1f9501b3fa915219d39b763fb391f67d7c1c054292"} Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.440044 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"655a856811f2b949bc2c94495ba00a853818330f4ec09b9aeffd1751288aff79"} Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.440050 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"61d667d399369be98c22e8ccc9f3b4cb18a2a2b489b34f036b2fc88c0e71e429"} Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.440057 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"039961fef86b21e927d5ebdf1e2eb67c58775d812f1269411f3f0411c895a43f"} Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.440063 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ada6f8ad69317a784118495c96beb36d701ce7b36209a86b9a936d7cc110e3c2"} Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.440070 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8d252efa39cfce425e89f4670019c9177c3b0c93daee21298b31748b98d341f0"} Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.440077 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6e7852a660c0fcf3225ca272b8e2af7fc11735fb24d2d14460cde6438a7beb2d"} Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.440082 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8f37816afb654d1e776d0d4eec1d440d5af5c0af6bb7163bf1822d00c7129008"} Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.440087 4732 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d31d074c83b389c8db51d7fd48db465e4e9e513ed6084f0c730eaa69444cb6c5"} Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.443762 4732 scope.go:117] "RemoveContainer" containerID="9bd8439c812d4db4031eaf201411d9f03f68a7e4fb620cd284c1316ed57ea8e9" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.453417 4732 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.453448 4732 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.453459 4732 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.459365 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-storage-1"] Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.466083 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["swift-kuttl-tests/swift-storage-1"] Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.467840 4732 scope.go:117] "RemoveContainer" containerID="f7740f3734b2a6d8e15826537feeb63febadea03ccc4bc0d6f70700f6a75626f" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.489282 4732 scope.go:117] "RemoveContainer" containerID="22026d373e04df27564ec7eb67c62bdb36d5f926aefc6470f05958ca3bfafb6d" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.494717 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-storage-2"] Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.505221 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-storage-2"] Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.510768 4732 scope.go:117] "RemoveContainer" containerID="36c3a43f98d1be3fa4067ddd849e1f8adb09f6830fb90e1e103b4fee901b5ddb" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.511554 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-storage-0"] Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.517254 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-storage-0"] Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.528949 4732 scope.go:117] "RemoveContainer" containerID="3bedc7b7578ebe9948019c2fe597f44f31339a72681e217b76908ee8888b902f" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.547256 4732 scope.go:117] "RemoveContainer" containerID="61eb8b513a113fbabe0d9a8fc9c197f8afac97222ddfaaa712ed87f070443521" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.560868 4732 scope.go:117] "RemoveContainer" containerID="c9a37bdf8b63464e6bd6756605e3b1a2408e5416d00cda661d4d274b38bd7be7" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.577784 4732 scope.go:117] "RemoveContainer" containerID="bde9eaf63bb5ab51e11bacddaba56989bae7d6031fa19e2c1b77f92866dadea8" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.592312 4732 scope.go:117] "RemoveContainer" containerID="c9736add9e20e2e6d821daa6edc69baaef51fa7d36762e678ee4128cf4df9dbd" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.607705 4732 scope.go:117] "RemoveContainer" containerID="498871caf245c1aa003224c568a9512c564fb67d15571cad0c2fa636e92058ab" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.621350 4732 scope.go:117] "RemoveContainer" containerID="caafb470bfca795c3c7f1107e8a9e97700d33c459a35140d551bfc62e596d5a6" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.639400 4732 scope.go:117] "RemoveContainer" containerID="647e5cb59a1fa496ac972ece6ee175c218e028f20f65b8bd02f6b5c3d914e53c" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.654007 4732 scope.go:117] "RemoveContainer" containerID="1a43f64c141cb300a8ec9eb5a477bb5c1933446a32abc193cb5173f92ba21178" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.666373 4732 scope.go:117] "RemoveContainer" containerID="a00d79844fd0811e549149a48676d54976a695ed7a1497b1125e383f358fdc3c" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.680152 4732 scope.go:117] "RemoveContainer" containerID="4591c4a657bf0f19e528acc7cb5586ad0a5dffe6072302f778c6b3daec4b890e" Jan 31 09:21:25 crc kubenswrapper[4732]: E0131 09:21:25.680468 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4591c4a657bf0f19e528acc7cb5586ad0a5dffe6072302f778c6b3daec4b890e\": container with ID starting with 4591c4a657bf0f19e528acc7cb5586ad0a5dffe6072302f778c6b3daec4b890e not found: ID 
does not exist" containerID="4591c4a657bf0f19e528acc7cb5586ad0a5dffe6072302f778c6b3daec4b890e" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.680502 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4591c4a657bf0f19e528acc7cb5586ad0a5dffe6072302f778c6b3daec4b890e"} err="failed to get container status \"4591c4a657bf0f19e528acc7cb5586ad0a5dffe6072302f778c6b3daec4b890e\": rpc error: code = NotFound desc = could not find container \"4591c4a657bf0f19e528acc7cb5586ad0a5dffe6072302f778c6b3daec4b890e\": container with ID starting with 4591c4a657bf0f19e528acc7cb5586ad0a5dffe6072302f778c6b3daec4b890e not found: ID does not exist" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.680526 4732 scope.go:117] "RemoveContainer" containerID="9bd8439c812d4db4031eaf201411d9f03f68a7e4fb620cd284c1316ed57ea8e9" Jan 31 09:21:25 crc kubenswrapper[4732]: E0131 09:21:25.680916 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9bd8439c812d4db4031eaf201411d9f03f68a7e4fb620cd284c1316ed57ea8e9\": container with ID starting with 9bd8439c812d4db4031eaf201411d9f03f68a7e4fb620cd284c1316ed57ea8e9 not found: ID does not exist" containerID="9bd8439c812d4db4031eaf201411d9f03f68a7e4fb620cd284c1316ed57ea8e9" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.680946 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9bd8439c812d4db4031eaf201411d9f03f68a7e4fb620cd284c1316ed57ea8e9"} err="failed to get container status \"9bd8439c812d4db4031eaf201411d9f03f68a7e4fb620cd284c1316ed57ea8e9\": rpc error: code = NotFound desc = could not find container \"9bd8439c812d4db4031eaf201411d9f03f68a7e4fb620cd284c1316ed57ea8e9\": container with ID starting with 9bd8439c812d4db4031eaf201411d9f03f68a7e4fb620cd284c1316ed57ea8e9 not found: ID does not exist" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.680965 4732 scope.go:117] "RemoveContainer" containerID="f7740f3734b2a6d8e15826537feeb63febadea03ccc4bc0d6f70700f6a75626f" Jan 31 09:21:25 crc kubenswrapper[4732]: E0131 09:21:25.681153 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7740f3734b2a6d8e15826537feeb63febadea03ccc4bc0d6f70700f6a75626f\": container with ID starting with f7740f3734b2a6d8e15826537feeb63febadea03ccc4bc0d6f70700f6a75626f not found: ID does not exist" containerID="f7740f3734b2a6d8e15826537feeb63febadea03ccc4bc0d6f70700f6a75626f" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.681187 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7740f3734b2a6d8e15826537feeb63febadea03ccc4bc0d6f70700f6a75626f"} err="failed to get container status \"f7740f3734b2a6d8e15826537feeb63febadea03ccc4bc0d6f70700f6a75626f\": rpc error: code = NotFound desc = could not find container \"f7740f3734b2a6d8e15826537feeb63febadea03ccc4bc0d6f70700f6a75626f\": container with ID starting with f7740f3734b2a6d8e15826537feeb63febadea03ccc4bc0d6f70700f6a75626f not found: ID does not exist" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.681240 4732 scope.go:117] "RemoveContainer" containerID="22026d373e04df27564ec7eb67c62bdb36d5f926aefc6470f05958ca3bfafb6d" Jan 31 09:21:25 crc kubenswrapper[4732]: E0131 09:21:25.681554 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"22026d373e04df27564ec7eb67c62bdb36d5f926aefc6470f05958ca3bfafb6d\": container with ID starting with 22026d373e04df27564ec7eb67c62bdb36d5f926aefc6470f05958ca3bfafb6d not found: ID does not exist" containerID="22026d373e04df27564ec7eb67c62bdb36d5f926aefc6470f05958ca3bfafb6d" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.681641 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22026d373e04df27564ec7eb67c62bdb36d5f926aefc6470f05958ca3bfafb6d"} err="failed to get container status \"22026d373e04df27564ec7eb67c62bdb36d5f926aefc6470f05958ca3bfafb6d\": rpc error: code = NotFound desc = could not find container \"22026d373e04df27564ec7eb67c62bdb36d5f926aefc6470f05958ca3bfafb6d\": container with ID starting with 22026d373e04df27564ec7eb67c62bdb36d5f926aefc6470f05958ca3bfafb6d not found: ID does not exist" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.681728 4732 scope.go:117] "RemoveContainer" containerID="36c3a43f98d1be3fa4067ddd849e1f8adb09f6830fb90e1e103b4fee901b5ddb" Jan 31 09:21:25 crc kubenswrapper[4732]: E0131 09:21:25.681993 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36c3a43f98d1be3fa4067ddd849e1f8adb09f6830fb90e1e103b4fee901b5ddb\": container with ID starting with 36c3a43f98d1be3fa4067ddd849e1f8adb09f6830fb90e1e103b4fee901b5ddb not found: ID does not exist" containerID="36c3a43f98d1be3fa4067ddd849e1f8adb09f6830fb90e1e103b4fee901b5ddb" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.682020 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36c3a43f98d1be3fa4067ddd849e1f8adb09f6830fb90e1e103b4fee901b5ddb"} err="failed to get container status \"36c3a43f98d1be3fa4067ddd849e1f8adb09f6830fb90e1e103b4fee901b5ddb\": rpc error: code = NotFound desc = could not find container \"36c3a43f98d1be3fa4067ddd849e1f8adb09f6830fb90e1e103b4fee901b5ddb\": container with ID starting with 36c3a43f98d1be3fa4067ddd849e1f8adb09f6830fb90e1e103b4fee901b5ddb not found: ID does not exist" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.682037 4732 scope.go:117] "RemoveContainer" containerID="3bedc7b7578ebe9948019c2fe597f44f31339a72681e217b76908ee8888b902f" Jan 31 09:21:25 crc kubenswrapper[4732]: E0131 09:21:25.682230 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3bedc7b7578ebe9948019c2fe597f44f31339a72681e217b76908ee8888b902f\": container with ID starting with 3bedc7b7578ebe9948019c2fe597f44f31339a72681e217b76908ee8888b902f not found: ID does not exist" containerID="3bedc7b7578ebe9948019c2fe597f44f31339a72681e217b76908ee8888b902f" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.682258 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3bedc7b7578ebe9948019c2fe597f44f31339a72681e217b76908ee8888b902f"} err="failed to get container status \"3bedc7b7578ebe9948019c2fe597f44f31339a72681e217b76908ee8888b902f\": rpc error: code = NotFound desc = could not find container \"3bedc7b7578ebe9948019c2fe597f44f31339a72681e217b76908ee8888b902f\": container with ID starting with 3bedc7b7578ebe9948019c2fe597f44f31339a72681e217b76908ee8888b902f not found: ID does not exist" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.682274 4732 scope.go:117] "RemoveContainer" containerID="61eb8b513a113fbabe0d9a8fc9c197f8afac97222ddfaaa712ed87f070443521" Jan 31 09:21:25 crc 
kubenswrapper[4732]: E0131 09:21:25.682480 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61eb8b513a113fbabe0d9a8fc9c197f8afac97222ddfaaa712ed87f070443521\": container with ID starting with 61eb8b513a113fbabe0d9a8fc9c197f8afac97222ddfaaa712ed87f070443521 not found: ID does not exist" containerID="61eb8b513a113fbabe0d9a8fc9c197f8afac97222ddfaaa712ed87f070443521" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.682507 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61eb8b513a113fbabe0d9a8fc9c197f8afac97222ddfaaa712ed87f070443521"} err="failed to get container status \"61eb8b513a113fbabe0d9a8fc9c197f8afac97222ddfaaa712ed87f070443521\": rpc error: code = NotFound desc = could not find container \"61eb8b513a113fbabe0d9a8fc9c197f8afac97222ddfaaa712ed87f070443521\": container with ID starting with 61eb8b513a113fbabe0d9a8fc9c197f8afac97222ddfaaa712ed87f070443521 not found: ID does not exist" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.682525 4732 scope.go:117] "RemoveContainer" containerID="c9a37bdf8b63464e6bd6756605e3b1a2408e5416d00cda661d4d274b38bd7be7" Jan 31 09:21:25 crc kubenswrapper[4732]: E0131 09:21:25.682855 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9a37bdf8b63464e6bd6756605e3b1a2408e5416d00cda661d4d274b38bd7be7\": container with ID starting with c9a37bdf8b63464e6bd6756605e3b1a2408e5416d00cda661d4d274b38bd7be7 not found: ID does not exist" containerID="c9a37bdf8b63464e6bd6756605e3b1a2408e5416d00cda661d4d274b38bd7be7" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.682895 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9a37bdf8b63464e6bd6756605e3b1a2408e5416d00cda661d4d274b38bd7be7"} err="failed to get container status \"c9a37bdf8b63464e6bd6756605e3b1a2408e5416d00cda661d4d274b38bd7be7\": rpc error: code = NotFound desc = could not find container \"c9a37bdf8b63464e6bd6756605e3b1a2408e5416d00cda661d4d274b38bd7be7\": container with ID starting with c9a37bdf8b63464e6bd6756605e3b1a2408e5416d00cda661d4d274b38bd7be7 not found: ID does not exist" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.682952 4732 scope.go:117] "RemoveContainer" containerID="bde9eaf63bb5ab51e11bacddaba56989bae7d6031fa19e2c1b77f92866dadea8" Jan 31 09:21:25 crc kubenswrapper[4732]: E0131 09:21:25.683234 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bde9eaf63bb5ab51e11bacddaba56989bae7d6031fa19e2c1b77f92866dadea8\": container with ID starting with bde9eaf63bb5ab51e11bacddaba56989bae7d6031fa19e2c1b77f92866dadea8 not found: ID does not exist" containerID="bde9eaf63bb5ab51e11bacddaba56989bae7d6031fa19e2c1b77f92866dadea8" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.683262 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bde9eaf63bb5ab51e11bacddaba56989bae7d6031fa19e2c1b77f92866dadea8"} err="failed to get container status \"bde9eaf63bb5ab51e11bacddaba56989bae7d6031fa19e2c1b77f92866dadea8\": rpc error: code = NotFound desc = could not find container \"bde9eaf63bb5ab51e11bacddaba56989bae7d6031fa19e2c1b77f92866dadea8\": container with ID starting with bde9eaf63bb5ab51e11bacddaba56989bae7d6031fa19e2c1b77f92866dadea8 not found: ID does not exist" Jan 31 09:21:25 crc kubenswrapper[4732]: 
I0131 09:21:25.683281 4732 scope.go:117] "RemoveContainer" containerID="c9736add9e20e2e6d821daa6edc69baaef51fa7d36762e678ee4128cf4df9dbd" Jan 31 09:21:25 crc kubenswrapper[4732]: E0131 09:21:25.683474 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9736add9e20e2e6d821daa6edc69baaef51fa7d36762e678ee4128cf4df9dbd\": container with ID starting with c9736add9e20e2e6d821daa6edc69baaef51fa7d36762e678ee4128cf4df9dbd not found: ID does not exist" containerID="c9736add9e20e2e6d821daa6edc69baaef51fa7d36762e678ee4128cf4df9dbd" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.683504 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9736add9e20e2e6d821daa6edc69baaef51fa7d36762e678ee4128cf4df9dbd"} err="failed to get container status \"c9736add9e20e2e6d821daa6edc69baaef51fa7d36762e678ee4128cf4df9dbd\": rpc error: code = NotFound desc = could not find container \"c9736add9e20e2e6d821daa6edc69baaef51fa7d36762e678ee4128cf4df9dbd\": container with ID starting with c9736add9e20e2e6d821daa6edc69baaef51fa7d36762e678ee4128cf4df9dbd not found: ID does not exist" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.683523 4732 scope.go:117] "RemoveContainer" containerID="498871caf245c1aa003224c568a9512c564fb67d15571cad0c2fa636e92058ab" Jan 31 09:21:25 crc kubenswrapper[4732]: E0131 09:21:25.683772 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"498871caf245c1aa003224c568a9512c564fb67d15571cad0c2fa636e92058ab\": container with ID starting with 498871caf245c1aa003224c568a9512c564fb67d15571cad0c2fa636e92058ab not found: ID does not exist" containerID="498871caf245c1aa003224c568a9512c564fb67d15571cad0c2fa636e92058ab" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.683847 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"498871caf245c1aa003224c568a9512c564fb67d15571cad0c2fa636e92058ab"} err="failed to get container status \"498871caf245c1aa003224c568a9512c564fb67d15571cad0c2fa636e92058ab\": rpc error: code = NotFound desc = could not find container \"498871caf245c1aa003224c568a9512c564fb67d15571cad0c2fa636e92058ab\": container with ID starting with 498871caf245c1aa003224c568a9512c564fb67d15571cad0c2fa636e92058ab not found: ID does not exist" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.683904 4732 scope.go:117] "RemoveContainer" containerID="caafb470bfca795c3c7f1107e8a9e97700d33c459a35140d551bfc62e596d5a6" Jan 31 09:21:25 crc kubenswrapper[4732]: E0131 09:21:25.684155 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"caafb470bfca795c3c7f1107e8a9e97700d33c459a35140d551bfc62e596d5a6\": container with ID starting with caafb470bfca795c3c7f1107e8a9e97700d33c459a35140d551bfc62e596d5a6 not found: ID does not exist" containerID="caafb470bfca795c3c7f1107e8a9e97700d33c459a35140d551bfc62e596d5a6" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.684183 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"caafb470bfca795c3c7f1107e8a9e97700d33c459a35140d551bfc62e596d5a6"} err="failed to get container status \"caafb470bfca795c3c7f1107e8a9e97700d33c459a35140d551bfc62e596d5a6\": rpc error: code = NotFound desc = could not find container \"caafb470bfca795c3c7f1107e8a9e97700d33c459a35140d551bfc62e596d5a6\": container 
with ID starting with caafb470bfca795c3c7f1107e8a9e97700d33c459a35140d551bfc62e596d5a6 not found: ID does not exist" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.684199 4732 scope.go:117] "RemoveContainer" containerID="647e5cb59a1fa496ac972ece6ee175c218e028f20f65b8bd02f6b5c3d914e53c" Jan 31 09:21:25 crc kubenswrapper[4732]: E0131 09:21:25.684474 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"647e5cb59a1fa496ac972ece6ee175c218e028f20f65b8bd02f6b5c3d914e53c\": container with ID starting with 647e5cb59a1fa496ac972ece6ee175c218e028f20f65b8bd02f6b5c3d914e53c not found: ID does not exist" containerID="647e5cb59a1fa496ac972ece6ee175c218e028f20f65b8bd02f6b5c3d914e53c" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.684512 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"647e5cb59a1fa496ac972ece6ee175c218e028f20f65b8bd02f6b5c3d914e53c"} err="failed to get container status \"647e5cb59a1fa496ac972ece6ee175c218e028f20f65b8bd02f6b5c3d914e53c\": rpc error: code = NotFound desc = could not find container \"647e5cb59a1fa496ac972ece6ee175c218e028f20f65b8bd02f6b5c3d914e53c\": container with ID starting with 647e5cb59a1fa496ac972ece6ee175c218e028f20f65b8bd02f6b5c3d914e53c not found: ID does not exist" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.684586 4732 scope.go:117] "RemoveContainer" containerID="1a43f64c141cb300a8ec9eb5a477bb5c1933446a32abc193cb5173f92ba21178" Jan 31 09:21:25 crc kubenswrapper[4732]: E0131 09:21:25.685922 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a43f64c141cb300a8ec9eb5a477bb5c1933446a32abc193cb5173f92ba21178\": container with ID starting with 1a43f64c141cb300a8ec9eb5a477bb5c1933446a32abc193cb5173f92ba21178 not found: ID does not exist" containerID="1a43f64c141cb300a8ec9eb5a477bb5c1933446a32abc193cb5173f92ba21178" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.685950 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a43f64c141cb300a8ec9eb5a477bb5c1933446a32abc193cb5173f92ba21178"} err="failed to get container status \"1a43f64c141cb300a8ec9eb5a477bb5c1933446a32abc193cb5173f92ba21178\": rpc error: code = NotFound desc = could not find container \"1a43f64c141cb300a8ec9eb5a477bb5c1933446a32abc193cb5173f92ba21178\": container with ID starting with 1a43f64c141cb300a8ec9eb5a477bb5c1933446a32abc193cb5173f92ba21178 not found: ID does not exist" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.685971 4732 scope.go:117] "RemoveContainer" containerID="a00d79844fd0811e549149a48676d54976a695ed7a1497b1125e383f358fdc3c" Jan 31 09:21:25 crc kubenswrapper[4732]: E0131 09:21:25.686161 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a00d79844fd0811e549149a48676d54976a695ed7a1497b1125e383f358fdc3c\": container with ID starting with a00d79844fd0811e549149a48676d54976a695ed7a1497b1125e383f358fdc3c not found: ID does not exist" containerID="a00d79844fd0811e549149a48676d54976a695ed7a1497b1125e383f358fdc3c" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.686185 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a00d79844fd0811e549149a48676d54976a695ed7a1497b1125e383f358fdc3c"} err="failed to get container status 
\"a00d79844fd0811e549149a48676d54976a695ed7a1497b1125e383f358fdc3c\": rpc error: code = NotFound desc = could not find container \"a00d79844fd0811e549149a48676d54976a695ed7a1497b1125e383f358fdc3c\": container with ID starting with a00d79844fd0811e549149a48676d54976a695ed7a1497b1125e383f358fdc3c not found: ID does not exist" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.686200 4732 scope.go:117] "RemoveContainer" containerID="261d59a3c1b5662c1245d9e0a26d6fdf5a756126f782cf23384b915262209828" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.704490 4732 scope.go:117] "RemoveContainer" containerID="01d34a0e2eb5b105a83439a771538234555a443b0c13ee4cb087d83ae6e5a172" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.720123 4732 scope.go:117] "RemoveContainer" containerID="caaaa31e0459d9ecc2e8474a50fcbdb26784a11af19cecd7546e523bc30b1bb1" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.737404 4732 scope.go:117] "RemoveContainer" containerID="3c487459ef32558f2157dbf1c508505bc4dd091b7590c38ba37b19a62a7f80b1" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.772353 4732 scope.go:117] "RemoveContainer" containerID="a8e090887d88b45d44e7f952ad72621d08fce059d8e6698d9ad0a38d3082acc9" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.785539 4732 scope.go:117] "RemoveContainer" containerID="3b885dbb98c02c11a228e346c436a702bd2867b63763347fde6a9f9094c1d016" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.802030 4732 scope.go:117] "RemoveContainer" containerID="463912ed98b787656c483ddcf4b73e963647c6d5df261cfe3fad78758d6d1b9d" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.820316 4732 scope.go:117] "RemoveContainer" containerID="7ce880a4ac662620fb475dd18995992d32b8be418497d54beba8a09c1335c1aa" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.833523 4732 scope.go:117] "RemoveContainer" containerID="54fada92647ba12febbb920093f9bc8e7464da78204240a5a00d06baf380ab69" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.847543 4732 scope.go:117] "RemoveContainer" containerID="fd1347f500390657be8ce2ee2c537ab8800d5072a87c360ae87e57f1fb6a1882" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.862161 4732 scope.go:117] "RemoveContainer" containerID="672f5feadfa214c6a0a943e614b3f6be2205caf0cfad789bc48eecd5126064c9" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.881779 4732 scope.go:117] "RemoveContainer" containerID="8fa21222e4cd9c26bf7a8699721788dc5f3bcbad179ebfca9e1ff5dd4f7eca9f" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.903437 4732 scope.go:117] "RemoveContainer" containerID="72926ae802ba11d13ab77c79539bfcc94151936d207f632232dd7d1f41349d6d" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.926709 4732 scope.go:117] "RemoveContainer" containerID="fa17ab717c5754facbf3da31cb4afd3477236bd426c65c9122642f27b5886fb9" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.944872 4732 scope.go:117] "RemoveContainer" containerID="caac1f6da96d02d4cab51830f080cd652323af90c1836570aef955dde0207cff" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.960033 4732 scope.go:117] "RemoveContainer" containerID="261d59a3c1b5662c1245d9e0a26d6fdf5a756126f782cf23384b915262209828" Jan 31 09:21:25 crc kubenswrapper[4732]: E0131 09:21:25.960385 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"261d59a3c1b5662c1245d9e0a26d6fdf5a756126f782cf23384b915262209828\": container with ID starting with 261d59a3c1b5662c1245d9e0a26d6fdf5a756126f782cf23384b915262209828 not 
found: ID does not exist" containerID="261d59a3c1b5662c1245d9e0a26d6fdf5a756126f782cf23384b915262209828" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.960420 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"261d59a3c1b5662c1245d9e0a26d6fdf5a756126f782cf23384b915262209828"} err="failed to get container status \"261d59a3c1b5662c1245d9e0a26d6fdf5a756126f782cf23384b915262209828\": rpc error: code = NotFound desc = could not find container \"261d59a3c1b5662c1245d9e0a26d6fdf5a756126f782cf23384b915262209828\": container with ID starting with 261d59a3c1b5662c1245d9e0a26d6fdf5a756126f782cf23384b915262209828 not found: ID does not exist" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.960443 4732 scope.go:117] "RemoveContainer" containerID="01d34a0e2eb5b105a83439a771538234555a443b0c13ee4cb087d83ae6e5a172" Jan 31 09:21:25 crc kubenswrapper[4732]: E0131 09:21:25.960851 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01d34a0e2eb5b105a83439a771538234555a443b0c13ee4cb087d83ae6e5a172\": container with ID starting with 01d34a0e2eb5b105a83439a771538234555a443b0c13ee4cb087d83ae6e5a172 not found: ID does not exist" containerID="01d34a0e2eb5b105a83439a771538234555a443b0c13ee4cb087d83ae6e5a172" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.960875 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01d34a0e2eb5b105a83439a771538234555a443b0c13ee4cb087d83ae6e5a172"} err="failed to get container status \"01d34a0e2eb5b105a83439a771538234555a443b0c13ee4cb087d83ae6e5a172\": rpc error: code = NotFound desc = could not find container \"01d34a0e2eb5b105a83439a771538234555a443b0c13ee4cb087d83ae6e5a172\": container with ID starting with 01d34a0e2eb5b105a83439a771538234555a443b0c13ee4cb087d83ae6e5a172 not found: ID does not exist" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.960890 4732 scope.go:117] "RemoveContainer" containerID="caaaa31e0459d9ecc2e8474a50fcbdb26784a11af19cecd7546e523bc30b1bb1" Jan 31 09:21:25 crc kubenswrapper[4732]: E0131 09:21:25.961151 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"caaaa31e0459d9ecc2e8474a50fcbdb26784a11af19cecd7546e523bc30b1bb1\": container with ID starting with caaaa31e0459d9ecc2e8474a50fcbdb26784a11af19cecd7546e523bc30b1bb1 not found: ID does not exist" containerID="caaaa31e0459d9ecc2e8474a50fcbdb26784a11af19cecd7546e523bc30b1bb1" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.961173 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"caaaa31e0459d9ecc2e8474a50fcbdb26784a11af19cecd7546e523bc30b1bb1"} err="failed to get container status \"caaaa31e0459d9ecc2e8474a50fcbdb26784a11af19cecd7546e523bc30b1bb1\": rpc error: code = NotFound desc = could not find container \"caaaa31e0459d9ecc2e8474a50fcbdb26784a11af19cecd7546e523bc30b1bb1\": container with ID starting with caaaa31e0459d9ecc2e8474a50fcbdb26784a11af19cecd7546e523bc30b1bb1 not found: ID does not exist" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.961188 4732 scope.go:117] "RemoveContainer" containerID="3c487459ef32558f2157dbf1c508505bc4dd091b7590c38ba37b19a62a7f80b1" Jan 31 09:21:25 crc kubenswrapper[4732]: E0131 09:21:25.961575 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"3c487459ef32558f2157dbf1c508505bc4dd091b7590c38ba37b19a62a7f80b1\": container with ID starting with 3c487459ef32558f2157dbf1c508505bc4dd091b7590c38ba37b19a62a7f80b1 not found: ID does not exist" containerID="3c487459ef32558f2157dbf1c508505bc4dd091b7590c38ba37b19a62a7f80b1" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.961641 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c487459ef32558f2157dbf1c508505bc4dd091b7590c38ba37b19a62a7f80b1"} err="failed to get container status \"3c487459ef32558f2157dbf1c508505bc4dd091b7590c38ba37b19a62a7f80b1\": rpc error: code = NotFound desc = could not find container \"3c487459ef32558f2157dbf1c508505bc4dd091b7590c38ba37b19a62a7f80b1\": container with ID starting with 3c487459ef32558f2157dbf1c508505bc4dd091b7590c38ba37b19a62a7f80b1 not found: ID does not exist" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.961706 4732 scope.go:117] "RemoveContainer" containerID="a8e090887d88b45d44e7f952ad72621d08fce059d8e6698d9ad0a38d3082acc9" Jan 31 09:21:25 crc kubenswrapper[4732]: E0131 09:21:25.961977 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8e090887d88b45d44e7f952ad72621d08fce059d8e6698d9ad0a38d3082acc9\": container with ID starting with a8e090887d88b45d44e7f952ad72621d08fce059d8e6698d9ad0a38d3082acc9 not found: ID does not exist" containerID="a8e090887d88b45d44e7f952ad72621d08fce059d8e6698d9ad0a38d3082acc9" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.961999 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8e090887d88b45d44e7f952ad72621d08fce059d8e6698d9ad0a38d3082acc9"} err="failed to get container status \"a8e090887d88b45d44e7f952ad72621d08fce059d8e6698d9ad0a38d3082acc9\": rpc error: code = NotFound desc = could not find container \"a8e090887d88b45d44e7f952ad72621d08fce059d8e6698d9ad0a38d3082acc9\": container with ID starting with a8e090887d88b45d44e7f952ad72621d08fce059d8e6698d9ad0a38d3082acc9 not found: ID does not exist" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.962014 4732 scope.go:117] "RemoveContainer" containerID="3b885dbb98c02c11a228e346c436a702bd2867b63763347fde6a9f9094c1d016" Jan 31 09:21:25 crc kubenswrapper[4732]: E0131 09:21:25.962469 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b885dbb98c02c11a228e346c436a702bd2867b63763347fde6a9f9094c1d016\": container with ID starting with 3b885dbb98c02c11a228e346c436a702bd2867b63763347fde6a9f9094c1d016 not found: ID does not exist" containerID="3b885dbb98c02c11a228e346c436a702bd2867b63763347fde6a9f9094c1d016" Jan 31 09:21:25 crc kubenswrapper[4732]: I0131 09:21:25.962493 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b885dbb98c02c11a228e346c436a702bd2867b63763347fde6a9f9094c1d016"} err="failed to get container status \"3b885dbb98c02c11a228e346c436a702bd2867b63763347fde6a9f9094c1d016\": rpc error: code = NotFound desc = could not find container \"3b885dbb98c02c11a228e346c436a702bd2867b63763347fde6a9f9094c1d016\": container with ID starting with 3b885dbb98c02c11a228e346c436a702bd2867b63763347fde6a9f9094c1d016 not found: ID does not exist" Jan 31 09:21:26 crc kubenswrapper[4732]: I0131 09:21:26.551064 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18b68f5e-a1b4-4f52-9a4e-5967735ec105" 
path="/var/lib/kubelet/pods/18b68f5e-a1b4-4f52-9a4e-5967735ec105/volumes" Jan 31 09:21:26 crc kubenswrapper[4732]: I0131 09:21:26.553297 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea3117d7-0038-4ca5-bee5-ae76db9a12eb" path="/var/lib/kubelet/pods/ea3117d7-0038-4ca5-bee5-ae76db9a12eb/volumes" Jan 31 09:21:26 crc kubenswrapper[4732]: I0131 09:21:26.555107 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb04e24b-fc92-4f2e-abcb-fa46706f699a" path="/var/lib/kubelet/pods/eb04e24b-fc92-4f2e-abcb-fa46706f699a/volumes" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.128304 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-storage-0"] Jan 31 09:21:28 crc kubenswrapper[4732]: E0131 09:21:28.128856 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb04e24b-fc92-4f2e-abcb-fa46706f699a" containerName="container-replicator" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.128873 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb04e24b-fc92-4f2e-abcb-fa46706f699a" containerName="container-replicator" Jan 31 09:21:28 crc kubenswrapper[4732]: E0131 09:21:28.128885 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb04e24b-fc92-4f2e-abcb-fa46706f699a" containerName="container-updater" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.128892 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb04e24b-fc92-4f2e-abcb-fa46706f699a" containerName="container-updater" Jan 31 09:21:28 crc kubenswrapper[4732]: E0131 09:21:28.128909 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18b68f5e-a1b4-4f52-9a4e-5967735ec105" containerName="account-auditor" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.128915 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="18b68f5e-a1b4-4f52-9a4e-5967735ec105" containerName="account-auditor" Jan 31 09:21:28 crc kubenswrapper[4732]: E0131 09:21:28.128927 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb04e24b-fc92-4f2e-abcb-fa46706f699a" containerName="account-reaper" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.128935 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb04e24b-fc92-4f2e-abcb-fa46706f699a" containerName="account-reaper" Jan 31 09:21:28 crc kubenswrapper[4732]: E0131 09:21:28.128948 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea3117d7-0038-4ca5-bee5-ae76db9a12eb" containerName="account-replicator" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.128955 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea3117d7-0038-4ca5-bee5-ae76db9a12eb" containerName="account-replicator" Jan 31 09:21:28 crc kubenswrapper[4732]: E0131 09:21:28.128964 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb04e24b-fc92-4f2e-abcb-fa46706f699a" containerName="object-expirer" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.128972 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb04e24b-fc92-4f2e-abcb-fa46706f699a" containerName="object-expirer" Jan 31 09:21:28 crc kubenswrapper[4732]: E0131 09:21:28.128985 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea3117d7-0038-4ca5-bee5-ae76db9a12eb" containerName="container-auditor" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.128992 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea3117d7-0038-4ca5-bee5-ae76db9a12eb" containerName="container-auditor" Jan 31 09:21:28 crc 
kubenswrapper[4732]: E0131 09:21:28.129001 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea3117d7-0038-4ca5-bee5-ae76db9a12eb" containerName="object-server" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.129007 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea3117d7-0038-4ca5-bee5-ae76db9a12eb" containerName="object-server" Jan 31 09:21:28 crc kubenswrapper[4732]: E0131 09:21:28.129018 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea3117d7-0038-4ca5-bee5-ae76db9a12eb" containerName="account-auditor" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.129025 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea3117d7-0038-4ca5-bee5-ae76db9a12eb" containerName="account-auditor" Jan 31 09:21:28 crc kubenswrapper[4732]: E0131 09:21:28.129035 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb04e24b-fc92-4f2e-abcb-fa46706f699a" containerName="object-auditor" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.129041 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb04e24b-fc92-4f2e-abcb-fa46706f699a" containerName="object-auditor" Jan 31 09:21:28 crc kubenswrapper[4732]: E0131 09:21:28.129053 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18b68f5e-a1b4-4f52-9a4e-5967735ec105" containerName="object-replicator" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.129060 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="18b68f5e-a1b4-4f52-9a4e-5967735ec105" containerName="object-replicator" Jan 31 09:21:28 crc kubenswrapper[4732]: E0131 09:21:28.129071 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18b68f5e-a1b4-4f52-9a4e-5967735ec105" containerName="swift-recon-cron" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.129078 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="18b68f5e-a1b4-4f52-9a4e-5967735ec105" containerName="swift-recon-cron" Jan 31 09:21:28 crc kubenswrapper[4732]: E0131 09:21:28.129089 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea3117d7-0038-4ca5-bee5-ae76db9a12eb" containerName="account-reaper" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.129095 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea3117d7-0038-4ca5-bee5-ae76db9a12eb" containerName="account-reaper" Jan 31 09:21:28 crc kubenswrapper[4732]: E0131 09:21:28.129102 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb04e24b-fc92-4f2e-abcb-fa46706f699a" containerName="account-server" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.129108 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb04e24b-fc92-4f2e-abcb-fa46706f699a" containerName="account-server" Jan 31 09:21:28 crc kubenswrapper[4732]: E0131 09:21:28.129114 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18b68f5e-a1b4-4f52-9a4e-5967735ec105" containerName="rsync" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.129120 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="18b68f5e-a1b4-4f52-9a4e-5967735ec105" containerName="rsync" Jan 31 09:21:28 crc kubenswrapper[4732]: E0131 09:21:28.129128 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cb5e63b-882d-4388-abb1-130923832c9f" containerName="proxy-server" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.129134 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cb5e63b-882d-4388-abb1-130923832c9f" containerName="proxy-server" Jan 31 09:21:28 crc 
kubenswrapper[4732]: E0131 09:21:28.129142 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18b68f5e-a1b4-4f52-9a4e-5967735ec105" containerName="account-reaper" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.129148 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="18b68f5e-a1b4-4f52-9a4e-5967735ec105" containerName="account-reaper" Jan 31 09:21:28 crc kubenswrapper[4732]: E0131 09:21:28.129158 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb04e24b-fc92-4f2e-abcb-fa46706f699a" containerName="object-replicator" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.129164 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb04e24b-fc92-4f2e-abcb-fa46706f699a" containerName="object-replicator" Jan 31 09:21:28 crc kubenswrapper[4732]: E0131 09:21:28.129172 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cb5e63b-882d-4388-abb1-130923832c9f" containerName="proxy-httpd" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.129178 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cb5e63b-882d-4388-abb1-130923832c9f" containerName="proxy-httpd" Jan 31 09:21:28 crc kubenswrapper[4732]: E0131 09:21:28.129185 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18b68f5e-a1b4-4f52-9a4e-5967735ec105" containerName="container-auditor" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.129190 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="18b68f5e-a1b4-4f52-9a4e-5967735ec105" containerName="container-auditor" Jan 31 09:21:28 crc kubenswrapper[4732]: E0131 09:21:28.129200 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea3117d7-0038-4ca5-bee5-ae76db9a12eb" containerName="container-updater" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.129206 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea3117d7-0038-4ca5-bee5-ae76db9a12eb" containerName="container-updater" Jan 31 09:21:28 crc kubenswrapper[4732]: E0131 09:21:28.129216 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18b68f5e-a1b4-4f52-9a4e-5967735ec105" containerName="container-updater" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.129222 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="18b68f5e-a1b4-4f52-9a4e-5967735ec105" containerName="container-updater" Jan 31 09:21:28 crc kubenswrapper[4732]: E0131 09:21:28.129228 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa92affd-0106-4b01-b96c-8f2b0459ee3a" containerName="swift-ring-rebalance" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.129233 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa92affd-0106-4b01-b96c-8f2b0459ee3a" containerName="swift-ring-rebalance" Jan 31 09:21:28 crc kubenswrapper[4732]: E0131 09:21:28.129241 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18b68f5e-a1b4-4f52-9a4e-5967735ec105" containerName="container-replicator" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.129246 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="18b68f5e-a1b4-4f52-9a4e-5967735ec105" containerName="container-replicator" Jan 31 09:21:28 crc kubenswrapper[4732]: E0131 09:21:28.129255 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb04e24b-fc92-4f2e-abcb-fa46706f699a" containerName="object-updater" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.129260 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb04e24b-fc92-4f2e-abcb-fa46706f699a" 
containerName="object-updater" Jan 31 09:21:28 crc kubenswrapper[4732]: E0131 09:21:28.129268 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18b68f5e-a1b4-4f52-9a4e-5967735ec105" containerName="container-server" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.129274 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="18b68f5e-a1b4-4f52-9a4e-5967735ec105" containerName="container-server" Jan 31 09:21:28 crc kubenswrapper[4732]: E0131 09:21:28.129282 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea3117d7-0038-4ca5-bee5-ae76db9a12eb" containerName="object-updater" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.129287 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea3117d7-0038-4ca5-bee5-ae76db9a12eb" containerName="object-updater" Jan 31 09:21:28 crc kubenswrapper[4732]: E0131 09:21:28.129295 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea3117d7-0038-4ca5-bee5-ae76db9a12eb" containerName="rsync" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.129300 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea3117d7-0038-4ca5-bee5-ae76db9a12eb" containerName="rsync" Jan 31 09:21:28 crc kubenswrapper[4732]: E0131 09:21:28.129309 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea3117d7-0038-4ca5-bee5-ae76db9a12eb" containerName="container-replicator" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.129314 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea3117d7-0038-4ca5-bee5-ae76db9a12eb" containerName="container-replicator" Jan 31 09:21:28 crc kubenswrapper[4732]: E0131 09:21:28.129325 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea3117d7-0038-4ca5-bee5-ae76db9a12eb" containerName="object-auditor" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.129331 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea3117d7-0038-4ca5-bee5-ae76db9a12eb" containerName="object-auditor" Jan 31 09:21:28 crc kubenswrapper[4732]: E0131 09:21:28.129340 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb04e24b-fc92-4f2e-abcb-fa46706f699a" containerName="container-server" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.129346 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb04e24b-fc92-4f2e-abcb-fa46706f699a" containerName="container-server" Jan 31 09:21:28 crc kubenswrapper[4732]: E0131 09:21:28.129353 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb04e24b-fc92-4f2e-abcb-fa46706f699a" containerName="container-auditor" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.129359 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb04e24b-fc92-4f2e-abcb-fa46706f699a" containerName="container-auditor" Jan 31 09:21:28 crc kubenswrapper[4732]: E0131 09:21:28.129368 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb04e24b-fc92-4f2e-abcb-fa46706f699a" containerName="account-auditor" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.129376 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb04e24b-fc92-4f2e-abcb-fa46706f699a" containerName="account-auditor" Jan 31 09:21:28 crc kubenswrapper[4732]: E0131 09:21:28.129388 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18b68f5e-a1b4-4f52-9a4e-5967735ec105" containerName="account-server" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.129399 4732 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="18b68f5e-a1b4-4f52-9a4e-5967735ec105" containerName="account-server" Jan 31 09:21:28 crc kubenswrapper[4732]: E0131 09:21:28.129408 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18b68f5e-a1b4-4f52-9a4e-5967735ec105" containerName="object-updater" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.129413 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="18b68f5e-a1b4-4f52-9a4e-5967735ec105" containerName="object-updater" Jan 31 09:21:28 crc kubenswrapper[4732]: E0131 09:21:28.129421 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18b68f5e-a1b4-4f52-9a4e-5967735ec105" containerName="object-server" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.129426 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="18b68f5e-a1b4-4f52-9a4e-5967735ec105" containerName="object-server" Jan 31 09:21:28 crc kubenswrapper[4732]: E0131 09:21:28.129435 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18b68f5e-a1b4-4f52-9a4e-5967735ec105" containerName="account-replicator" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.129453 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="18b68f5e-a1b4-4f52-9a4e-5967735ec105" containerName="account-replicator" Jan 31 09:21:28 crc kubenswrapper[4732]: E0131 09:21:28.129459 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea3117d7-0038-4ca5-bee5-ae76db9a12eb" containerName="object-replicator" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.129465 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea3117d7-0038-4ca5-bee5-ae76db9a12eb" containerName="object-replicator" Jan 31 09:21:28 crc kubenswrapper[4732]: E0131 09:21:28.129474 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb04e24b-fc92-4f2e-abcb-fa46706f699a" containerName="account-replicator" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.129480 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb04e24b-fc92-4f2e-abcb-fa46706f699a" containerName="account-replicator" Jan 31 09:21:28 crc kubenswrapper[4732]: E0131 09:21:28.129487 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18b68f5e-a1b4-4f52-9a4e-5967735ec105" containerName="object-auditor" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.129493 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="18b68f5e-a1b4-4f52-9a4e-5967735ec105" containerName="object-auditor" Jan 31 09:21:28 crc kubenswrapper[4732]: E0131 09:21:28.129506 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb04e24b-fc92-4f2e-abcb-fa46706f699a" containerName="object-server" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.129517 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb04e24b-fc92-4f2e-abcb-fa46706f699a" containerName="object-server" Jan 31 09:21:28 crc kubenswrapper[4732]: E0131 09:21:28.129527 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea3117d7-0038-4ca5-bee5-ae76db9a12eb" containerName="account-server" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.129534 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea3117d7-0038-4ca5-bee5-ae76db9a12eb" containerName="account-server" Jan 31 09:21:28 crc kubenswrapper[4732]: E0131 09:21:28.129544 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb04e24b-fc92-4f2e-abcb-fa46706f699a" containerName="swift-recon-cron" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.129552 4732 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="eb04e24b-fc92-4f2e-abcb-fa46706f699a" containerName="swift-recon-cron" Jan 31 09:21:28 crc kubenswrapper[4732]: E0131 09:21:28.129566 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb04e24b-fc92-4f2e-abcb-fa46706f699a" containerName="rsync" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.129574 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb04e24b-fc92-4f2e-abcb-fa46706f699a" containerName="rsync" Jan 31 09:21:28 crc kubenswrapper[4732]: E0131 09:21:28.129582 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea3117d7-0038-4ca5-bee5-ae76db9a12eb" containerName="swift-recon-cron" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.129587 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea3117d7-0038-4ca5-bee5-ae76db9a12eb" containerName="swift-recon-cron" Jan 31 09:21:28 crc kubenswrapper[4732]: E0131 09:21:28.129598 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea3117d7-0038-4ca5-bee5-ae76db9a12eb" containerName="object-expirer" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.129603 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea3117d7-0038-4ca5-bee5-ae76db9a12eb" containerName="object-expirer" Jan 31 09:21:28 crc kubenswrapper[4732]: E0131 09:21:28.129613 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea3117d7-0038-4ca5-bee5-ae76db9a12eb" containerName="container-server" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.129618 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea3117d7-0038-4ca5-bee5-ae76db9a12eb" containerName="container-server" Jan 31 09:21:28 crc kubenswrapper[4732]: E0131 09:21:28.129626 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18b68f5e-a1b4-4f52-9a4e-5967735ec105" containerName="object-expirer" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.129631 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="18b68f5e-a1b4-4f52-9a4e-5967735ec105" containerName="object-expirer" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.129815 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb04e24b-fc92-4f2e-abcb-fa46706f699a" containerName="object-replicator" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.129824 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea3117d7-0038-4ca5-bee5-ae76db9a12eb" containerName="object-auditor" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.129829 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb04e24b-fc92-4f2e-abcb-fa46706f699a" containerName="container-replicator" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.129837 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb04e24b-fc92-4f2e-abcb-fa46706f699a" containerName="rsync" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.129843 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea3117d7-0038-4ca5-bee5-ae76db9a12eb" containerName="account-server" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.129850 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea3117d7-0038-4ca5-bee5-ae76db9a12eb" containerName="account-auditor" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.129857 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="18b68f5e-a1b4-4f52-9a4e-5967735ec105" containerName="container-replicator" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 
09:21:28.129865 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="18b68f5e-a1b4-4f52-9a4e-5967735ec105" containerName="container-auditor" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.129873 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="18b68f5e-a1b4-4f52-9a4e-5967735ec105" containerName="account-server" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.129879 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea3117d7-0038-4ca5-bee5-ae76db9a12eb" containerName="swift-recon-cron" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.129892 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="18b68f5e-a1b4-4f52-9a4e-5967735ec105" containerName="container-server" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.129898 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb04e24b-fc92-4f2e-abcb-fa46706f699a" containerName="object-server" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.129907 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cb5e63b-882d-4388-abb1-130923832c9f" containerName="proxy-httpd" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.129915 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cb5e63b-882d-4388-abb1-130923832c9f" containerName="proxy-server" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.129924 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb04e24b-fc92-4f2e-abcb-fa46706f699a" containerName="container-auditor" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.129933 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea3117d7-0038-4ca5-bee5-ae76db9a12eb" containerName="rsync" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.129941 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="18b68f5e-a1b4-4f52-9a4e-5967735ec105" containerName="object-auditor" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.129946 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="18b68f5e-a1b4-4f52-9a4e-5967735ec105" containerName="object-server" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.129952 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="18b68f5e-a1b4-4f52-9a4e-5967735ec105" containerName="object-updater" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.129957 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb04e24b-fc92-4f2e-abcb-fa46706f699a" containerName="account-replicator" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.129967 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea3117d7-0038-4ca5-bee5-ae76db9a12eb" containerName="object-updater" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.129975 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa92affd-0106-4b01-b96c-8f2b0459ee3a" containerName="swift-ring-rebalance" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.129982 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea3117d7-0038-4ca5-bee5-ae76db9a12eb" containerName="container-updater" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.129987 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb04e24b-fc92-4f2e-abcb-fa46706f699a" containerName="container-server" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.129995 4732 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="eb04e24b-fc92-4f2e-abcb-fa46706f699a" containerName="object-expirer" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.130004 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea3117d7-0038-4ca5-bee5-ae76db9a12eb" containerName="container-replicator" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.130011 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="18b68f5e-a1b4-4f52-9a4e-5967735ec105" containerName="container-updater" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.130020 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea3117d7-0038-4ca5-bee5-ae76db9a12eb" containerName="account-reaper" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.130028 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea3117d7-0038-4ca5-bee5-ae76db9a12eb" containerName="object-replicator" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.130034 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb04e24b-fc92-4f2e-abcb-fa46706f699a" containerName="container-updater" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.130042 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea3117d7-0038-4ca5-bee5-ae76db9a12eb" containerName="object-expirer" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.130051 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="18b68f5e-a1b4-4f52-9a4e-5967735ec105" containerName="account-auditor" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.130056 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="18b68f5e-a1b4-4f52-9a4e-5967735ec105" containerName="rsync" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.130063 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea3117d7-0038-4ca5-bee5-ae76db9a12eb" containerName="container-server" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.130071 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb04e24b-fc92-4f2e-abcb-fa46706f699a" containerName="swift-recon-cron" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.130080 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb04e24b-fc92-4f2e-abcb-fa46706f699a" containerName="account-reaper" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.130085 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb04e24b-fc92-4f2e-abcb-fa46706f699a" containerName="object-updater" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.130091 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea3117d7-0038-4ca5-bee5-ae76db9a12eb" containerName="account-replicator" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.130101 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb04e24b-fc92-4f2e-abcb-fa46706f699a" containerName="account-server" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.130108 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="18b68f5e-a1b4-4f52-9a4e-5967735ec105" containerName="account-reaper" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.130116 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea3117d7-0038-4ca5-bee5-ae76db9a12eb" containerName="object-server" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.130124 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea3117d7-0038-4ca5-bee5-ae76db9a12eb" containerName="container-auditor" Jan 31 09:21:28 crc 
kubenswrapper[4732]: I0131 09:21:28.130130 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="18b68f5e-a1b4-4f52-9a4e-5967735ec105" containerName="swift-recon-cron" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.130136 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="18b68f5e-a1b4-4f52-9a4e-5967735ec105" containerName="account-replicator" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.130142 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="18b68f5e-a1b4-4f52-9a4e-5967735ec105" containerName="object-replicator" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.130150 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb04e24b-fc92-4f2e-abcb-fa46706f699a" containerName="object-auditor" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.130159 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb04e24b-fc92-4f2e-abcb-fa46706f699a" containerName="account-auditor" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.130166 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="18b68f5e-a1b4-4f52-9a4e-5967735ec105" containerName="object-expirer" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.138791 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.144356 4732 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"swift-conf" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.146126 4732 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"swift-swift-dockercfg-z8hn5" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.146404 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-files" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.146477 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-storage-config-data" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.154029 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-proxy-77c98d654c-ftt6w"] Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.157344 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-proxy-77c98d654c-ftt6w" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.159648 4732 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"swift-proxy-config-data" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.173143 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-proxy-77c98d654c-ftt6w"] Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.178567 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-storage-0"] Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.297377 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4d5994e0-d411-4d47-bcbb-1a12020906ce-run-httpd\") pod \"swift-proxy-77c98d654c-ftt6w\" (UID: \"4d5994e0-d411-4d47-bcbb-1a12020906ce\") " pod="swift-kuttl-tests/swift-proxy-77c98d654c-ftt6w" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.297424 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d5994e0-d411-4d47-bcbb-1a12020906ce-config-data\") pod \"swift-proxy-77c98d654c-ftt6w\" (UID: \"4d5994e0-d411-4d47-bcbb-1a12020906ce\") " pod="swift-kuttl-tests/swift-proxy-77c98d654c-ftt6w" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.297444 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/2d99525a-cb49-44dd-82c0-0bf1641ec2b5-cache\") pod \"swift-storage-0\" (UID: \"2d99525a-cb49-44dd-82c0-0bf1641ec2b5\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.297462 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qccj\" (UniqueName: \"kubernetes.io/projected/2d99525a-cb49-44dd-82c0-0bf1641ec2b5-kube-api-access-8qccj\") pod \"swift-storage-0\" (UID: \"2d99525a-cb49-44dd-82c0-0bf1641ec2b5\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.297617 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4d5994e0-d411-4d47-bcbb-1a12020906ce-etc-swift\") pod \"swift-proxy-77c98d654c-ftt6w\" (UID: \"4d5994e0-d411-4d47-bcbb-1a12020906ce\") " pod="swift-kuttl-tests/swift-proxy-77c98d654c-ftt6w" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.297694 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/2d99525a-cb49-44dd-82c0-0bf1641ec2b5-lock\") pod \"swift-storage-0\" (UID: \"2d99525a-cb49-44dd-82c0-0bf1641ec2b5\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.297763 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njt7j\" (UniqueName: \"kubernetes.io/projected/4d5994e0-d411-4d47-bcbb-1a12020906ce-kube-api-access-njt7j\") pod \"swift-proxy-77c98d654c-ftt6w\" (UID: \"4d5994e0-d411-4d47-bcbb-1a12020906ce\") " pod="swift-kuttl-tests/swift-proxy-77c98d654c-ftt6w" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.297790 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"swift-storage-0\" (UID: \"2d99525a-cb49-44dd-82c0-0bf1641ec2b5\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.297826 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2d99525a-cb49-44dd-82c0-0bf1641ec2b5-etc-swift\") pod \"swift-storage-0\" (UID: \"2d99525a-cb49-44dd-82c0-0bf1641ec2b5\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.297840 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4d5994e0-d411-4d47-bcbb-1a12020906ce-log-httpd\") pod \"swift-proxy-77c98d654c-ftt6w\" (UID: \"4d5994e0-d411-4d47-bcbb-1a12020906ce\") " pod="swift-kuttl-tests/swift-proxy-77c98d654c-ftt6w" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.399347 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4d5994e0-d411-4d47-bcbb-1a12020906ce-etc-swift\") pod \"swift-proxy-77c98d654c-ftt6w\" (UID: \"4d5994e0-d411-4d47-bcbb-1a12020906ce\") " pod="swift-kuttl-tests/swift-proxy-77c98d654c-ftt6w" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.399410 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/2d99525a-cb49-44dd-82c0-0bf1641ec2b5-lock\") pod \"swift-storage-0\" (UID: \"2d99525a-cb49-44dd-82c0-0bf1641ec2b5\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.399447 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njt7j\" (UniqueName: \"kubernetes.io/projected/4d5994e0-d411-4d47-bcbb-1a12020906ce-kube-api-access-njt7j\") pod \"swift-proxy-77c98d654c-ftt6w\" (UID: \"4d5994e0-d411-4d47-bcbb-1a12020906ce\") " pod="swift-kuttl-tests/swift-proxy-77c98d654c-ftt6w" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.399465 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"swift-storage-0\" (UID: \"2d99525a-cb49-44dd-82c0-0bf1641ec2b5\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.399487 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2d99525a-cb49-44dd-82c0-0bf1641ec2b5-etc-swift\") pod \"swift-storage-0\" (UID: \"2d99525a-cb49-44dd-82c0-0bf1641ec2b5\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.399502 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4d5994e0-d411-4d47-bcbb-1a12020906ce-log-httpd\") pod \"swift-proxy-77c98d654c-ftt6w\" (UID: \"4d5994e0-d411-4d47-bcbb-1a12020906ce\") " pod="swift-kuttl-tests/swift-proxy-77c98d654c-ftt6w" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.399534 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4d5994e0-d411-4d47-bcbb-1a12020906ce-run-httpd\") pod \"swift-proxy-77c98d654c-ftt6w\" (UID: 
\"4d5994e0-d411-4d47-bcbb-1a12020906ce\") " pod="swift-kuttl-tests/swift-proxy-77c98d654c-ftt6w" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.399557 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d5994e0-d411-4d47-bcbb-1a12020906ce-config-data\") pod \"swift-proxy-77c98d654c-ftt6w\" (UID: \"4d5994e0-d411-4d47-bcbb-1a12020906ce\") " pod="swift-kuttl-tests/swift-proxy-77c98d654c-ftt6w" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.399571 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/2d99525a-cb49-44dd-82c0-0bf1641ec2b5-cache\") pod \"swift-storage-0\" (UID: \"2d99525a-cb49-44dd-82c0-0bf1641ec2b5\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.399588 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qccj\" (UniqueName: \"kubernetes.io/projected/2d99525a-cb49-44dd-82c0-0bf1641ec2b5-kube-api-access-8qccj\") pod \"swift-storage-0\" (UID: \"2d99525a-cb49-44dd-82c0-0bf1641ec2b5\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:21:28 crc kubenswrapper[4732]: E0131 09:21:28.399936 4732 projected.go:288] Couldn't get configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 31 09:21:28 crc kubenswrapper[4732]: E0131 09:21:28.399948 4732 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-proxy-77c98d654c-ftt6w: configmap "swift-ring-files" not found Jan 31 09:21:28 crc kubenswrapper[4732]: E0131 09:21:28.399983 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4d5994e0-d411-4d47-bcbb-1a12020906ce-etc-swift podName:4d5994e0-d411-4d47-bcbb-1a12020906ce nodeName:}" failed. No retries permitted until 2026-01-31 09:21:28.899967431 +0000 UTC m=+1227.205843635 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/4d5994e0-d411-4d47-bcbb-1a12020906ce-etc-swift") pod "swift-proxy-77c98d654c-ftt6w" (UID: "4d5994e0-d411-4d47-bcbb-1a12020906ce") : configmap "swift-ring-files" not found Jan 31 09:21:28 crc kubenswrapper[4732]: E0131 09:21:28.400393 4732 projected.go:288] Couldn't get configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 31 09:21:28 crc kubenswrapper[4732]: E0131 09:21:28.400434 4732 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 31 09:21:28 crc kubenswrapper[4732]: E0131 09:21:28.400503 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2d99525a-cb49-44dd-82c0-0bf1641ec2b5-etc-swift podName:2d99525a-cb49-44dd-82c0-0bf1641ec2b5 nodeName:}" failed. No retries permitted until 2026-01-31 09:21:28.900481797 +0000 UTC m=+1227.206358071 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/2d99525a-cb49-44dd-82c0-0bf1641ec2b5-etc-swift") pod "swift-storage-0" (UID: "2d99525a-cb49-44dd-82c0-0bf1641ec2b5") : configmap "swift-ring-files" not found Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.400726 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/2d99525a-cb49-44dd-82c0-0bf1641ec2b5-cache\") pod \"swift-storage-0\" (UID: \"2d99525a-cb49-44dd-82c0-0bf1641ec2b5\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.400764 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/2d99525a-cb49-44dd-82c0-0bf1641ec2b5-lock\") pod \"swift-storage-0\" (UID: \"2d99525a-cb49-44dd-82c0-0bf1641ec2b5\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.400798 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4d5994e0-d411-4d47-bcbb-1a12020906ce-run-httpd\") pod \"swift-proxy-77c98d654c-ftt6w\" (UID: \"4d5994e0-d411-4d47-bcbb-1a12020906ce\") " pod="swift-kuttl-tests/swift-proxy-77c98d654c-ftt6w" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.400931 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4d5994e0-d411-4d47-bcbb-1a12020906ce-log-httpd\") pod \"swift-proxy-77c98d654c-ftt6w\" (UID: \"4d5994e0-d411-4d47-bcbb-1a12020906ce\") " pod="swift-kuttl-tests/swift-proxy-77c98d654c-ftt6w" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.400956 4732 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"swift-storage-0\" (UID: \"2d99525a-cb49-44dd-82c0-0bf1641ec2b5\") device mount path \"/mnt/openstack/pv06\"" pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.420446 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njt7j\" (UniqueName: \"kubernetes.io/projected/4d5994e0-d411-4d47-bcbb-1a12020906ce-kube-api-access-njt7j\") pod \"swift-proxy-77c98d654c-ftt6w\" (UID: \"4d5994e0-d411-4d47-bcbb-1a12020906ce\") " pod="swift-kuttl-tests/swift-proxy-77c98d654c-ftt6w" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.422604 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d5994e0-d411-4d47-bcbb-1a12020906ce-config-data\") pod \"swift-proxy-77c98d654c-ftt6w\" (UID: \"4d5994e0-d411-4d47-bcbb-1a12020906ce\") " pod="swift-kuttl-tests/swift-proxy-77c98d654c-ftt6w" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.422766 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"swift-storage-0\" (UID: \"2d99525a-cb49-44dd-82c0-0bf1641ec2b5\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.431800 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qccj\" (UniqueName: \"kubernetes.io/projected/2d99525a-cb49-44dd-82c0-0bf1641ec2b5-kube-api-access-8qccj\") pod \"swift-storage-0\" (UID: \"2d99525a-cb49-44dd-82c0-0bf1641ec2b5\") " 
pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.906233 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4d5994e0-d411-4d47-bcbb-1a12020906ce-etc-swift\") pod \"swift-proxy-77c98d654c-ftt6w\" (UID: \"4d5994e0-d411-4d47-bcbb-1a12020906ce\") " pod="swift-kuttl-tests/swift-proxy-77c98d654c-ftt6w" Jan 31 09:21:28 crc kubenswrapper[4732]: I0131 09:21:28.906335 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2d99525a-cb49-44dd-82c0-0bf1641ec2b5-etc-swift\") pod \"swift-storage-0\" (UID: \"2d99525a-cb49-44dd-82c0-0bf1641ec2b5\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:21:28 crc kubenswrapper[4732]: E0131 09:21:28.906433 4732 projected.go:288] Couldn't get configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 31 09:21:28 crc kubenswrapper[4732]: E0131 09:21:28.906473 4732 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-proxy-77c98d654c-ftt6w: configmap "swift-ring-files" not found Jan 31 09:21:28 crc kubenswrapper[4732]: E0131 09:21:28.906514 4732 projected.go:288] Couldn't get configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 31 09:21:28 crc kubenswrapper[4732]: E0131 09:21:28.906536 4732 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 31 09:21:28 crc kubenswrapper[4732]: E0131 09:21:28.906548 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4d5994e0-d411-4d47-bcbb-1a12020906ce-etc-swift podName:4d5994e0-d411-4d47-bcbb-1a12020906ce nodeName:}" failed. No retries permitted until 2026-01-31 09:21:29.906521788 +0000 UTC m=+1228.212398032 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/4d5994e0-d411-4d47-bcbb-1a12020906ce-etc-swift") pod "swift-proxy-77c98d654c-ftt6w" (UID: "4d5994e0-d411-4d47-bcbb-1a12020906ce") : configmap "swift-ring-files" not found Jan 31 09:21:28 crc kubenswrapper[4732]: E0131 09:21:28.906597 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2d99525a-cb49-44dd-82c0-0bf1641ec2b5-etc-swift podName:2d99525a-cb49-44dd-82c0-0bf1641ec2b5 nodeName:}" failed. No retries permitted until 2026-01-31 09:21:29.906579279 +0000 UTC m=+1228.212455493 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/2d99525a-cb49-44dd-82c0-0bf1641ec2b5-etc-swift") pod "swift-storage-0" (UID: "2d99525a-cb49-44dd-82c0-0bf1641ec2b5") : configmap "swift-ring-files" not found Jan 31 09:21:29 crc kubenswrapper[4732]: I0131 09:21:29.922023 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4d5994e0-d411-4d47-bcbb-1a12020906ce-etc-swift\") pod \"swift-proxy-77c98d654c-ftt6w\" (UID: \"4d5994e0-d411-4d47-bcbb-1a12020906ce\") " pod="swift-kuttl-tests/swift-proxy-77c98d654c-ftt6w" Jan 31 09:21:29 crc kubenswrapper[4732]: I0131 09:21:29.923029 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2d99525a-cb49-44dd-82c0-0bf1641ec2b5-etc-swift\") pod \"swift-storage-0\" (UID: \"2d99525a-cb49-44dd-82c0-0bf1641ec2b5\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:21:29 crc kubenswrapper[4732]: E0131 09:21:29.922234 4732 projected.go:288] Couldn't get configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 31 09:21:29 crc kubenswrapper[4732]: E0131 09:21:29.923292 4732 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-proxy-77c98d654c-ftt6w: configmap "swift-ring-files" not found Jan 31 09:21:29 crc kubenswrapper[4732]: E0131 09:21:29.923340 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4d5994e0-d411-4d47-bcbb-1a12020906ce-etc-swift podName:4d5994e0-d411-4d47-bcbb-1a12020906ce nodeName:}" failed. No retries permitted until 2026-01-31 09:21:31.923325575 +0000 UTC m=+1230.229201779 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/4d5994e0-d411-4d47-bcbb-1a12020906ce-etc-swift") pod "swift-proxy-77c98d654c-ftt6w" (UID: "4d5994e0-d411-4d47-bcbb-1a12020906ce") : configmap "swift-ring-files" not found Jan 31 09:21:29 crc kubenswrapper[4732]: E0131 09:21:29.923278 4732 projected.go:288] Couldn't get configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 31 09:21:29 crc kubenswrapper[4732]: E0131 09:21:29.923648 4732 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 31 09:21:29 crc kubenswrapper[4732]: E0131 09:21:29.923704 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2d99525a-cb49-44dd-82c0-0bf1641ec2b5-etc-swift podName:2d99525a-cb49-44dd-82c0-0bf1641ec2b5 nodeName:}" failed. No retries permitted until 2026-01-31 09:21:31.923693557 +0000 UTC m=+1230.229569761 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/2d99525a-cb49-44dd-82c0-0bf1641ec2b5-etc-swift") pod "swift-storage-0" (UID: "2d99525a-cb49-44dd-82c0-0bf1641ec2b5") : configmap "swift-ring-files" not found Jan 31 09:21:31 crc kubenswrapper[4732]: I0131 09:21:31.951402 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4d5994e0-d411-4d47-bcbb-1a12020906ce-etc-swift\") pod \"swift-proxy-77c98d654c-ftt6w\" (UID: \"4d5994e0-d411-4d47-bcbb-1a12020906ce\") " pod="swift-kuttl-tests/swift-proxy-77c98d654c-ftt6w" Jan 31 09:21:31 crc kubenswrapper[4732]: I0131 09:21:31.951826 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2d99525a-cb49-44dd-82c0-0bf1641ec2b5-etc-swift\") pod \"swift-storage-0\" (UID: \"2d99525a-cb49-44dd-82c0-0bf1641ec2b5\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:21:31 crc kubenswrapper[4732]: E0131 09:21:31.951604 4732 projected.go:288] Couldn't get configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 31 09:21:31 crc kubenswrapper[4732]: E0131 09:21:31.951878 4732 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-proxy-77c98d654c-ftt6w: configmap "swift-ring-files" not found Jan 31 09:21:31 crc kubenswrapper[4732]: E0131 09:21:31.951944 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4d5994e0-d411-4d47-bcbb-1a12020906ce-etc-swift podName:4d5994e0-d411-4d47-bcbb-1a12020906ce nodeName:}" failed. No retries permitted until 2026-01-31 09:21:35.951916246 +0000 UTC m=+1234.257792450 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/4d5994e0-d411-4d47-bcbb-1a12020906ce-etc-swift") pod "swift-proxy-77c98d654c-ftt6w" (UID: "4d5994e0-d411-4d47-bcbb-1a12020906ce") : configmap "swift-ring-files" not found Jan 31 09:21:31 crc kubenswrapper[4732]: E0131 09:21:31.951946 4732 projected.go:288] Couldn't get configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 31 09:21:31 crc kubenswrapper[4732]: E0131 09:21:31.951962 4732 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 31 09:21:31 crc kubenswrapper[4732]: E0131 09:21:31.951993 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2d99525a-cb49-44dd-82c0-0bf1641ec2b5-etc-swift podName:2d99525a-cb49-44dd-82c0-0bf1641ec2b5 nodeName:}" failed. No retries permitted until 2026-01-31 09:21:35.951978728 +0000 UTC m=+1234.257854932 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/2d99525a-cb49-44dd-82c0-0bf1641ec2b5-etc-swift") pod "swift-storage-0" (UID: "2d99525a-cb49-44dd-82c0-0bf1641ec2b5") : configmap "swift-ring-files" not found Jan 31 09:21:31 crc kubenswrapper[4732]: I0131 09:21:31.976793 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-vj28j"] Jan 31 09:21:31 crc kubenswrapper[4732]: I0131 09:21:31.977658 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-vj28j" Jan 31 09:21:31 crc kubenswrapper[4732]: I0131 09:21:31.981037 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Jan 31 09:21:31 crc kubenswrapper[4732]: I0131 09:21:31.981044 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Jan 31 09:21:31 crc kubenswrapper[4732]: I0131 09:21:31.986848 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-vj28j"] Jan 31 09:21:32 crc kubenswrapper[4732]: I0131 09:21:32.156021 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0c80b3a6-8701-4276-a6a2-80913e60ea9a-ring-data-devices\") pod \"swift-ring-rebalance-vj28j\" (UID: \"0c80b3a6-8701-4276-a6a2-80913e60ea9a\") " pod="swift-kuttl-tests/swift-ring-rebalance-vj28j" Jan 31 09:21:32 crc kubenswrapper[4732]: I0131 09:21:32.156095 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4957\" (UniqueName: \"kubernetes.io/projected/0c80b3a6-8701-4276-a6a2-80913e60ea9a-kube-api-access-t4957\") pod \"swift-ring-rebalance-vj28j\" (UID: \"0c80b3a6-8701-4276-a6a2-80913e60ea9a\") " pod="swift-kuttl-tests/swift-ring-rebalance-vj28j" Jan 31 09:21:32 crc kubenswrapper[4732]: I0131 09:21:32.156225 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0c80b3a6-8701-4276-a6a2-80913e60ea9a-etc-swift\") pod \"swift-ring-rebalance-vj28j\" (UID: \"0c80b3a6-8701-4276-a6a2-80913e60ea9a\") " pod="swift-kuttl-tests/swift-ring-rebalance-vj28j" Jan 31 09:21:32 crc kubenswrapper[4732]: I0131 09:21:32.156295 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0c80b3a6-8701-4276-a6a2-80913e60ea9a-scripts\") pod \"swift-ring-rebalance-vj28j\" (UID: \"0c80b3a6-8701-4276-a6a2-80913e60ea9a\") " pod="swift-kuttl-tests/swift-ring-rebalance-vj28j" Jan 31 09:21:32 crc kubenswrapper[4732]: I0131 09:21:32.156544 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0c80b3a6-8701-4276-a6a2-80913e60ea9a-dispersionconf\") pod \"swift-ring-rebalance-vj28j\" (UID: \"0c80b3a6-8701-4276-a6a2-80913e60ea9a\") " pod="swift-kuttl-tests/swift-ring-rebalance-vj28j" Jan 31 09:21:32 crc kubenswrapper[4732]: I0131 09:21:32.156623 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0c80b3a6-8701-4276-a6a2-80913e60ea9a-swiftconf\") pod \"swift-ring-rebalance-vj28j\" (UID: \"0c80b3a6-8701-4276-a6a2-80913e60ea9a\") " pod="swift-kuttl-tests/swift-ring-rebalance-vj28j" Jan 31 09:21:32 crc kubenswrapper[4732]: I0131 09:21:32.258545 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0c80b3a6-8701-4276-a6a2-80913e60ea9a-etc-swift\") pod \"swift-ring-rebalance-vj28j\" (UID: \"0c80b3a6-8701-4276-a6a2-80913e60ea9a\") " pod="swift-kuttl-tests/swift-ring-rebalance-vj28j" Jan 31 09:21:32 crc kubenswrapper[4732]: I0131 09:21:32.258764 4732 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0c80b3a6-8701-4276-a6a2-80913e60ea9a-scripts\") pod \"swift-ring-rebalance-vj28j\" (UID: \"0c80b3a6-8701-4276-a6a2-80913e60ea9a\") " pod="swift-kuttl-tests/swift-ring-rebalance-vj28j" Jan 31 09:21:32 crc kubenswrapper[4732]: I0131 09:21:32.258837 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0c80b3a6-8701-4276-a6a2-80913e60ea9a-dispersionconf\") pod \"swift-ring-rebalance-vj28j\" (UID: \"0c80b3a6-8701-4276-a6a2-80913e60ea9a\") " pod="swift-kuttl-tests/swift-ring-rebalance-vj28j" Jan 31 09:21:32 crc kubenswrapper[4732]: I0131 09:21:32.258864 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0c80b3a6-8701-4276-a6a2-80913e60ea9a-swiftconf\") pod \"swift-ring-rebalance-vj28j\" (UID: \"0c80b3a6-8701-4276-a6a2-80913e60ea9a\") " pod="swift-kuttl-tests/swift-ring-rebalance-vj28j" Jan 31 09:21:32 crc kubenswrapper[4732]: I0131 09:21:32.258940 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0c80b3a6-8701-4276-a6a2-80913e60ea9a-ring-data-devices\") pod \"swift-ring-rebalance-vj28j\" (UID: \"0c80b3a6-8701-4276-a6a2-80913e60ea9a\") " pod="swift-kuttl-tests/swift-ring-rebalance-vj28j" Jan 31 09:21:32 crc kubenswrapper[4732]: I0131 09:21:32.258976 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4957\" (UniqueName: \"kubernetes.io/projected/0c80b3a6-8701-4276-a6a2-80913e60ea9a-kube-api-access-t4957\") pod \"swift-ring-rebalance-vj28j\" (UID: \"0c80b3a6-8701-4276-a6a2-80913e60ea9a\") " pod="swift-kuttl-tests/swift-ring-rebalance-vj28j" Jan 31 09:21:32 crc kubenswrapper[4732]: I0131 09:21:32.259164 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0c80b3a6-8701-4276-a6a2-80913e60ea9a-etc-swift\") pod \"swift-ring-rebalance-vj28j\" (UID: \"0c80b3a6-8701-4276-a6a2-80913e60ea9a\") " pod="swift-kuttl-tests/swift-ring-rebalance-vj28j" Jan 31 09:21:32 crc kubenswrapper[4732]: I0131 09:21:32.260728 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0c80b3a6-8701-4276-a6a2-80913e60ea9a-scripts\") pod \"swift-ring-rebalance-vj28j\" (UID: \"0c80b3a6-8701-4276-a6a2-80913e60ea9a\") " pod="swift-kuttl-tests/swift-ring-rebalance-vj28j" Jan 31 09:21:32 crc kubenswrapper[4732]: I0131 09:21:32.260983 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0c80b3a6-8701-4276-a6a2-80913e60ea9a-ring-data-devices\") pod \"swift-ring-rebalance-vj28j\" (UID: \"0c80b3a6-8701-4276-a6a2-80913e60ea9a\") " pod="swift-kuttl-tests/swift-ring-rebalance-vj28j" Jan 31 09:21:32 crc kubenswrapper[4732]: I0131 09:21:32.265867 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0c80b3a6-8701-4276-a6a2-80913e60ea9a-dispersionconf\") pod \"swift-ring-rebalance-vj28j\" (UID: \"0c80b3a6-8701-4276-a6a2-80913e60ea9a\") " pod="swift-kuttl-tests/swift-ring-rebalance-vj28j" Jan 31 09:21:32 crc kubenswrapper[4732]: I0131 09:21:32.267727 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/0c80b3a6-8701-4276-a6a2-80913e60ea9a-swiftconf\") pod \"swift-ring-rebalance-vj28j\" (UID: \"0c80b3a6-8701-4276-a6a2-80913e60ea9a\") " pod="swift-kuttl-tests/swift-ring-rebalance-vj28j" Jan 31 09:21:32 crc kubenswrapper[4732]: I0131 09:21:32.278323 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4957\" (UniqueName: \"kubernetes.io/projected/0c80b3a6-8701-4276-a6a2-80913e60ea9a-kube-api-access-t4957\") pod \"swift-ring-rebalance-vj28j\" (UID: \"0c80b3a6-8701-4276-a6a2-80913e60ea9a\") " pod="swift-kuttl-tests/swift-ring-rebalance-vj28j" Jan 31 09:21:32 crc kubenswrapper[4732]: I0131 09:21:32.295645 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-vj28j" Jan 31 09:21:32 crc kubenswrapper[4732]: I0131 09:21:32.705150 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-vj28j"] Jan 31 09:21:32 crc kubenswrapper[4732]: W0131 09:21:32.723051 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0c80b3a6_8701_4276_a6a2_80913e60ea9a.slice/crio-b5257e67474cd986634f02ff3791244e4413c3901b1c0a424a805eb7a793b505 WatchSource:0}: Error finding container b5257e67474cd986634f02ff3791244e4413c3901b1c0a424a805eb7a793b505: Status 404 returned error can't find the container with id b5257e67474cd986634f02ff3791244e4413c3901b1c0a424a805eb7a793b505 Jan 31 09:21:33 crc kubenswrapper[4732]: I0131 09:21:33.509972 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-vj28j" event={"ID":"0c80b3a6-8701-4276-a6a2-80913e60ea9a","Type":"ContainerStarted","Data":"85b861719f4e2c096ba302360733974fc49e5be1c8a6dc54dda1f149625db608"} Jan 31 09:21:33 crc kubenswrapper[4732]: I0131 09:21:33.510487 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-vj28j" event={"ID":"0c80b3a6-8701-4276-a6a2-80913e60ea9a","Type":"ContainerStarted","Data":"b5257e67474cd986634f02ff3791244e4413c3901b1c0a424a805eb7a793b505"} Jan 31 09:21:33 crc kubenswrapper[4732]: I0131 09:21:33.531875 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-vj28j" podStartSLOduration=2.5318496809999997 podStartE2EDuration="2.531849681s" podCreationTimestamp="2026-01-31 09:21:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:21:33.528604921 +0000 UTC m=+1231.834481135" watchObservedRunningTime="2026-01-31 09:21:33.531849681 +0000 UTC m=+1231.837725885" Jan 31 09:21:36 crc kubenswrapper[4732]: I0131 09:21:36.020981 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4d5994e0-d411-4d47-bcbb-1a12020906ce-etc-swift\") pod \"swift-proxy-77c98d654c-ftt6w\" (UID: \"4d5994e0-d411-4d47-bcbb-1a12020906ce\") " pod="swift-kuttl-tests/swift-proxy-77c98d654c-ftt6w" Jan 31 09:21:36 crc kubenswrapper[4732]: I0131 09:21:36.021294 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2d99525a-cb49-44dd-82c0-0bf1641ec2b5-etc-swift\") pod \"swift-storage-0\" (UID: \"2d99525a-cb49-44dd-82c0-0bf1641ec2b5\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:21:36 crc kubenswrapper[4732]: E0131 09:21:36.021175 4732 
projected.go:288] Couldn't get configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 31 09:21:36 crc kubenswrapper[4732]: E0131 09:21:36.021477 4732 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-proxy-77c98d654c-ftt6w: configmap "swift-ring-files" not found Jan 31 09:21:36 crc kubenswrapper[4732]: E0131 09:21:36.021519 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4d5994e0-d411-4d47-bcbb-1a12020906ce-etc-swift podName:4d5994e0-d411-4d47-bcbb-1a12020906ce nodeName:}" failed. No retries permitted until 2026-01-31 09:21:44.021506706 +0000 UTC m=+1242.327382900 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/4d5994e0-d411-4d47-bcbb-1a12020906ce-etc-swift") pod "swift-proxy-77c98d654c-ftt6w" (UID: "4d5994e0-d411-4d47-bcbb-1a12020906ce") : configmap "swift-ring-files" not found Jan 31 09:21:36 crc kubenswrapper[4732]: E0131 09:21:36.021461 4732 projected.go:288] Couldn't get configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 31 09:21:36 crc kubenswrapper[4732]: E0131 09:21:36.021811 4732 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 31 09:21:36 crc kubenswrapper[4732]: E0131 09:21:36.021834 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2d99525a-cb49-44dd-82c0-0bf1641ec2b5-etc-swift podName:2d99525a-cb49-44dd-82c0-0bf1641ec2b5 nodeName:}" failed. No retries permitted until 2026-01-31 09:21:44.021826616 +0000 UTC m=+1242.327702820 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/2d99525a-cb49-44dd-82c0-0bf1641ec2b5-etc-swift") pod "swift-storage-0" (UID: "2d99525a-cb49-44dd-82c0-0bf1641ec2b5") : configmap "swift-ring-files" not found Jan 31 09:21:39 crc kubenswrapper[4732]: I0131 09:21:39.560365 4732 generic.go:334] "Generic (PLEG): container finished" podID="0c80b3a6-8701-4276-a6a2-80913e60ea9a" containerID="85b861719f4e2c096ba302360733974fc49e5be1c8a6dc54dda1f149625db608" exitCode=0 Jan 31 09:21:39 crc kubenswrapper[4732]: I0131 09:21:39.560461 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-vj28j" event={"ID":"0c80b3a6-8701-4276-a6a2-80913e60ea9a","Type":"ContainerDied","Data":"85b861719f4e2c096ba302360733974fc49e5be1c8a6dc54dda1f149625db608"} Jan 31 09:21:40 crc kubenswrapper[4732]: I0131 09:21:40.830952 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-vj28j" Jan 31 09:21:40 crc kubenswrapper[4732]: I0131 09:21:40.989738 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0c80b3a6-8701-4276-a6a2-80913e60ea9a-etc-swift\") pod \"0c80b3a6-8701-4276-a6a2-80913e60ea9a\" (UID: \"0c80b3a6-8701-4276-a6a2-80913e60ea9a\") " Jan 31 09:21:40 crc kubenswrapper[4732]: I0131 09:21:40.989804 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t4957\" (UniqueName: \"kubernetes.io/projected/0c80b3a6-8701-4276-a6a2-80913e60ea9a-kube-api-access-t4957\") pod \"0c80b3a6-8701-4276-a6a2-80913e60ea9a\" (UID: \"0c80b3a6-8701-4276-a6a2-80913e60ea9a\") " Jan 31 09:21:40 crc kubenswrapper[4732]: I0131 09:21:40.989839 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0c80b3a6-8701-4276-a6a2-80913e60ea9a-scripts\") pod \"0c80b3a6-8701-4276-a6a2-80913e60ea9a\" (UID: \"0c80b3a6-8701-4276-a6a2-80913e60ea9a\") " Jan 31 09:21:40 crc kubenswrapper[4732]: I0131 09:21:40.990601 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c80b3a6-8701-4276-a6a2-80913e60ea9a-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "0c80b3a6-8701-4276-a6a2-80913e60ea9a" (UID: "0c80b3a6-8701-4276-a6a2-80913e60ea9a"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:21:40 crc kubenswrapper[4732]: I0131 09:21:40.990783 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0c80b3a6-8701-4276-a6a2-80913e60ea9a-dispersionconf\") pod \"0c80b3a6-8701-4276-a6a2-80913e60ea9a\" (UID: \"0c80b3a6-8701-4276-a6a2-80913e60ea9a\") " Jan 31 09:21:40 crc kubenswrapper[4732]: I0131 09:21:40.990851 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0c80b3a6-8701-4276-a6a2-80913e60ea9a-ring-data-devices\") pod \"0c80b3a6-8701-4276-a6a2-80913e60ea9a\" (UID: \"0c80b3a6-8701-4276-a6a2-80913e60ea9a\") " Jan 31 09:21:40 crc kubenswrapper[4732]: I0131 09:21:40.990920 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0c80b3a6-8701-4276-a6a2-80913e60ea9a-swiftconf\") pod \"0c80b3a6-8701-4276-a6a2-80913e60ea9a\" (UID: \"0c80b3a6-8701-4276-a6a2-80913e60ea9a\") " Jan 31 09:21:40 crc kubenswrapper[4732]: I0131 09:21:40.991277 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c80b3a6-8701-4276-a6a2-80913e60ea9a-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "0c80b3a6-8701-4276-a6a2-80913e60ea9a" (UID: "0c80b3a6-8701-4276-a6a2-80913e60ea9a"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:21:40 crc kubenswrapper[4732]: I0131 09:21:40.991550 4732 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0c80b3a6-8701-4276-a6a2-80913e60ea9a-ring-data-devices\") on node \"crc\" DevicePath \"\"" Jan 31 09:21:40 crc kubenswrapper[4732]: I0131 09:21:40.991573 4732 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0c80b3a6-8701-4276-a6a2-80913e60ea9a-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 31 09:21:40 crc kubenswrapper[4732]: I0131 09:21:40.995057 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c80b3a6-8701-4276-a6a2-80913e60ea9a-kube-api-access-t4957" (OuterVolumeSpecName: "kube-api-access-t4957") pod "0c80b3a6-8701-4276-a6a2-80913e60ea9a" (UID: "0c80b3a6-8701-4276-a6a2-80913e60ea9a"). InnerVolumeSpecName "kube-api-access-t4957". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:21:41 crc kubenswrapper[4732]: I0131 09:21:41.007199 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c80b3a6-8701-4276-a6a2-80913e60ea9a-scripts" (OuterVolumeSpecName: "scripts") pod "0c80b3a6-8701-4276-a6a2-80913e60ea9a" (UID: "0c80b3a6-8701-4276-a6a2-80913e60ea9a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:21:41 crc kubenswrapper[4732]: I0131 09:21:41.008609 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c80b3a6-8701-4276-a6a2-80913e60ea9a-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "0c80b3a6-8701-4276-a6a2-80913e60ea9a" (UID: "0c80b3a6-8701-4276-a6a2-80913e60ea9a"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:21:41 crc kubenswrapper[4732]: I0131 09:21:41.010431 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c80b3a6-8701-4276-a6a2-80913e60ea9a-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "0c80b3a6-8701-4276-a6a2-80913e60ea9a" (UID: "0c80b3a6-8701-4276-a6a2-80913e60ea9a"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:21:41 crc kubenswrapper[4732]: I0131 09:21:41.092391 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t4957\" (UniqueName: \"kubernetes.io/projected/0c80b3a6-8701-4276-a6a2-80913e60ea9a-kube-api-access-t4957\") on node \"crc\" DevicePath \"\"" Jan 31 09:21:41 crc kubenswrapper[4732]: I0131 09:21:41.092443 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0c80b3a6-8701-4276-a6a2-80913e60ea9a-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:21:41 crc kubenswrapper[4732]: I0131 09:21:41.092458 4732 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0c80b3a6-8701-4276-a6a2-80913e60ea9a-dispersionconf\") on node \"crc\" DevicePath \"\"" Jan 31 09:21:41 crc kubenswrapper[4732]: I0131 09:21:41.092470 4732 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0c80b3a6-8701-4276-a6a2-80913e60ea9a-swiftconf\") on node \"crc\" DevicePath \"\"" Jan 31 09:21:41 crc kubenswrapper[4732]: I0131 09:21:41.575359 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-vj28j" event={"ID":"0c80b3a6-8701-4276-a6a2-80913e60ea9a","Type":"ContainerDied","Data":"b5257e67474cd986634f02ff3791244e4413c3901b1c0a424a805eb7a793b505"} Jan 31 09:21:41 crc kubenswrapper[4732]: I0131 09:21:41.575398 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b5257e67474cd986634f02ff3791244e4413c3901b1c0a424a805eb7a793b505" Jan 31 09:21:41 crc kubenswrapper[4732]: I0131 09:21:41.575471 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-vj28j" Jan 31 09:21:44 crc kubenswrapper[4732]: I0131 09:21:44.036542 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4d5994e0-d411-4d47-bcbb-1a12020906ce-etc-swift\") pod \"swift-proxy-77c98d654c-ftt6w\" (UID: \"4d5994e0-d411-4d47-bcbb-1a12020906ce\") " pod="swift-kuttl-tests/swift-proxy-77c98d654c-ftt6w" Jan 31 09:21:44 crc kubenswrapper[4732]: I0131 09:21:44.036868 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2d99525a-cb49-44dd-82c0-0bf1641ec2b5-etc-swift\") pod \"swift-storage-0\" (UID: \"2d99525a-cb49-44dd-82c0-0bf1641ec2b5\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:21:44 crc kubenswrapper[4732]: I0131 09:21:44.043811 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4d5994e0-d411-4d47-bcbb-1a12020906ce-etc-swift\") pod \"swift-proxy-77c98d654c-ftt6w\" (UID: \"4d5994e0-d411-4d47-bcbb-1a12020906ce\") " pod="swift-kuttl-tests/swift-proxy-77c98d654c-ftt6w" Jan 31 09:21:44 crc kubenswrapper[4732]: I0131 09:21:44.048494 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2d99525a-cb49-44dd-82c0-0bf1641ec2b5-etc-swift\") pod \"swift-storage-0\" (UID: \"2d99525a-cb49-44dd-82c0-0bf1641ec2b5\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:21:44 crc kubenswrapper[4732]: I0131 09:21:44.082207 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:21:44 crc kubenswrapper[4732]: I0131 09:21:44.099463 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-proxy-77c98d654c-ftt6w" Jan 31 09:21:44 crc kubenswrapper[4732]: I0131 09:21:44.554874 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-proxy-77c98d654c-ftt6w"] Jan 31 09:21:44 crc kubenswrapper[4732]: I0131 09:21:44.602285 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-proxy-77c98d654c-ftt6w" event={"ID":"4d5994e0-d411-4d47-bcbb-1a12020906ce","Type":"ContainerStarted","Data":"7aec8f717f6235c07faf20c4f2b84f8af9c000ef4913a0dcde781bd2a8cc7aa5"} Jan 31 09:21:44 crc kubenswrapper[4732]: W0131 09:21:44.634458 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2d99525a_cb49_44dd_82c0_0bf1641ec2b5.slice/crio-bb37e542519324244c6258336774da979636b6257f7ee208e0822166576c6a8d WatchSource:0}: Error finding container bb37e542519324244c6258336774da979636b6257f7ee208e0822166576c6a8d: Status 404 returned error can't find the container with id bb37e542519324244c6258336774da979636b6257f7ee208e0822166576c6a8d Jan 31 09:21:44 crc kubenswrapper[4732]: I0131 09:21:44.637192 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-storage-0"] Jan 31 09:21:45 crc kubenswrapper[4732]: I0131 09:21:45.615180 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-proxy-77c98d654c-ftt6w" event={"ID":"4d5994e0-d411-4d47-bcbb-1a12020906ce","Type":"ContainerStarted","Data":"9b1ff239a7bac22b3621f4ddf89b0954862df7c17ff58d0be28c2216ee9690d5"} Jan 31 09:21:45 crc kubenswrapper[4732]: I0131 09:21:45.615604 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="swift-kuttl-tests/swift-proxy-77c98d654c-ftt6w" Jan 31 09:21:45 crc kubenswrapper[4732]: I0131 09:21:45.615621 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-proxy-77c98d654c-ftt6w" event={"ID":"4d5994e0-d411-4d47-bcbb-1a12020906ce","Type":"ContainerStarted","Data":"db0d291a904e63a804b3e8e9f1722ea742a8b5887a09628e8fd5a89d8eceb3a1"} Jan 31 09:21:45 crc kubenswrapper[4732]: I0131 09:21:45.615636 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="swift-kuttl-tests/swift-proxy-77c98d654c-ftt6w" Jan 31 09:21:45 crc kubenswrapper[4732]: I0131 09:21:45.619877 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"2d99525a-cb49-44dd-82c0-0bf1641ec2b5","Type":"ContainerStarted","Data":"86becf7472fc9c7648db29999d468c19bd3b7575b10b92c4b9928d2888a627aa"} Jan 31 09:21:45 crc kubenswrapper[4732]: I0131 09:21:45.619919 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"2d99525a-cb49-44dd-82c0-0bf1641ec2b5","Type":"ContainerStarted","Data":"4eb0e325fda950d987c29c36e29c511060e69238d602a70ab284eaba918266e8"} Jan 31 09:21:45 crc kubenswrapper[4732]: I0131 09:21:45.619930 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"2d99525a-cb49-44dd-82c0-0bf1641ec2b5","Type":"ContainerStarted","Data":"3ca70055c8d797fcc887af9ddd4c9b0ea366e17888bc18156cd3c6a4de4dc32b"} Jan 31 09:21:45 crc kubenswrapper[4732]: I0131 09:21:45.619939 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="swift-kuttl-tests/swift-storage-0" event={"ID":"2d99525a-cb49-44dd-82c0-0bf1641ec2b5","Type":"ContainerStarted","Data":"a064100c3bd45fa424991f4128e58ffe5bcccf2ec5e5089c19cf6692c22ab9c5"} Jan 31 09:21:45 crc kubenswrapper[4732]: I0131 09:21:45.619947 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"2d99525a-cb49-44dd-82c0-0bf1641ec2b5","Type":"ContainerStarted","Data":"47aaccf941b2ffbe89c21b9e0b3a892b4f21c0ac0a9cf298c23bf8752bd80a1a"} Jan 31 09:21:45 crc kubenswrapper[4732]: I0131 09:21:45.619955 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"2d99525a-cb49-44dd-82c0-0bf1641ec2b5","Type":"ContainerStarted","Data":"bb37e542519324244c6258336774da979636b6257f7ee208e0822166576c6a8d"} Jan 31 09:21:45 crc kubenswrapper[4732]: I0131 09:21:45.647545 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-proxy-77c98d654c-ftt6w" podStartSLOduration=17.647512120000002 podStartE2EDuration="17.64751212s" podCreationTimestamp="2026-01-31 09:21:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:21:45.634920879 +0000 UTC m=+1243.940797093" watchObservedRunningTime="2026-01-31 09:21:45.64751212 +0000 UTC m=+1243.953388324" Jan 31 09:21:46 crc kubenswrapper[4732]: I0131 09:21:46.633126 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"2d99525a-cb49-44dd-82c0-0bf1641ec2b5","Type":"ContainerStarted","Data":"a018ce2a323e1b0e2aa072d2f93334f3b4e39c300d0ff2335987aa34dde530ce"} Jan 31 09:21:46 crc kubenswrapper[4732]: I0131 09:21:46.633220 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"2d99525a-cb49-44dd-82c0-0bf1641ec2b5","Type":"ContainerStarted","Data":"45ccea61db4e777ffe785a3bd9a6ec4740eee0e550c2a7527a6b1997cee06c20"} Jan 31 09:21:46 crc kubenswrapper[4732]: I0131 09:21:46.633241 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"2d99525a-cb49-44dd-82c0-0bf1641ec2b5","Type":"ContainerStarted","Data":"7d05377fbb40377380b56c2e05a3671b30e05ae9c65dc9991a56bb3e0d776300"} Jan 31 09:21:46 crc kubenswrapper[4732]: I0131 09:21:46.633256 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"2d99525a-cb49-44dd-82c0-0bf1641ec2b5","Type":"ContainerStarted","Data":"55b7d2c297a2c51bfe7bb69e4e050254e0b55aad5ce8bd28be166e236c03b5a4"} Jan 31 09:21:46 crc kubenswrapper[4732]: I0131 09:21:46.633270 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"2d99525a-cb49-44dd-82c0-0bf1641ec2b5","Type":"ContainerStarted","Data":"e3b3c2c088602818d41287973de5c283cc9912805d7ff52354faa3f8b3e0f918"} Jan 31 09:21:47 crc kubenswrapper[4732]: I0131 09:21:47.648765 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"2d99525a-cb49-44dd-82c0-0bf1641ec2b5","Type":"ContainerStarted","Data":"39108093c7167b1324969b66f03261329d95d03eef264a245346322307d2c2ca"} Jan 31 09:21:47 crc kubenswrapper[4732]: I0131 09:21:47.649334 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" 
event={"ID":"2d99525a-cb49-44dd-82c0-0bf1641ec2b5","Type":"ContainerStarted","Data":"3e318603930992b9a03d77a16fdd999a5fbbbc52598af387e3bde175cfb85fae"} Jan 31 09:21:47 crc kubenswrapper[4732]: I0131 09:21:47.649348 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"2d99525a-cb49-44dd-82c0-0bf1641ec2b5","Type":"ContainerStarted","Data":"aafbf67742419bba0b805990b5fa8c2858a89d917bc8094ccad71c66aad20c3c"} Jan 31 09:21:47 crc kubenswrapper[4732]: I0131 09:21:47.649359 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"2d99525a-cb49-44dd-82c0-0bf1641ec2b5","Type":"ContainerStarted","Data":"25397f43691b59da5a03ae8b3f0545fb630ecb37363d9a82b75ece89ba8f270b"} Jan 31 09:21:48 crc kubenswrapper[4732]: I0131 09:21:48.672604 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"2d99525a-cb49-44dd-82c0-0bf1641ec2b5","Type":"ContainerStarted","Data":"545bc0375b464b90da540c011cf7e399ffc165b32303354c396c826e74d84e25"} Jan 31 09:21:48 crc kubenswrapper[4732]: I0131 09:21:48.714882 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-storage-0" podStartSLOduration=21.714860772 podStartE2EDuration="21.714860772s" podCreationTimestamp="2026-01-31 09:21:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:21:48.705222383 +0000 UTC m=+1247.011098597" watchObservedRunningTime="2026-01-31 09:21:48.714860772 +0000 UTC m=+1247.020736996" Jan 31 09:21:49 crc kubenswrapper[4732]: I0131 09:21:49.109799 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="swift-kuttl-tests/swift-proxy-77c98d654c-ftt6w" Jan 31 09:21:49 crc kubenswrapper[4732]: I0131 09:21:49.112448 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="swift-kuttl-tests/swift-proxy-77c98d654c-ftt6w" Jan 31 09:21:50 crc kubenswrapper[4732]: I0131 09:21:50.374834 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-vj28j"] Jan 31 09:21:50 crc kubenswrapper[4732]: I0131 09:21:50.381348 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-vj28j"] Jan 31 09:21:50 crc kubenswrapper[4732]: I0131 09:21:50.397484 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-storage-0"] Jan 31 09:21:50 crc kubenswrapper[4732]: I0131 09:21:50.553349 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c80b3a6-8701-4276-a6a2-80913e60ea9a" path="/var/lib/kubelet/pods/0c80b3a6-8701-4276-a6a2-80913e60ea9a/volumes" Jan 31 09:21:50 crc kubenswrapper[4732]: I0131 09:21:50.569065 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-proxy-77c98d654c-ftt6w"] Jan 31 09:21:50 crc kubenswrapper[4732]: I0131 09:21:50.702892 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="2d99525a-cb49-44dd-82c0-0bf1641ec2b5" containerName="account-server" containerID="cri-o://47aaccf941b2ffbe89c21b9e0b3a892b4f21c0ac0a9cf298c23bf8752bd80a1a" gracePeriod=30 Jan 31 09:21:50 crc kubenswrapper[4732]: I0131 09:21:50.703060 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="2d99525a-cb49-44dd-82c0-0bf1641ec2b5" 
containerName="container-replicator" containerID="cri-o://e3b3c2c088602818d41287973de5c283cc9912805d7ff52354faa3f8b3e0f918" gracePeriod=30 Jan 31 09:21:50 crc kubenswrapper[4732]: I0131 09:21:50.703097 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="2d99525a-cb49-44dd-82c0-0bf1641ec2b5" containerName="object-replicator" containerID="cri-o://a018ce2a323e1b0e2aa072d2f93334f3b4e39c300d0ff2335987aa34dde530ce" gracePeriod=30 Jan 31 09:21:50 crc kubenswrapper[4732]: I0131 09:21:50.703105 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="2d99525a-cb49-44dd-82c0-0bf1641ec2b5" containerName="container-updater" containerID="cri-o://7d05377fbb40377380b56c2e05a3671b30e05ae9c65dc9991a56bb3e0d776300" gracePeriod=30 Jan 31 09:21:50 crc kubenswrapper[4732]: I0131 09:21:50.703161 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="2d99525a-cb49-44dd-82c0-0bf1641ec2b5" containerName="container-auditor" containerID="cri-o://55b7d2c297a2c51bfe7bb69e4e050254e0b55aad5ce8bd28be166e236c03b5a4" gracePeriod=30 Jan 31 09:21:50 crc kubenswrapper[4732]: I0131 09:21:50.703181 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="2d99525a-cb49-44dd-82c0-0bf1641ec2b5" containerName="account-reaper" containerID="cri-o://4eb0e325fda950d987c29c36e29c511060e69238d602a70ab284eaba918266e8" gracePeriod=30 Jan 31 09:21:50 crc kubenswrapper[4732]: I0131 09:21:50.703109 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="2d99525a-cb49-44dd-82c0-0bf1641ec2b5" containerName="account-auditor" containerID="cri-o://3ca70055c8d797fcc887af9ddd4c9b0ea366e17888bc18156cd3c6a4de4dc32b" gracePeriod=30 Jan 31 09:21:50 crc kubenswrapper[4732]: I0131 09:21:50.703166 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="2d99525a-cb49-44dd-82c0-0bf1641ec2b5" containerName="container-server" containerID="cri-o://86becf7472fc9c7648db29999d468c19bd3b7575b10b92c4b9928d2888a627aa" gracePeriod=30 Jan 31 09:21:50 crc kubenswrapper[4732]: I0131 09:21:50.703149 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="2d99525a-cb49-44dd-82c0-0bf1641ec2b5" containerName="swift-recon-cron" containerID="cri-o://545bc0375b464b90da540c011cf7e399ffc165b32303354c396c826e74d84e25" gracePeriod=30 Jan 31 09:21:50 crc kubenswrapper[4732]: I0131 09:21:50.703105 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="2d99525a-cb49-44dd-82c0-0bf1641ec2b5" containerName="object-server" containerID="cri-o://45ccea61db4e777ffe785a3bd9a6ec4740eee0e550c2a7527a6b1997cee06c20" gracePeriod=30 Jan 31 09:21:50 crc kubenswrapper[4732]: I0131 09:21:50.703295 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="2d99525a-cb49-44dd-82c0-0bf1641ec2b5" containerName="object-expirer" containerID="cri-o://3e318603930992b9a03d77a16fdd999a5fbbbc52598af387e3bde175cfb85fae" gracePeriod=30 Jan 31 09:21:50 crc kubenswrapper[4732]: I0131 09:21:50.703325 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" 
podUID="2d99525a-cb49-44dd-82c0-0bf1641ec2b5" containerName="object-auditor" containerID="cri-o://25397f43691b59da5a03ae8b3f0545fb630ecb37363d9a82b75ece89ba8f270b" gracePeriod=30 Jan 31 09:21:50 crc kubenswrapper[4732]: I0131 09:21:50.703332 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="2d99525a-cb49-44dd-82c0-0bf1641ec2b5" containerName="object-updater" containerID="cri-o://aafbf67742419bba0b805990b5fa8c2858a89d917bc8094ccad71c66aad20c3c" gracePeriod=30 Jan 31 09:21:50 crc kubenswrapper[4732]: I0131 09:21:50.703301 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="2d99525a-cb49-44dd-82c0-0bf1641ec2b5" containerName="rsync" containerID="cri-o://39108093c7167b1324969b66f03261329d95d03eef264a245346322307d2c2ca" gracePeriod=30 Jan 31 09:21:50 crc kubenswrapper[4732]: I0131 09:21:50.703132 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="2d99525a-cb49-44dd-82c0-0bf1641ec2b5" containerName="account-replicator" containerID="cri-o://a064100c3bd45fa424991f4128e58ffe5bcccf2ec5e5089c19cf6692c22ab9c5" gracePeriod=30 Jan 31 09:21:50 crc kubenswrapper[4732]: I0131 09:21:50.703382 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-proxy-77c98d654c-ftt6w" podUID="4d5994e0-d411-4d47-bcbb-1a12020906ce" containerName="proxy-httpd" containerID="cri-o://db0d291a904e63a804b3e8e9f1722ea742a8b5887a09628e8fd5a89d8eceb3a1" gracePeriod=30 Jan 31 09:21:50 crc kubenswrapper[4732]: I0131 09:21:50.703422 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-proxy-77c98d654c-ftt6w" podUID="4d5994e0-d411-4d47-bcbb-1a12020906ce" containerName="proxy-server" containerID="cri-o://9b1ff239a7bac22b3621f4ddf89b0954862df7c17ff58d0be28c2216ee9690d5" gracePeriod=30 Jan 31 09:21:51 crc kubenswrapper[4732]: I0131 09:21:51.297424 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-proxy-77c98d654c-ftt6w" Jan 31 09:21:51 crc kubenswrapper[4732]: I0131 09:21:51.451540 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d5994e0-d411-4d47-bcbb-1a12020906ce-config-data\") pod \"4d5994e0-d411-4d47-bcbb-1a12020906ce\" (UID: \"4d5994e0-d411-4d47-bcbb-1a12020906ce\") " Jan 31 09:21:51 crc kubenswrapper[4732]: I0131 09:21:51.451629 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4d5994e0-d411-4d47-bcbb-1a12020906ce-etc-swift\") pod \"4d5994e0-d411-4d47-bcbb-1a12020906ce\" (UID: \"4d5994e0-d411-4d47-bcbb-1a12020906ce\") " Jan 31 09:21:51 crc kubenswrapper[4732]: I0131 09:21:51.451688 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4d5994e0-d411-4d47-bcbb-1a12020906ce-run-httpd\") pod \"4d5994e0-d411-4d47-bcbb-1a12020906ce\" (UID: \"4d5994e0-d411-4d47-bcbb-1a12020906ce\") " Jan 31 09:21:51 crc kubenswrapper[4732]: I0131 09:21:51.451731 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4d5994e0-d411-4d47-bcbb-1a12020906ce-log-httpd\") pod \"4d5994e0-d411-4d47-bcbb-1a12020906ce\" (UID: \"4d5994e0-d411-4d47-bcbb-1a12020906ce\") " Jan 31 09:21:51 crc kubenswrapper[4732]: I0131 09:21:51.451856 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-njt7j\" (UniqueName: \"kubernetes.io/projected/4d5994e0-d411-4d47-bcbb-1a12020906ce-kube-api-access-njt7j\") pod \"4d5994e0-d411-4d47-bcbb-1a12020906ce\" (UID: \"4d5994e0-d411-4d47-bcbb-1a12020906ce\") " Jan 31 09:21:51 crc kubenswrapper[4732]: I0131 09:21:51.452091 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d5994e0-d411-4d47-bcbb-1a12020906ce-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4d5994e0-d411-4d47-bcbb-1a12020906ce" (UID: "4d5994e0-d411-4d47-bcbb-1a12020906ce"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:21:51 crc kubenswrapper[4732]: I0131 09:21:51.452425 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d5994e0-d411-4d47-bcbb-1a12020906ce-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "4d5994e0-d411-4d47-bcbb-1a12020906ce" (UID: "4d5994e0-d411-4d47-bcbb-1a12020906ce"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:21:51 crc kubenswrapper[4732]: I0131 09:21:51.457860 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d5994e0-d411-4d47-bcbb-1a12020906ce-kube-api-access-njt7j" (OuterVolumeSpecName: "kube-api-access-njt7j") pod "4d5994e0-d411-4d47-bcbb-1a12020906ce" (UID: "4d5994e0-d411-4d47-bcbb-1a12020906ce"). InnerVolumeSpecName "kube-api-access-njt7j". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:21:51 crc kubenswrapper[4732]: I0131 09:21:51.458913 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d5994e0-d411-4d47-bcbb-1a12020906ce-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "4d5994e0-d411-4d47-bcbb-1a12020906ce" (UID: "4d5994e0-d411-4d47-bcbb-1a12020906ce"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:21:51 crc kubenswrapper[4732]: I0131 09:21:51.504896 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d5994e0-d411-4d47-bcbb-1a12020906ce-config-data" (OuterVolumeSpecName: "config-data") pod "4d5994e0-d411-4d47-bcbb-1a12020906ce" (UID: "4d5994e0-d411-4d47-bcbb-1a12020906ce"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:21:51 crc kubenswrapper[4732]: I0131 09:21:51.553790 4732 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4d5994e0-d411-4d47-bcbb-1a12020906ce-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 31 09:21:51 crc kubenswrapper[4732]: I0131 09:21:51.553824 4732 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4d5994e0-d411-4d47-bcbb-1a12020906ce-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 31 09:21:51 crc kubenswrapper[4732]: I0131 09:21:51.553834 4732 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4d5994e0-d411-4d47-bcbb-1a12020906ce-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 31 09:21:51 crc kubenswrapper[4732]: I0131 09:21:51.553845 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-njt7j\" (UniqueName: \"kubernetes.io/projected/4d5994e0-d411-4d47-bcbb-1a12020906ce-kube-api-access-njt7j\") on node \"crc\" DevicePath \"\"" Jan 31 09:21:51 crc kubenswrapper[4732]: I0131 09:21:51.553855 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d5994e0-d411-4d47-bcbb-1a12020906ce-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 09:21:51 crc kubenswrapper[4732]: I0131 09:21:51.714475 4732 generic.go:334] "Generic (PLEG): container finished" podID="4d5994e0-d411-4d47-bcbb-1a12020906ce" containerID="9b1ff239a7bac22b3621f4ddf89b0954862df7c17ff58d0be28c2216ee9690d5" exitCode=0 Jan 31 09:21:51 crc kubenswrapper[4732]: I0131 09:21:51.714523 4732 generic.go:334] "Generic (PLEG): container finished" podID="4d5994e0-d411-4d47-bcbb-1a12020906ce" containerID="db0d291a904e63a804b3e8e9f1722ea742a8b5887a09628e8fd5a89d8eceb3a1" exitCode=0 Jan 31 09:21:51 crc kubenswrapper[4732]: I0131 09:21:51.714794 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-proxy-77c98d654c-ftt6w" Jan 31 09:21:51 crc kubenswrapper[4732]: I0131 09:21:51.714804 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-proxy-77c98d654c-ftt6w" event={"ID":"4d5994e0-d411-4d47-bcbb-1a12020906ce","Type":"ContainerDied","Data":"9b1ff239a7bac22b3621f4ddf89b0954862df7c17ff58d0be28c2216ee9690d5"} Jan 31 09:21:51 crc kubenswrapper[4732]: I0131 09:21:51.714866 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-proxy-77c98d654c-ftt6w" event={"ID":"4d5994e0-d411-4d47-bcbb-1a12020906ce","Type":"ContainerDied","Data":"db0d291a904e63a804b3e8e9f1722ea742a8b5887a09628e8fd5a89d8eceb3a1"} Jan 31 09:21:51 crc kubenswrapper[4732]: I0131 09:21:51.714880 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-proxy-77c98d654c-ftt6w" event={"ID":"4d5994e0-d411-4d47-bcbb-1a12020906ce","Type":"ContainerDied","Data":"7aec8f717f6235c07faf20c4f2b84f8af9c000ef4913a0dcde781bd2a8cc7aa5"} Jan 31 09:21:51 crc kubenswrapper[4732]: I0131 09:21:51.714901 4732 scope.go:117] "RemoveContainer" containerID="9b1ff239a7bac22b3621f4ddf89b0954862df7c17ff58d0be28c2216ee9690d5" Jan 31 09:21:51 crc kubenswrapper[4732]: I0131 09:21:51.721939 4732 generic.go:334] "Generic (PLEG): container finished" podID="2d99525a-cb49-44dd-82c0-0bf1641ec2b5" containerID="39108093c7167b1324969b66f03261329d95d03eef264a245346322307d2c2ca" exitCode=0 Jan 31 09:21:51 crc kubenswrapper[4732]: I0131 09:21:51.721981 4732 generic.go:334] "Generic (PLEG): container finished" podID="2d99525a-cb49-44dd-82c0-0bf1641ec2b5" containerID="3e318603930992b9a03d77a16fdd999a5fbbbc52598af387e3bde175cfb85fae" exitCode=0 Jan 31 09:21:51 crc kubenswrapper[4732]: I0131 09:21:51.721992 4732 generic.go:334] "Generic (PLEG): container finished" podID="2d99525a-cb49-44dd-82c0-0bf1641ec2b5" containerID="aafbf67742419bba0b805990b5fa8c2858a89d917bc8094ccad71c66aad20c3c" exitCode=0 Jan 31 09:21:51 crc kubenswrapper[4732]: I0131 09:21:51.722003 4732 generic.go:334] "Generic (PLEG): container finished" podID="2d99525a-cb49-44dd-82c0-0bf1641ec2b5" containerID="25397f43691b59da5a03ae8b3f0545fb630ecb37363d9a82b75ece89ba8f270b" exitCode=0 Jan 31 09:21:51 crc kubenswrapper[4732]: I0131 09:21:51.722013 4732 generic.go:334] "Generic (PLEG): container finished" podID="2d99525a-cb49-44dd-82c0-0bf1641ec2b5" containerID="a018ce2a323e1b0e2aa072d2f93334f3b4e39c300d0ff2335987aa34dde530ce" exitCode=0 Jan 31 09:21:51 crc kubenswrapper[4732]: I0131 09:21:51.722023 4732 generic.go:334] "Generic (PLEG): container finished" podID="2d99525a-cb49-44dd-82c0-0bf1641ec2b5" containerID="45ccea61db4e777ffe785a3bd9a6ec4740eee0e550c2a7527a6b1997cee06c20" exitCode=0 Jan 31 09:21:51 crc kubenswrapper[4732]: I0131 09:21:51.722033 4732 generic.go:334] "Generic (PLEG): container finished" podID="2d99525a-cb49-44dd-82c0-0bf1641ec2b5" containerID="7d05377fbb40377380b56c2e05a3671b30e05ae9c65dc9991a56bb3e0d776300" exitCode=0 Jan 31 09:21:51 crc kubenswrapper[4732]: I0131 09:21:51.722042 4732 generic.go:334] "Generic (PLEG): container finished" podID="2d99525a-cb49-44dd-82c0-0bf1641ec2b5" containerID="55b7d2c297a2c51bfe7bb69e4e050254e0b55aad5ce8bd28be166e236c03b5a4" exitCode=0 Jan 31 09:21:51 crc kubenswrapper[4732]: I0131 09:21:51.722050 4732 generic.go:334] "Generic (PLEG): container finished" podID="2d99525a-cb49-44dd-82c0-0bf1641ec2b5" containerID="e3b3c2c088602818d41287973de5c283cc9912805d7ff52354faa3f8b3e0f918" exitCode=0 Jan 31 09:21:51 crc 
kubenswrapper[4732]: I0131 09:21:51.722058 4732 generic.go:334] "Generic (PLEG): container finished" podID="2d99525a-cb49-44dd-82c0-0bf1641ec2b5" containerID="86becf7472fc9c7648db29999d468c19bd3b7575b10b92c4b9928d2888a627aa" exitCode=0 Jan 31 09:21:51 crc kubenswrapper[4732]: I0131 09:21:51.722069 4732 generic.go:334] "Generic (PLEG): container finished" podID="2d99525a-cb49-44dd-82c0-0bf1641ec2b5" containerID="4eb0e325fda950d987c29c36e29c511060e69238d602a70ab284eaba918266e8" exitCode=0 Jan 31 09:21:51 crc kubenswrapper[4732]: I0131 09:21:51.722078 4732 generic.go:334] "Generic (PLEG): container finished" podID="2d99525a-cb49-44dd-82c0-0bf1641ec2b5" containerID="3ca70055c8d797fcc887af9ddd4c9b0ea366e17888bc18156cd3c6a4de4dc32b" exitCode=0 Jan 31 09:21:51 crc kubenswrapper[4732]: I0131 09:21:51.722087 4732 generic.go:334] "Generic (PLEG): container finished" podID="2d99525a-cb49-44dd-82c0-0bf1641ec2b5" containerID="a064100c3bd45fa424991f4128e58ffe5bcccf2ec5e5089c19cf6692c22ab9c5" exitCode=0 Jan 31 09:21:51 crc kubenswrapper[4732]: I0131 09:21:51.722095 4732 generic.go:334] "Generic (PLEG): container finished" podID="2d99525a-cb49-44dd-82c0-0bf1641ec2b5" containerID="47aaccf941b2ffbe89c21b9e0b3a892b4f21c0ac0a9cf298c23bf8752bd80a1a" exitCode=0 Jan 31 09:21:51 crc kubenswrapper[4732]: I0131 09:21:51.722117 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"2d99525a-cb49-44dd-82c0-0bf1641ec2b5","Type":"ContainerDied","Data":"39108093c7167b1324969b66f03261329d95d03eef264a245346322307d2c2ca"} Jan 31 09:21:51 crc kubenswrapper[4732]: I0131 09:21:51.722146 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"2d99525a-cb49-44dd-82c0-0bf1641ec2b5","Type":"ContainerDied","Data":"3e318603930992b9a03d77a16fdd999a5fbbbc52598af387e3bde175cfb85fae"} Jan 31 09:21:51 crc kubenswrapper[4732]: I0131 09:21:51.722160 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"2d99525a-cb49-44dd-82c0-0bf1641ec2b5","Type":"ContainerDied","Data":"aafbf67742419bba0b805990b5fa8c2858a89d917bc8094ccad71c66aad20c3c"} Jan 31 09:21:51 crc kubenswrapper[4732]: I0131 09:21:51.722176 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"2d99525a-cb49-44dd-82c0-0bf1641ec2b5","Type":"ContainerDied","Data":"25397f43691b59da5a03ae8b3f0545fb630ecb37363d9a82b75ece89ba8f270b"} Jan 31 09:21:51 crc kubenswrapper[4732]: I0131 09:21:51.722190 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"2d99525a-cb49-44dd-82c0-0bf1641ec2b5","Type":"ContainerDied","Data":"a018ce2a323e1b0e2aa072d2f93334f3b4e39c300d0ff2335987aa34dde530ce"} Jan 31 09:21:51 crc kubenswrapper[4732]: I0131 09:21:51.722202 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"2d99525a-cb49-44dd-82c0-0bf1641ec2b5","Type":"ContainerDied","Data":"45ccea61db4e777ffe785a3bd9a6ec4740eee0e550c2a7527a6b1997cee06c20"} Jan 31 09:21:51 crc kubenswrapper[4732]: I0131 09:21:51.722210 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"2d99525a-cb49-44dd-82c0-0bf1641ec2b5","Type":"ContainerDied","Data":"7d05377fbb40377380b56c2e05a3671b30e05ae9c65dc9991a56bb3e0d776300"} Jan 31 09:21:51 crc kubenswrapper[4732]: I0131 09:21:51.722221 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="swift-kuttl-tests/swift-storage-0" event={"ID":"2d99525a-cb49-44dd-82c0-0bf1641ec2b5","Type":"ContainerDied","Data":"55b7d2c297a2c51bfe7bb69e4e050254e0b55aad5ce8bd28be166e236c03b5a4"} Jan 31 09:21:51 crc kubenswrapper[4732]: I0131 09:21:51.722233 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"2d99525a-cb49-44dd-82c0-0bf1641ec2b5","Type":"ContainerDied","Data":"e3b3c2c088602818d41287973de5c283cc9912805d7ff52354faa3f8b3e0f918"} Jan 31 09:21:51 crc kubenswrapper[4732]: I0131 09:21:51.722245 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"2d99525a-cb49-44dd-82c0-0bf1641ec2b5","Type":"ContainerDied","Data":"86becf7472fc9c7648db29999d468c19bd3b7575b10b92c4b9928d2888a627aa"} Jan 31 09:21:51 crc kubenswrapper[4732]: I0131 09:21:51.722256 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"2d99525a-cb49-44dd-82c0-0bf1641ec2b5","Type":"ContainerDied","Data":"4eb0e325fda950d987c29c36e29c511060e69238d602a70ab284eaba918266e8"} Jan 31 09:21:51 crc kubenswrapper[4732]: I0131 09:21:51.722264 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"2d99525a-cb49-44dd-82c0-0bf1641ec2b5","Type":"ContainerDied","Data":"3ca70055c8d797fcc887af9ddd4c9b0ea366e17888bc18156cd3c6a4de4dc32b"} Jan 31 09:21:51 crc kubenswrapper[4732]: I0131 09:21:51.722274 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"2d99525a-cb49-44dd-82c0-0bf1641ec2b5","Type":"ContainerDied","Data":"a064100c3bd45fa424991f4128e58ffe5bcccf2ec5e5089c19cf6692c22ab9c5"} Jan 31 09:21:51 crc kubenswrapper[4732]: I0131 09:21:51.722284 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"2d99525a-cb49-44dd-82c0-0bf1641ec2b5","Type":"ContainerDied","Data":"47aaccf941b2ffbe89c21b9e0b3a892b4f21c0ac0a9cf298c23bf8752bd80a1a"} Jan 31 09:21:51 crc kubenswrapper[4732]: I0131 09:21:51.735303 4732 scope.go:117] "RemoveContainer" containerID="db0d291a904e63a804b3e8e9f1722ea742a8b5887a09628e8fd5a89d8eceb3a1" Jan 31 09:21:51 crc kubenswrapper[4732]: I0131 09:21:51.747901 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-proxy-77c98d654c-ftt6w"] Jan 31 09:21:51 crc kubenswrapper[4732]: I0131 09:21:51.753573 4732 scope.go:117] "RemoveContainer" containerID="9b1ff239a7bac22b3621f4ddf89b0954862df7c17ff58d0be28c2216ee9690d5" Jan 31 09:21:51 crc kubenswrapper[4732]: E0131 09:21:51.754632 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b1ff239a7bac22b3621f4ddf89b0954862df7c17ff58d0be28c2216ee9690d5\": container with ID starting with 9b1ff239a7bac22b3621f4ddf89b0954862df7c17ff58d0be28c2216ee9690d5 not found: ID does not exist" containerID="9b1ff239a7bac22b3621f4ddf89b0954862df7c17ff58d0be28c2216ee9690d5" Jan 31 09:21:51 crc kubenswrapper[4732]: I0131 09:21:51.754706 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b1ff239a7bac22b3621f4ddf89b0954862df7c17ff58d0be28c2216ee9690d5"} err="failed to get container status \"9b1ff239a7bac22b3621f4ddf89b0954862df7c17ff58d0be28c2216ee9690d5\": rpc error: code = NotFound desc = could not find container \"9b1ff239a7bac22b3621f4ddf89b0954862df7c17ff58d0be28c2216ee9690d5\": container with ID starting with 
9b1ff239a7bac22b3621f4ddf89b0954862df7c17ff58d0be28c2216ee9690d5 not found: ID does not exist" Jan 31 09:21:51 crc kubenswrapper[4732]: I0131 09:21:51.754743 4732 scope.go:117] "RemoveContainer" containerID="db0d291a904e63a804b3e8e9f1722ea742a8b5887a09628e8fd5a89d8eceb3a1" Jan 31 09:21:51 crc kubenswrapper[4732]: E0131 09:21:51.755059 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db0d291a904e63a804b3e8e9f1722ea742a8b5887a09628e8fd5a89d8eceb3a1\": container with ID starting with db0d291a904e63a804b3e8e9f1722ea742a8b5887a09628e8fd5a89d8eceb3a1 not found: ID does not exist" containerID="db0d291a904e63a804b3e8e9f1722ea742a8b5887a09628e8fd5a89d8eceb3a1" Jan 31 09:21:51 crc kubenswrapper[4732]: I0131 09:21:51.755100 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db0d291a904e63a804b3e8e9f1722ea742a8b5887a09628e8fd5a89d8eceb3a1"} err="failed to get container status \"db0d291a904e63a804b3e8e9f1722ea742a8b5887a09628e8fd5a89d8eceb3a1\": rpc error: code = NotFound desc = could not find container \"db0d291a904e63a804b3e8e9f1722ea742a8b5887a09628e8fd5a89d8eceb3a1\": container with ID starting with db0d291a904e63a804b3e8e9f1722ea742a8b5887a09628e8fd5a89d8eceb3a1 not found: ID does not exist" Jan 31 09:21:51 crc kubenswrapper[4732]: I0131 09:21:51.755119 4732 scope.go:117] "RemoveContainer" containerID="9b1ff239a7bac22b3621f4ddf89b0954862df7c17ff58d0be28c2216ee9690d5" Jan 31 09:21:51 crc kubenswrapper[4732]: I0131 09:21:51.755376 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b1ff239a7bac22b3621f4ddf89b0954862df7c17ff58d0be28c2216ee9690d5"} err="failed to get container status \"9b1ff239a7bac22b3621f4ddf89b0954862df7c17ff58d0be28c2216ee9690d5\": rpc error: code = NotFound desc = could not find container \"9b1ff239a7bac22b3621f4ddf89b0954862df7c17ff58d0be28c2216ee9690d5\": container with ID starting with 9b1ff239a7bac22b3621f4ddf89b0954862df7c17ff58d0be28c2216ee9690d5 not found: ID does not exist" Jan 31 09:21:51 crc kubenswrapper[4732]: I0131 09:21:51.755405 4732 scope.go:117] "RemoveContainer" containerID="db0d291a904e63a804b3e8e9f1722ea742a8b5887a09628e8fd5a89d8eceb3a1" Jan 31 09:21:51 crc kubenswrapper[4732]: I0131 09:21:51.755725 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db0d291a904e63a804b3e8e9f1722ea742a8b5887a09628e8fd5a89d8eceb3a1"} err="failed to get container status \"db0d291a904e63a804b3e8e9f1722ea742a8b5887a09628e8fd5a89d8eceb3a1\": rpc error: code = NotFound desc = could not find container \"db0d291a904e63a804b3e8e9f1722ea742a8b5887a09628e8fd5a89d8eceb3a1\": container with ID starting with db0d291a904e63a804b3e8e9f1722ea742a8b5887a09628e8fd5a89d8eceb3a1 not found: ID does not exist" Jan 31 09:21:51 crc kubenswrapper[4732]: I0131 09:21:51.760871 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-proxy-77c98d654c-ftt6w"] Jan 31 09:21:52 crc kubenswrapper[4732]: I0131 09:21:52.557998 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d5994e0-d411-4d47-bcbb-1a12020906ce" path="/var/lib/kubelet/pods/4d5994e0-d411-4d47-bcbb-1a12020906ce/volumes" Jan 31 09:22:17 crc kubenswrapper[4732]: I0131 09:22:17.497409 4732 patch_prober.go:28] interesting pod/machine-config-daemon-jnbt8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 09:22:17 crc kubenswrapper[4732]: I0131 09:22:17.497984 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" podUID="7d790207-d357-4b47-87bf-5b505e061820" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 09:22:20 crc kubenswrapper[4732]: E0131 09:22:20.904996 4732 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2d99525a_cb49_44dd_82c0_0bf1641ec2b5.slice/crio-conmon-545bc0375b464b90da540c011cf7e399ffc165b32303354c396c826e74d84e25.scope\": RecentStats: unable to find data in memory cache]" Jan 31 09:22:20 crc kubenswrapper[4732]: I0131 09:22:20.981413 4732 generic.go:334] "Generic (PLEG): container finished" podID="2d99525a-cb49-44dd-82c0-0bf1641ec2b5" containerID="545bc0375b464b90da540c011cf7e399ffc165b32303354c396c826e74d84e25" exitCode=137 Jan 31 09:22:20 crc kubenswrapper[4732]: I0131 09:22:20.981777 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"2d99525a-cb49-44dd-82c0-0bf1641ec2b5","Type":"ContainerDied","Data":"545bc0375b464b90da540c011cf7e399ffc165b32303354c396c826e74d84e25"} Jan 31 09:22:21 crc kubenswrapper[4732]: I0131 09:22:21.047885 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:22:21 crc kubenswrapper[4732]: I0131 09:22:21.188036 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8qccj\" (UniqueName: \"kubernetes.io/projected/2d99525a-cb49-44dd-82c0-0bf1641ec2b5-kube-api-access-8qccj\") pod \"2d99525a-cb49-44dd-82c0-0bf1641ec2b5\" (UID: \"2d99525a-cb49-44dd-82c0-0bf1641ec2b5\") " Jan 31 09:22:21 crc kubenswrapper[4732]: I0131 09:22:21.188116 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/2d99525a-cb49-44dd-82c0-0bf1641ec2b5-lock\") pod \"2d99525a-cb49-44dd-82c0-0bf1641ec2b5\" (UID: \"2d99525a-cb49-44dd-82c0-0bf1641ec2b5\") " Jan 31 09:22:21 crc kubenswrapper[4732]: I0131 09:22:21.188141 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/2d99525a-cb49-44dd-82c0-0bf1641ec2b5-cache\") pod \"2d99525a-cb49-44dd-82c0-0bf1641ec2b5\" (UID: \"2d99525a-cb49-44dd-82c0-0bf1641ec2b5\") " Jan 31 09:22:21 crc kubenswrapper[4732]: I0131 09:22:21.188184 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"2d99525a-cb49-44dd-82c0-0bf1641ec2b5\" (UID: \"2d99525a-cb49-44dd-82c0-0bf1641ec2b5\") " Jan 31 09:22:21 crc kubenswrapper[4732]: I0131 09:22:21.188209 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2d99525a-cb49-44dd-82c0-0bf1641ec2b5-etc-swift\") pod \"2d99525a-cb49-44dd-82c0-0bf1641ec2b5\" (UID: \"2d99525a-cb49-44dd-82c0-0bf1641ec2b5\") " Jan 31 09:22:21 crc kubenswrapper[4732]: I0131 09:22:21.188721 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/2d99525a-cb49-44dd-82c0-0bf1641ec2b5-lock" (OuterVolumeSpecName: "lock") pod "2d99525a-cb49-44dd-82c0-0bf1641ec2b5" (UID: "2d99525a-cb49-44dd-82c0-0bf1641ec2b5"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:22:21 crc kubenswrapper[4732]: I0131 09:22:21.188933 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d99525a-cb49-44dd-82c0-0bf1641ec2b5-cache" (OuterVolumeSpecName: "cache") pod "2d99525a-cb49-44dd-82c0-0bf1641ec2b5" (UID: "2d99525a-cb49-44dd-82c0-0bf1641ec2b5"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:22:21 crc kubenswrapper[4732]: I0131 09:22:21.194215 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "swift") pod "2d99525a-cb49-44dd-82c0-0bf1641ec2b5" (UID: "2d99525a-cb49-44dd-82c0-0bf1641ec2b5"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 09:22:21 crc kubenswrapper[4732]: I0131 09:22:21.194240 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d99525a-cb49-44dd-82c0-0bf1641ec2b5-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "2d99525a-cb49-44dd-82c0-0bf1641ec2b5" (UID: "2d99525a-cb49-44dd-82c0-0bf1641ec2b5"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:22:21 crc kubenswrapper[4732]: I0131 09:22:21.194266 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d99525a-cb49-44dd-82c0-0bf1641ec2b5-kube-api-access-8qccj" (OuterVolumeSpecName: "kube-api-access-8qccj") pod "2d99525a-cb49-44dd-82c0-0bf1641ec2b5" (UID: "2d99525a-cb49-44dd-82c0-0bf1641ec2b5"). InnerVolumeSpecName "kube-api-access-8qccj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:22:21 crc kubenswrapper[4732]: I0131 09:22:21.290481 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8qccj\" (UniqueName: \"kubernetes.io/projected/2d99525a-cb49-44dd-82c0-0bf1641ec2b5-kube-api-access-8qccj\") on node \"crc\" DevicePath \"\"" Jan 31 09:22:21 crc kubenswrapper[4732]: I0131 09:22:21.290533 4732 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/2d99525a-cb49-44dd-82c0-0bf1641ec2b5-lock\") on node \"crc\" DevicePath \"\"" Jan 31 09:22:21 crc kubenswrapper[4732]: I0131 09:22:21.290550 4732 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/2d99525a-cb49-44dd-82c0-0bf1641ec2b5-cache\") on node \"crc\" DevicePath \"\"" Jan 31 09:22:21 crc kubenswrapper[4732]: I0131 09:22:21.290595 4732 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Jan 31 09:22:21 crc kubenswrapper[4732]: I0131 09:22:21.290612 4732 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2d99525a-cb49-44dd-82c0-0bf1641ec2b5-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 31 09:22:21 crc kubenswrapper[4732]: I0131 09:22:21.311530 4732 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Jan 31 09:22:21 crc kubenswrapper[4732]: I0131 09:22:21.392284 4732 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Jan 31 09:22:22 crc kubenswrapper[4732]: I0131 09:22:22.032816 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"2d99525a-cb49-44dd-82c0-0bf1641ec2b5","Type":"ContainerDied","Data":"bb37e542519324244c6258336774da979636b6257f7ee208e0822166576c6a8d"} Jan 31 09:22:22 crc kubenswrapper[4732]: I0131 09:22:22.032903 4732 scope.go:117] "RemoveContainer" containerID="545bc0375b464b90da540c011cf7e399ffc165b32303354c396c826e74d84e25" Jan 31 09:22:22 crc kubenswrapper[4732]: I0131 09:22:22.033051 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:22:22 crc kubenswrapper[4732]: I0131 09:22:22.056745 4732 scope.go:117] "RemoveContainer" containerID="39108093c7167b1324969b66f03261329d95d03eef264a245346322307d2c2ca" Jan 31 09:22:22 crc kubenswrapper[4732]: I0131 09:22:22.082571 4732 scope.go:117] "RemoveContainer" containerID="3e318603930992b9a03d77a16fdd999a5fbbbc52598af387e3bde175cfb85fae" Jan 31 09:22:22 crc kubenswrapper[4732]: I0131 09:22:22.093223 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-storage-0"] Jan 31 09:22:22 crc kubenswrapper[4732]: I0131 09:22:22.103073 4732 scope.go:117] "RemoveContainer" containerID="aafbf67742419bba0b805990b5fa8c2858a89d917bc8094ccad71c66aad20c3c" Jan 31 09:22:22 crc kubenswrapper[4732]: I0131 09:22:22.104910 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-storage-0"] Jan 31 09:22:22 crc kubenswrapper[4732]: I0131 09:22:22.116903 4732 scope.go:117] "RemoveContainer" containerID="25397f43691b59da5a03ae8b3f0545fb630ecb37363d9a82b75ece89ba8f270b" Jan 31 09:22:22 crc kubenswrapper[4732]: I0131 09:22:22.130639 4732 scope.go:117] "RemoveContainer" containerID="a018ce2a323e1b0e2aa072d2f93334f3b4e39c300d0ff2335987aa34dde530ce" Jan 31 09:22:22 crc kubenswrapper[4732]: I0131 09:22:22.145557 4732 scope.go:117] "RemoveContainer" containerID="45ccea61db4e777ffe785a3bd9a6ec4740eee0e550c2a7527a6b1997cee06c20" Jan 31 09:22:22 crc kubenswrapper[4732]: I0131 09:22:22.161595 4732 scope.go:117] "RemoveContainer" containerID="7d05377fbb40377380b56c2e05a3671b30e05ae9c65dc9991a56bb3e0d776300" Jan 31 09:22:22 crc kubenswrapper[4732]: I0131 09:22:22.176431 4732 scope.go:117] "RemoveContainer" containerID="55b7d2c297a2c51bfe7bb69e4e050254e0b55aad5ce8bd28be166e236c03b5a4" Jan 31 09:22:22 crc kubenswrapper[4732]: I0131 09:22:22.192076 4732 scope.go:117] "RemoveContainer" containerID="e3b3c2c088602818d41287973de5c283cc9912805d7ff52354faa3f8b3e0f918" Jan 31 09:22:22 crc kubenswrapper[4732]: I0131 09:22:22.208467 4732 scope.go:117] "RemoveContainer" containerID="86becf7472fc9c7648db29999d468c19bd3b7575b10b92c4b9928d2888a627aa" Jan 31 09:22:22 crc kubenswrapper[4732]: I0131 09:22:22.230549 4732 scope.go:117] "RemoveContainer" containerID="4eb0e325fda950d987c29c36e29c511060e69238d602a70ab284eaba918266e8" Jan 31 09:22:22 crc kubenswrapper[4732]: I0131 09:22:22.245881 4732 scope.go:117] "RemoveContainer" containerID="3ca70055c8d797fcc887af9ddd4c9b0ea366e17888bc18156cd3c6a4de4dc32b" Jan 31 09:22:22 crc kubenswrapper[4732]: I0131 09:22:22.262481 4732 scope.go:117] "RemoveContainer" containerID="a064100c3bd45fa424991f4128e58ffe5bcccf2ec5e5089c19cf6692c22ab9c5" Jan 31 09:22:22 crc kubenswrapper[4732]: I0131 09:22:22.281550 4732 scope.go:117] "RemoveContainer" containerID="47aaccf941b2ffbe89c21b9e0b3a892b4f21c0ac0a9cf298c23bf8752bd80a1a" Jan 31 09:22:22 crc kubenswrapper[4732]: I0131 09:22:22.550545 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d99525a-cb49-44dd-82c0-0bf1641ec2b5" path="/var/lib/kubelet/pods/2d99525a-cb49-44dd-82c0-0bf1641ec2b5/volumes" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.108770 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-proxy-5c474fc7f4-zrlwm"] Jan 31 09:22:24 crc kubenswrapper[4732]: E0131 09:22:24.109074 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d99525a-cb49-44dd-82c0-0bf1641ec2b5" containerName="container-server" Jan 31 09:22:24 crc kubenswrapper[4732]: 
I0131 09:22:24.109090 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d99525a-cb49-44dd-82c0-0bf1641ec2b5" containerName="container-server" Jan 31 09:22:24 crc kubenswrapper[4732]: E0131 09:22:24.109097 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d99525a-cb49-44dd-82c0-0bf1641ec2b5" containerName="object-auditor" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.109103 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d99525a-cb49-44dd-82c0-0bf1641ec2b5" containerName="object-auditor" Jan 31 09:22:24 crc kubenswrapper[4732]: E0131 09:22:24.109120 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d5994e0-d411-4d47-bcbb-1a12020906ce" containerName="proxy-httpd" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.109126 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d5994e0-d411-4d47-bcbb-1a12020906ce" containerName="proxy-httpd" Jan 31 09:22:24 crc kubenswrapper[4732]: E0131 09:22:24.109139 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d99525a-cb49-44dd-82c0-0bf1641ec2b5" containerName="swift-recon-cron" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.109145 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d99525a-cb49-44dd-82c0-0bf1641ec2b5" containerName="swift-recon-cron" Jan 31 09:22:24 crc kubenswrapper[4732]: E0131 09:22:24.109151 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d99525a-cb49-44dd-82c0-0bf1641ec2b5" containerName="container-auditor" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.109157 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d99525a-cb49-44dd-82c0-0bf1641ec2b5" containerName="container-auditor" Jan 31 09:22:24 crc kubenswrapper[4732]: E0131 09:22:24.109163 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d99525a-cb49-44dd-82c0-0bf1641ec2b5" containerName="object-replicator" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.109169 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d99525a-cb49-44dd-82c0-0bf1641ec2b5" containerName="object-replicator" Jan 31 09:22:24 crc kubenswrapper[4732]: E0131 09:22:24.109178 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d99525a-cb49-44dd-82c0-0bf1641ec2b5" containerName="object-server" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.109183 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d99525a-cb49-44dd-82c0-0bf1641ec2b5" containerName="object-server" Jan 31 09:22:24 crc kubenswrapper[4732]: E0131 09:22:24.109191 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c80b3a6-8701-4276-a6a2-80913e60ea9a" containerName="swift-ring-rebalance" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.109196 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c80b3a6-8701-4276-a6a2-80913e60ea9a" containerName="swift-ring-rebalance" Jan 31 09:22:24 crc kubenswrapper[4732]: E0131 09:22:24.109207 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d99525a-cb49-44dd-82c0-0bf1641ec2b5" containerName="account-replicator" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.109213 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d99525a-cb49-44dd-82c0-0bf1641ec2b5" containerName="account-replicator" Jan 31 09:22:24 crc kubenswrapper[4732]: E0131 09:22:24.109224 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d99525a-cb49-44dd-82c0-0bf1641ec2b5" containerName="account-auditor" Jan 31 
09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.109229 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d99525a-cb49-44dd-82c0-0bf1641ec2b5" containerName="account-auditor" Jan 31 09:22:24 crc kubenswrapper[4732]: E0131 09:22:24.109238 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d99525a-cb49-44dd-82c0-0bf1641ec2b5" containerName="container-replicator" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.109244 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d99525a-cb49-44dd-82c0-0bf1641ec2b5" containerName="container-replicator" Jan 31 09:22:24 crc kubenswrapper[4732]: E0131 09:22:24.109255 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d99525a-cb49-44dd-82c0-0bf1641ec2b5" containerName="account-reaper" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.109260 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d99525a-cb49-44dd-82c0-0bf1641ec2b5" containerName="account-reaper" Jan 31 09:22:24 crc kubenswrapper[4732]: E0131 09:22:24.109266 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d99525a-cb49-44dd-82c0-0bf1641ec2b5" containerName="object-updater" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.109274 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d99525a-cb49-44dd-82c0-0bf1641ec2b5" containerName="object-updater" Jan 31 09:22:24 crc kubenswrapper[4732]: E0131 09:22:24.109280 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d5994e0-d411-4d47-bcbb-1a12020906ce" containerName="proxy-server" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.109285 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d5994e0-d411-4d47-bcbb-1a12020906ce" containerName="proxy-server" Jan 31 09:22:24 crc kubenswrapper[4732]: E0131 09:22:24.109294 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d99525a-cb49-44dd-82c0-0bf1641ec2b5" containerName="container-updater" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.109299 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d99525a-cb49-44dd-82c0-0bf1641ec2b5" containerName="container-updater" Jan 31 09:22:24 crc kubenswrapper[4732]: E0131 09:22:24.109306 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d99525a-cb49-44dd-82c0-0bf1641ec2b5" containerName="rsync" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.109311 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d99525a-cb49-44dd-82c0-0bf1641ec2b5" containerName="rsync" Jan 31 09:22:24 crc kubenswrapper[4732]: E0131 09:22:24.109319 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d99525a-cb49-44dd-82c0-0bf1641ec2b5" containerName="account-server" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.109325 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d99525a-cb49-44dd-82c0-0bf1641ec2b5" containerName="account-server" Jan 31 09:22:24 crc kubenswrapper[4732]: E0131 09:22:24.109334 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d99525a-cb49-44dd-82c0-0bf1641ec2b5" containerName="object-expirer" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.109341 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d99525a-cb49-44dd-82c0-0bf1641ec2b5" containerName="object-expirer" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.109463 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d99525a-cb49-44dd-82c0-0bf1641ec2b5" containerName="account-server" Jan 31 
09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.109472 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d99525a-cb49-44dd-82c0-0bf1641ec2b5" containerName="account-auditor" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.109481 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d99525a-cb49-44dd-82c0-0bf1641ec2b5" containerName="object-server" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.109487 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d99525a-cb49-44dd-82c0-0bf1641ec2b5" containerName="container-updater" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.109495 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d99525a-cb49-44dd-82c0-0bf1641ec2b5" containerName="object-updater" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.109504 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d99525a-cb49-44dd-82c0-0bf1641ec2b5" containerName="object-replicator" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.109512 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d99525a-cb49-44dd-82c0-0bf1641ec2b5" containerName="container-replicator" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.109520 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c80b3a6-8701-4276-a6a2-80913e60ea9a" containerName="swift-ring-rebalance" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.109528 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d99525a-cb49-44dd-82c0-0bf1641ec2b5" containerName="object-auditor" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.109536 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d99525a-cb49-44dd-82c0-0bf1641ec2b5" containerName="container-server" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.109544 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d99525a-cb49-44dd-82c0-0bf1641ec2b5" containerName="account-replicator" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.109552 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d5994e0-d411-4d47-bcbb-1a12020906ce" containerName="proxy-httpd" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.109560 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d99525a-cb49-44dd-82c0-0bf1641ec2b5" containerName="container-auditor" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.109568 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d99525a-cb49-44dd-82c0-0bf1641ec2b5" containerName="object-expirer" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.109576 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d5994e0-d411-4d47-bcbb-1a12020906ce" containerName="proxy-server" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.109585 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d99525a-cb49-44dd-82c0-0bf1641ec2b5" containerName="account-reaper" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.109592 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d99525a-cb49-44dd-82c0-0bf1641ec2b5" containerName="rsync" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.109600 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d99525a-cb49-44dd-82c0-0bf1641ec2b5" containerName="swift-recon-cron" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.110325 4732 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="swift-kuttl-tests/swift-proxy-5c474fc7f4-zrlwm" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.113898 4732 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"swift-swift-dockercfg-rq6ln" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.113933 4732 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"swift-conf" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.113898 4732 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"swift-proxy-config-data" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.114021 4732 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"cert-swift-internal-svc" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.114052 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-files" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.114250 4732 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"combined-ca-bundle" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.117454 4732 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"cert-swift-public-svc" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.127822 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-proxy-5c474fc7f4-zrlwm"] Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.180308 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-storage-0"] Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.184590 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.191946 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-storage-config-data" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.197037 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-storage-0"] Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.233142 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bdedfde8-2a77-4328-8d12-1ed7e7c383d7-run-httpd\") pod \"swift-proxy-5c474fc7f4-zrlwm\" (UID: \"bdedfde8-2a77-4328-8d12-1ed7e7c383d7\") " pod="swift-kuttl-tests/swift-proxy-5c474fc7f4-zrlwm" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.233199 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmr65\" (UniqueName: \"kubernetes.io/projected/bdedfde8-2a77-4328-8d12-1ed7e7c383d7-kube-api-access-nmr65\") pod \"swift-proxy-5c474fc7f4-zrlwm\" (UID: \"bdedfde8-2a77-4328-8d12-1ed7e7c383d7\") " pod="swift-kuttl-tests/swift-proxy-5c474fc7f4-zrlwm" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.233223 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/bdedfde8-2a77-4328-8d12-1ed7e7c383d7-etc-swift\") pod \"swift-proxy-5c474fc7f4-zrlwm\" (UID: \"bdedfde8-2a77-4328-8d12-1ed7e7c383d7\") " pod="swift-kuttl-tests/swift-proxy-5c474fc7f4-zrlwm" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.233306 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdedfde8-2a77-4328-8d12-1ed7e7c383d7-combined-ca-bundle\") pod \"swift-proxy-5c474fc7f4-zrlwm\" (UID: \"bdedfde8-2a77-4328-8d12-1ed7e7c383d7\") " pod="swift-kuttl-tests/swift-proxy-5c474fc7f4-zrlwm" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.233344 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bdedfde8-2a77-4328-8d12-1ed7e7c383d7-internal-tls-certs\") pod \"swift-proxy-5c474fc7f4-zrlwm\" (UID: \"bdedfde8-2a77-4328-8d12-1ed7e7c383d7\") " pod="swift-kuttl-tests/swift-proxy-5c474fc7f4-zrlwm" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.233438 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdedfde8-2a77-4328-8d12-1ed7e7c383d7-config-data\") pod \"swift-proxy-5c474fc7f4-zrlwm\" (UID: \"bdedfde8-2a77-4328-8d12-1ed7e7c383d7\") " pod="swift-kuttl-tests/swift-proxy-5c474fc7f4-zrlwm" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.233484 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bdedfde8-2a77-4328-8d12-1ed7e7c383d7-log-httpd\") pod \"swift-proxy-5c474fc7f4-zrlwm\" (UID: \"bdedfde8-2a77-4328-8d12-1ed7e7c383d7\") " pod="swift-kuttl-tests/swift-proxy-5c474fc7f4-zrlwm" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.233521 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bdedfde8-2a77-4328-8d12-1ed7e7c383d7-public-tls-certs\") pod \"swift-proxy-5c474fc7f4-zrlwm\" (UID: \"bdedfde8-2a77-4328-8d12-1ed7e7c383d7\") " pod="swift-kuttl-tests/swift-proxy-5c474fc7f4-zrlwm" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.334477 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bdedfde8-2a77-4328-8d12-1ed7e7c383d7-internal-tls-certs\") pod \"swift-proxy-5c474fc7f4-zrlwm\" (UID: \"bdedfde8-2a77-4328-8d12-1ed7e7c383d7\") " pod="swift-kuttl-tests/swift-proxy-5c474fc7f4-zrlwm" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.334524 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ba8f0576-6adb-407c-b8e0-e4b04f0d47e3-etc-swift\") pod \"swift-storage-0\" (UID: \"ba8f0576-6adb-407c-b8e0-e4b04f0d47e3\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.334546 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"swift-storage-0\" (UID: \"ba8f0576-6adb-407c-b8e0-e4b04f0d47e3\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.334568 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdedfde8-2a77-4328-8d12-1ed7e7c383d7-config-data\") pod \"swift-proxy-5c474fc7f4-zrlwm\" (UID: \"bdedfde8-2a77-4328-8d12-1ed7e7c383d7\") " pod="swift-kuttl-tests/swift-proxy-5c474fc7f4-zrlwm" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.334583 4732 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bdedfde8-2a77-4328-8d12-1ed7e7c383d7-log-httpd\") pod \"swift-proxy-5c474fc7f4-zrlwm\" (UID: \"bdedfde8-2a77-4328-8d12-1ed7e7c383d7\") " pod="swift-kuttl-tests/swift-proxy-5c474fc7f4-zrlwm" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.334603 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/ba8f0576-6adb-407c-b8e0-e4b04f0d47e3-lock\") pod \"swift-storage-0\" (UID: \"ba8f0576-6adb-407c-b8e0-e4b04f0d47e3\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.334620 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bdedfde8-2a77-4328-8d12-1ed7e7c383d7-public-tls-certs\") pod \"swift-proxy-5c474fc7f4-zrlwm\" (UID: \"bdedfde8-2a77-4328-8d12-1ed7e7c383d7\") " pod="swift-kuttl-tests/swift-proxy-5c474fc7f4-zrlwm" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.334642 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba8f0576-6adb-407c-b8e0-e4b04f0d47e3-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"ba8f0576-6adb-407c-b8e0-e4b04f0d47e3\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.334705 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7sjw2\" (UniqueName: \"kubernetes.io/projected/ba8f0576-6adb-407c-b8e0-e4b04f0d47e3-kube-api-access-7sjw2\") pod \"swift-storage-0\" (UID: \"ba8f0576-6adb-407c-b8e0-e4b04f0d47e3\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.334742 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bdedfde8-2a77-4328-8d12-1ed7e7c383d7-run-httpd\") pod \"swift-proxy-5c474fc7f4-zrlwm\" (UID: \"bdedfde8-2a77-4328-8d12-1ed7e7c383d7\") " pod="swift-kuttl-tests/swift-proxy-5c474fc7f4-zrlwm" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.334775 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmr65\" (UniqueName: \"kubernetes.io/projected/bdedfde8-2a77-4328-8d12-1ed7e7c383d7-kube-api-access-nmr65\") pod \"swift-proxy-5c474fc7f4-zrlwm\" (UID: \"bdedfde8-2a77-4328-8d12-1ed7e7c383d7\") " pod="swift-kuttl-tests/swift-proxy-5c474fc7f4-zrlwm" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.334797 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/bdedfde8-2a77-4328-8d12-1ed7e7c383d7-etc-swift\") pod \"swift-proxy-5c474fc7f4-zrlwm\" (UID: \"bdedfde8-2a77-4328-8d12-1ed7e7c383d7\") " pod="swift-kuttl-tests/swift-proxy-5c474fc7f4-zrlwm" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.334835 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/ba8f0576-6adb-407c-b8e0-e4b04f0d47e3-cache\") pod \"swift-storage-0\" (UID: \"ba8f0576-6adb-407c-b8e0-e4b04f0d47e3\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.334883 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdedfde8-2a77-4328-8d12-1ed7e7c383d7-combined-ca-bundle\") pod \"swift-proxy-5c474fc7f4-zrlwm\" (UID: \"bdedfde8-2a77-4328-8d12-1ed7e7c383d7\") " pod="swift-kuttl-tests/swift-proxy-5c474fc7f4-zrlwm" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.335944 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bdedfde8-2a77-4328-8d12-1ed7e7c383d7-run-httpd\") pod \"swift-proxy-5c474fc7f4-zrlwm\" (UID: \"bdedfde8-2a77-4328-8d12-1ed7e7c383d7\") " pod="swift-kuttl-tests/swift-proxy-5c474fc7f4-zrlwm" Jan 31 09:22:24 crc kubenswrapper[4732]: E0131 09:22:24.336029 4732 projected.go:288] Couldn't get configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 31 09:22:24 crc kubenswrapper[4732]: E0131 09:22:24.336045 4732 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-proxy-5c474fc7f4-zrlwm: configmap "swift-ring-files" not found Jan 31 09:22:24 crc kubenswrapper[4732]: E0131 09:22:24.336083 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bdedfde8-2a77-4328-8d12-1ed7e7c383d7-etc-swift podName:bdedfde8-2a77-4328-8d12-1ed7e7c383d7 nodeName:}" failed. No retries permitted until 2026-01-31 09:22:24.83606696 +0000 UTC m=+1283.141943164 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/bdedfde8-2a77-4328-8d12-1ed7e7c383d7-etc-swift") pod "swift-proxy-5c474fc7f4-zrlwm" (UID: "bdedfde8-2a77-4328-8d12-1ed7e7c383d7") : configmap "swift-ring-files" not found Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.336029 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bdedfde8-2a77-4328-8d12-1ed7e7c383d7-log-httpd\") pod \"swift-proxy-5c474fc7f4-zrlwm\" (UID: \"bdedfde8-2a77-4328-8d12-1ed7e7c383d7\") " pod="swift-kuttl-tests/swift-proxy-5c474fc7f4-zrlwm" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.340397 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdedfde8-2a77-4328-8d12-1ed7e7c383d7-combined-ca-bundle\") pod \"swift-proxy-5c474fc7f4-zrlwm\" (UID: \"bdedfde8-2a77-4328-8d12-1ed7e7c383d7\") " pod="swift-kuttl-tests/swift-proxy-5c474fc7f4-zrlwm" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.340664 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bdedfde8-2a77-4328-8d12-1ed7e7c383d7-internal-tls-certs\") pod \"swift-proxy-5c474fc7f4-zrlwm\" (UID: \"bdedfde8-2a77-4328-8d12-1ed7e7c383d7\") " pod="swift-kuttl-tests/swift-proxy-5c474fc7f4-zrlwm" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.341723 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdedfde8-2a77-4328-8d12-1ed7e7c383d7-config-data\") pod \"swift-proxy-5c474fc7f4-zrlwm\" (UID: \"bdedfde8-2a77-4328-8d12-1ed7e7c383d7\") " pod="swift-kuttl-tests/swift-proxy-5c474fc7f4-zrlwm" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.341850 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bdedfde8-2a77-4328-8d12-1ed7e7c383d7-public-tls-certs\") pod \"swift-proxy-5c474fc7f4-zrlwm\" (UID: 
\"bdedfde8-2a77-4328-8d12-1ed7e7c383d7\") " pod="swift-kuttl-tests/swift-proxy-5c474fc7f4-zrlwm" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.356366 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmr65\" (UniqueName: \"kubernetes.io/projected/bdedfde8-2a77-4328-8d12-1ed7e7c383d7-kube-api-access-nmr65\") pod \"swift-proxy-5c474fc7f4-zrlwm\" (UID: \"bdedfde8-2a77-4328-8d12-1ed7e7c383d7\") " pod="swift-kuttl-tests/swift-proxy-5c474fc7f4-zrlwm" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.436403 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ba8f0576-6adb-407c-b8e0-e4b04f0d47e3-etc-swift\") pod \"swift-storage-0\" (UID: \"ba8f0576-6adb-407c-b8e0-e4b04f0d47e3\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.436459 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"swift-storage-0\" (UID: \"ba8f0576-6adb-407c-b8e0-e4b04f0d47e3\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.436500 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/ba8f0576-6adb-407c-b8e0-e4b04f0d47e3-lock\") pod \"swift-storage-0\" (UID: \"ba8f0576-6adb-407c-b8e0-e4b04f0d47e3\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.436531 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba8f0576-6adb-407c-b8e0-e4b04f0d47e3-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"ba8f0576-6adb-407c-b8e0-e4b04f0d47e3\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.436573 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7sjw2\" (UniqueName: \"kubernetes.io/projected/ba8f0576-6adb-407c-b8e0-e4b04f0d47e3-kube-api-access-7sjw2\") pod \"swift-storage-0\" (UID: \"ba8f0576-6adb-407c-b8e0-e4b04f0d47e3\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.436633 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/ba8f0576-6adb-407c-b8e0-e4b04f0d47e3-cache\") pod \"swift-storage-0\" (UID: \"ba8f0576-6adb-407c-b8e0-e4b04f0d47e3\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.437150 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/ba8f0576-6adb-407c-b8e0-e4b04f0d47e3-cache\") pod \"swift-storage-0\" (UID: \"ba8f0576-6adb-407c-b8e0-e4b04f0d47e3\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:22:24 crc kubenswrapper[4732]: E0131 09:22:24.437273 4732 projected.go:288] Couldn't get configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 31 09:22:24 crc kubenswrapper[4732]: E0131 09:22:24.437295 4732 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 31 09:22:24 crc kubenswrapper[4732]: E0131 09:22:24.437337 4732 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/ba8f0576-6adb-407c-b8e0-e4b04f0d47e3-etc-swift podName:ba8f0576-6adb-407c-b8e0-e4b04f0d47e3 nodeName:}" failed. No retries permitted until 2026-01-31 09:22:24.937321337 +0000 UTC m=+1283.243197541 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/ba8f0576-6adb-407c-b8e0-e4b04f0d47e3-etc-swift") pod "swift-storage-0" (UID: "ba8f0576-6adb-407c-b8e0-e4b04f0d47e3") : configmap "swift-ring-files" not found Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.437880 4732 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"swift-storage-0\" (UID: \"ba8f0576-6adb-407c-b8e0-e4b04f0d47e3\") device mount path \"/mnt/openstack/pv11\"" pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.438159 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/ba8f0576-6adb-407c-b8e0-e4b04f0d47e3-lock\") pod \"swift-storage-0\" (UID: \"ba8f0576-6adb-407c-b8e0-e4b04f0d47e3\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.441438 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba8f0576-6adb-407c-b8e0-e4b04f0d47e3-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"ba8f0576-6adb-407c-b8e0-e4b04f0d47e3\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.455872 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"swift-storage-0\" (UID: \"ba8f0576-6adb-407c-b8e0-e4b04f0d47e3\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.457460 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7sjw2\" (UniqueName: \"kubernetes.io/projected/ba8f0576-6adb-407c-b8e0-e4b04f0d47e3-kube-api-access-7sjw2\") pod \"swift-storage-0\" (UID: \"ba8f0576-6adb-407c-b8e0-e4b04f0d47e3\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.568088 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-mkrwv"] Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.568904 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-mkrwv" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.573075 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.573865 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.577248 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-mkrwv"] Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.740451 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/84cedd57-5030-425a-8567-ceeda6aa0109-scripts\") pod \"swift-ring-rebalance-mkrwv\" (UID: \"84cedd57-5030-425a-8567-ceeda6aa0109\") " pod="swift-kuttl-tests/swift-ring-rebalance-mkrwv" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.740502 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9np9m\" (UniqueName: \"kubernetes.io/projected/84cedd57-5030-425a-8567-ceeda6aa0109-kube-api-access-9np9m\") pod \"swift-ring-rebalance-mkrwv\" (UID: \"84cedd57-5030-425a-8567-ceeda6aa0109\") " pod="swift-kuttl-tests/swift-ring-rebalance-mkrwv" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.740585 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/84cedd57-5030-425a-8567-ceeda6aa0109-dispersionconf\") pod \"swift-ring-rebalance-mkrwv\" (UID: \"84cedd57-5030-425a-8567-ceeda6aa0109\") " pod="swift-kuttl-tests/swift-ring-rebalance-mkrwv" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.740731 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/84cedd57-5030-425a-8567-ceeda6aa0109-ring-data-devices\") pod \"swift-ring-rebalance-mkrwv\" (UID: \"84cedd57-5030-425a-8567-ceeda6aa0109\") " pod="swift-kuttl-tests/swift-ring-rebalance-mkrwv" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.740829 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84cedd57-5030-425a-8567-ceeda6aa0109-combined-ca-bundle\") pod \"swift-ring-rebalance-mkrwv\" (UID: \"84cedd57-5030-425a-8567-ceeda6aa0109\") " pod="swift-kuttl-tests/swift-ring-rebalance-mkrwv" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.740954 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/84cedd57-5030-425a-8567-ceeda6aa0109-swiftconf\") pod \"swift-ring-rebalance-mkrwv\" (UID: \"84cedd57-5030-425a-8567-ceeda6aa0109\") " pod="swift-kuttl-tests/swift-ring-rebalance-mkrwv" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.741013 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/84cedd57-5030-425a-8567-ceeda6aa0109-etc-swift\") pod \"swift-ring-rebalance-mkrwv\" (UID: \"84cedd57-5030-425a-8567-ceeda6aa0109\") " pod="swift-kuttl-tests/swift-ring-rebalance-mkrwv" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.842189 4732 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9np9m\" (UniqueName: \"kubernetes.io/projected/84cedd57-5030-425a-8567-ceeda6aa0109-kube-api-access-9np9m\") pod \"swift-ring-rebalance-mkrwv\" (UID: \"84cedd57-5030-425a-8567-ceeda6aa0109\") " pod="swift-kuttl-tests/swift-ring-rebalance-mkrwv" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.842270 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/84cedd57-5030-425a-8567-ceeda6aa0109-dispersionconf\") pod \"swift-ring-rebalance-mkrwv\" (UID: \"84cedd57-5030-425a-8567-ceeda6aa0109\") " pod="swift-kuttl-tests/swift-ring-rebalance-mkrwv" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.842323 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/84cedd57-5030-425a-8567-ceeda6aa0109-ring-data-devices\") pod \"swift-ring-rebalance-mkrwv\" (UID: \"84cedd57-5030-425a-8567-ceeda6aa0109\") " pod="swift-kuttl-tests/swift-ring-rebalance-mkrwv" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.842384 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84cedd57-5030-425a-8567-ceeda6aa0109-combined-ca-bundle\") pod \"swift-ring-rebalance-mkrwv\" (UID: \"84cedd57-5030-425a-8567-ceeda6aa0109\") " pod="swift-kuttl-tests/swift-ring-rebalance-mkrwv" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.842447 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/bdedfde8-2a77-4328-8d12-1ed7e7c383d7-etc-swift\") pod \"swift-proxy-5c474fc7f4-zrlwm\" (UID: \"bdedfde8-2a77-4328-8d12-1ed7e7c383d7\") " pod="swift-kuttl-tests/swift-proxy-5c474fc7f4-zrlwm" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.842497 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/84cedd57-5030-425a-8567-ceeda6aa0109-swiftconf\") pod \"swift-ring-rebalance-mkrwv\" (UID: \"84cedd57-5030-425a-8567-ceeda6aa0109\") " pod="swift-kuttl-tests/swift-ring-rebalance-mkrwv" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.842577 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/84cedd57-5030-425a-8567-ceeda6aa0109-etc-swift\") pod \"swift-ring-rebalance-mkrwv\" (UID: \"84cedd57-5030-425a-8567-ceeda6aa0109\") " pod="swift-kuttl-tests/swift-ring-rebalance-mkrwv" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.842685 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/84cedd57-5030-425a-8567-ceeda6aa0109-scripts\") pod \"swift-ring-rebalance-mkrwv\" (UID: \"84cedd57-5030-425a-8567-ceeda6aa0109\") " pod="swift-kuttl-tests/swift-ring-rebalance-mkrwv" Jan 31 09:22:24 crc kubenswrapper[4732]: E0131 09:22:24.843135 4732 projected.go:288] Couldn't get configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 31 09:22:24 crc kubenswrapper[4732]: E0131 09:22:24.843198 4732 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-proxy-5c474fc7f4-zrlwm: configmap "swift-ring-files" not found Jan 31 09:22:24 crc kubenswrapper[4732]: E0131 09:22:24.843309 4732 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bdedfde8-2a77-4328-8d12-1ed7e7c383d7-etc-swift podName:bdedfde8-2a77-4328-8d12-1ed7e7c383d7 nodeName:}" failed. No retries permitted until 2026-01-31 09:22:25.843260255 +0000 UTC m=+1284.149136499 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/bdedfde8-2a77-4328-8d12-1ed7e7c383d7-etc-swift") pod "swift-proxy-5c474fc7f4-zrlwm" (UID: "bdedfde8-2a77-4328-8d12-1ed7e7c383d7") : configmap "swift-ring-files" not found Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.844088 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/84cedd57-5030-425a-8567-ceeda6aa0109-etc-swift\") pod \"swift-ring-rebalance-mkrwv\" (UID: \"84cedd57-5030-425a-8567-ceeda6aa0109\") " pod="swift-kuttl-tests/swift-ring-rebalance-mkrwv" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.844319 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/84cedd57-5030-425a-8567-ceeda6aa0109-scripts\") pod \"swift-ring-rebalance-mkrwv\" (UID: \"84cedd57-5030-425a-8567-ceeda6aa0109\") " pod="swift-kuttl-tests/swift-ring-rebalance-mkrwv" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.844835 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/84cedd57-5030-425a-8567-ceeda6aa0109-ring-data-devices\") pod \"swift-ring-rebalance-mkrwv\" (UID: \"84cedd57-5030-425a-8567-ceeda6aa0109\") " pod="swift-kuttl-tests/swift-ring-rebalance-mkrwv" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.848266 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84cedd57-5030-425a-8567-ceeda6aa0109-combined-ca-bundle\") pod \"swift-ring-rebalance-mkrwv\" (UID: \"84cedd57-5030-425a-8567-ceeda6aa0109\") " pod="swift-kuttl-tests/swift-ring-rebalance-mkrwv" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.848429 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/84cedd57-5030-425a-8567-ceeda6aa0109-swiftconf\") pod \"swift-ring-rebalance-mkrwv\" (UID: \"84cedd57-5030-425a-8567-ceeda6aa0109\") " pod="swift-kuttl-tests/swift-ring-rebalance-mkrwv" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.852298 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/84cedd57-5030-425a-8567-ceeda6aa0109-dispersionconf\") pod \"swift-ring-rebalance-mkrwv\" (UID: \"84cedd57-5030-425a-8567-ceeda6aa0109\") " pod="swift-kuttl-tests/swift-ring-rebalance-mkrwv" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.861384 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9np9m\" (UniqueName: \"kubernetes.io/projected/84cedd57-5030-425a-8567-ceeda6aa0109-kube-api-access-9np9m\") pod \"swift-ring-rebalance-mkrwv\" (UID: \"84cedd57-5030-425a-8567-ceeda6aa0109\") " pod="swift-kuttl-tests/swift-ring-rebalance-mkrwv" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.890539 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-mkrwv" Jan 31 09:22:24 crc kubenswrapper[4732]: I0131 09:22:24.944083 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ba8f0576-6adb-407c-b8e0-e4b04f0d47e3-etc-swift\") pod \"swift-storage-0\" (UID: \"ba8f0576-6adb-407c-b8e0-e4b04f0d47e3\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:22:24 crc kubenswrapper[4732]: E0131 09:22:24.944302 4732 projected.go:288] Couldn't get configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 31 09:22:24 crc kubenswrapper[4732]: E0131 09:22:24.944460 4732 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 31 09:22:24 crc kubenswrapper[4732]: E0131 09:22:24.944515 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ba8f0576-6adb-407c-b8e0-e4b04f0d47e3-etc-swift podName:ba8f0576-6adb-407c-b8e0-e4b04f0d47e3 nodeName:}" failed. No retries permitted until 2026-01-31 09:22:25.944497342 +0000 UTC m=+1284.250373546 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/ba8f0576-6adb-407c-b8e0-e4b04f0d47e3-etc-swift") pod "swift-storage-0" (UID: "ba8f0576-6adb-407c-b8e0-e4b04f0d47e3") : configmap "swift-ring-files" not found Jan 31 09:22:25 crc kubenswrapper[4732]: I0131 09:22:25.395103 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-mkrwv"] Jan 31 09:22:25 crc kubenswrapper[4732]: W0131 09:22:25.405930 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84cedd57_5030_425a_8567_ceeda6aa0109.slice/crio-96c45711338fec70c8940230429e8a7ce9d16f40c43935822502c794635844c0 WatchSource:0}: Error finding container 96c45711338fec70c8940230429e8a7ce9d16f40c43935822502c794635844c0: Status 404 returned error can't find the container with id 96c45711338fec70c8940230429e8a7ce9d16f40c43935822502c794635844c0 Jan 31 09:22:25 crc kubenswrapper[4732]: I0131 09:22:25.857679 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/bdedfde8-2a77-4328-8d12-1ed7e7c383d7-etc-swift\") pod \"swift-proxy-5c474fc7f4-zrlwm\" (UID: \"bdedfde8-2a77-4328-8d12-1ed7e7c383d7\") " pod="swift-kuttl-tests/swift-proxy-5c474fc7f4-zrlwm" Jan 31 09:22:25 crc kubenswrapper[4732]: E0131 09:22:25.857807 4732 projected.go:288] Couldn't get configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 31 09:22:25 crc kubenswrapper[4732]: E0131 09:22:25.857827 4732 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-proxy-5c474fc7f4-zrlwm: configmap "swift-ring-files" not found Jan 31 09:22:25 crc kubenswrapper[4732]: E0131 09:22:25.857885 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bdedfde8-2a77-4328-8d12-1ed7e7c383d7-etc-swift podName:bdedfde8-2a77-4328-8d12-1ed7e7c383d7 nodeName:}" failed. No retries permitted until 2026-01-31 09:22:27.857868173 +0000 UTC m=+1286.163744377 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/bdedfde8-2a77-4328-8d12-1ed7e7c383d7-etc-swift") pod "swift-proxy-5c474fc7f4-zrlwm" (UID: "bdedfde8-2a77-4328-8d12-1ed7e7c383d7") : configmap "swift-ring-files" not found Jan 31 09:22:25 crc kubenswrapper[4732]: I0131 09:22:25.959664 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ba8f0576-6adb-407c-b8e0-e4b04f0d47e3-etc-swift\") pod \"swift-storage-0\" (UID: \"ba8f0576-6adb-407c-b8e0-e4b04f0d47e3\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:22:25 crc kubenswrapper[4732]: E0131 09:22:25.959871 4732 projected.go:288] Couldn't get configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 31 09:22:25 crc kubenswrapper[4732]: E0131 09:22:25.959975 4732 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 31 09:22:25 crc kubenswrapper[4732]: E0131 09:22:25.960023 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ba8f0576-6adb-407c-b8e0-e4b04f0d47e3-etc-swift podName:ba8f0576-6adb-407c-b8e0-e4b04f0d47e3 nodeName:}" failed. No retries permitted until 2026-01-31 09:22:27.960007878 +0000 UTC m=+1286.265884082 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/ba8f0576-6adb-407c-b8e0-e4b04f0d47e3-etc-swift") pod "swift-storage-0" (UID: "ba8f0576-6adb-407c-b8e0-e4b04f0d47e3") : configmap "swift-ring-files" not found Jan 31 09:22:26 crc kubenswrapper[4732]: I0131 09:22:26.068140 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-mkrwv" event={"ID":"84cedd57-5030-425a-8567-ceeda6aa0109","Type":"ContainerStarted","Data":"e9a704782d501296be317dbceec99e3b7ae21d704b38e927a87174fe7c4bd3f1"} Jan 31 09:22:26 crc kubenswrapper[4732]: I0131 09:22:26.068180 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-mkrwv" event={"ID":"84cedd57-5030-425a-8567-ceeda6aa0109","Type":"ContainerStarted","Data":"96c45711338fec70c8940230429e8a7ce9d16f40c43935822502c794635844c0"} Jan 31 09:22:26 crc kubenswrapper[4732]: I0131 09:22:26.089523 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-mkrwv" podStartSLOduration=2.089504661 podStartE2EDuration="2.089504661s" podCreationTimestamp="2026-01-31 09:22:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:22:26.087292762 +0000 UTC m=+1284.393168976" watchObservedRunningTime="2026-01-31 09:22:26.089504661 +0000 UTC m=+1284.395380865" Jan 31 09:22:27 crc kubenswrapper[4732]: I0131 09:22:27.893417 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/bdedfde8-2a77-4328-8d12-1ed7e7c383d7-etc-swift\") pod \"swift-proxy-5c474fc7f4-zrlwm\" (UID: \"bdedfde8-2a77-4328-8d12-1ed7e7c383d7\") " pod="swift-kuttl-tests/swift-proxy-5c474fc7f4-zrlwm" Jan 31 09:22:27 crc kubenswrapper[4732]: E0131 09:22:27.893650 4732 projected.go:288] Couldn't get configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 31 09:22:27 crc kubenswrapper[4732]: E0131 09:22:27.894041 4732 projected.go:194] Error preparing data for 
projected volume etc-swift for pod swift-kuttl-tests/swift-proxy-5c474fc7f4-zrlwm: configmap "swift-ring-files" not found Jan 31 09:22:27 crc kubenswrapper[4732]: E0131 09:22:27.894104 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bdedfde8-2a77-4328-8d12-1ed7e7c383d7-etc-swift podName:bdedfde8-2a77-4328-8d12-1ed7e7c383d7 nodeName:}" failed. No retries permitted until 2026-01-31 09:22:31.894083785 +0000 UTC m=+1290.199959989 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/bdedfde8-2a77-4328-8d12-1ed7e7c383d7-etc-swift") pod "swift-proxy-5c474fc7f4-zrlwm" (UID: "bdedfde8-2a77-4328-8d12-1ed7e7c383d7") : configmap "swift-ring-files" not found Jan 31 09:22:27 crc kubenswrapper[4732]: I0131 09:22:27.995807 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ba8f0576-6adb-407c-b8e0-e4b04f0d47e3-etc-swift\") pod \"swift-storage-0\" (UID: \"ba8f0576-6adb-407c-b8e0-e4b04f0d47e3\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:22:27 crc kubenswrapper[4732]: E0131 09:22:27.996251 4732 projected.go:288] Couldn't get configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 31 09:22:27 crc kubenswrapper[4732]: E0131 09:22:27.996353 4732 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 31 09:22:27 crc kubenswrapper[4732]: E0131 09:22:27.996479 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ba8f0576-6adb-407c-b8e0-e4b04f0d47e3-etc-swift podName:ba8f0576-6adb-407c-b8e0-e4b04f0d47e3 nodeName:}" failed. No retries permitted until 2026-01-31 09:22:31.996452608 +0000 UTC m=+1290.302328842 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/ba8f0576-6adb-407c-b8e0-e4b04f0d47e3-etc-swift") pod "swift-storage-0" (UID: "ba8f0576-6adb-407c-b8e0-e4b04f0d47e3") : configmap "swift-ring-files" not found Jan 31 09:22:31 crc kubenswrapper[4732]: I0131 09:22:31.955930 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/bdedfde8-2a77-4328-8d12-1ed7e7c383d7-etc-swift\") pod \"swift-proxy-5c474fc7f4-zrlwm\" (UID: \"bdedfde8-2a77-4328-8d12-1ed7e7c383d7\") " pod="swift-kuttl-tests/swift-proxy-5c474fc7f4-zrlwm" Jan 31 09:22:31 crc kubenswrapper[4732]: E0131 09:22:31.956177 4732 projected.go:288] Couldn't get configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 31 09:22:31 crc kubenswrapper[4732]: E0131 09:22:31.956885 4732 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-proxy-5c474fc7f4-zrlwm: configmap "swift-ring-files" not found Jan 31 09:22:31 crc kubenswrapper[4732]: E0131 09:22:31.956968 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bdedfde8-2a77-4328-8d12-1ed7e7c383d7-etc-swift podName:bdedfde8-2a77-4328-8d12-1ed7e7c383d7 nodeName:}" failed. No retries permitted until 2026-01-31 09:22:39.956943204 +0000 UTC m=+1298.262819408 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/bdedfde8-2a77-4328-8d12-1ed7e7c383d7-etc-swift") pod "swift-proxy-5c474fc7f4-zrlwm" (UID: "bdedfde8-2a77-4328-8d12-1ed7e7c383d7") : configmap "swift-ring-files" not found Jan 31 09:22:32 crc kubenswrapper[4732]: I0131 09:22:32.057950 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ba8f0576-6adb-407c-b8e0-e4b04f0d47e3-etc-swift\") pod \"swift-storage-0\" (UID: \"ba8f0576-6adb-407c-b8e0-e4b04f0d47e3\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:22:32 crc kubenswrapper[4732]: E0131 09:22:32.058156 4732 projected.go:288] Couldn't get configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 31 09:22:32 crc kubenswrapper[4732]: E0131 09:22:32.058170 4732 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 31 09:22:32 crc kubenswrapper[4732]: E0131 09:22:32.058214 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ba8f0576-6adb-407c-b8e0-e4b04f0d47e3-etc-swift podName:ba8f0576-6adb-407c-b8e0-e4b04f0d47e3 nodeName:}" failed. No retries permitted until 2026-01-31 09:22:40.058198942 +0000 UTC m=+1298.364075136 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/ba8f0576-6adb-407c-b8e0-e4b04f0d47e3-etc-swift") pod "swift-storage-0" (UID: "ba8f0576-6adb-407c-b8e0-e4b04f0d47e3") : configmap "swift-ring-files" not found Jan 31 09:22:33 crc kubenswrapper[4732]: I0131 09:22:33.146243 4732 generic.go:334] "Generic (PLEG): container finished" podID="84cedd57-5030-425a-8567-ceeda6aa0109" containerID="e9a704782d501296be317dbceec99e3b7ae21d704b38e927a87174fe7c4bd3f1" exitCode=0 Jan 31 09:22:33 crc kubenswrapper[4732]: I0131 09:22:33.146291 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-mkrwv" event={"ID":"84cedd57-5030-425a-8567-ceeda6aa0109","Type":"ContainerDied","Data":"e9a704782d501296be317dbceec99e3b7ae21d704b38e927a87174fe7c4bd3f1"} Jan 31 09:22:34 crc kubenswrapper[4732]: I0131 09:22:34.450493 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-mkrwv" Jan 31 09:22:34 crc kubenswrapper[4732]: I0131 09:22:34.499622 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/84cedd57-5030-425a-8567-ceeda6aa0109-etc-swift\") pod \"84cedd57-5030-425a-8567-ceeda6aa0109\" (UID: \"84cedd57-5030-425a-8567-ceeda6aa0109\") " Jan 31 09:22:34 crc kubenswrapper[4732]: I0131 09:22:34.499748 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/84cedd57-5030-425a-8567-ceeda6aa0109-scripts\") pod \"84cedd57-5030-425a-8567-ceeda6aa0109\" (UID: \"84cedd57-5030-425a-8567-ceeda6aa0109\") " Jan 31 09:22:34 crc kubenswrapper[4732]: I0131 09:22:34.499791 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/84cedd57-5030-425a-8567-ceeda6aa0109-swiftconf\") pod \"84cedd57-5030-425a-8567-ceeda6aa0109\" (UID: \"84cedd57-5030-425a-8567-ceeda6aa0109\") " Jan 31 09:22:34 crc kubenswrapper[4732]: I0131 09:22:34.499872 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9np9m\" (UniqueName: \"kubernetes.io/projected/84cedd57-5030-425a-8567-ceeda6aa0109-kube-api-access-9np9m\") pod \"84cedd57-5030-425a-8567-ceeda6aa0109\" (UID: \"84cedd57-5030-425a-8567-ceeda6aa0109\") " Jan 31 09:22:34 crc kubenswrapper[4732]: I0131 09:22:34.499909 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/84cedd57-5030-425a-8567-ceeda6aa0109-dispersionconf\") pod \"84cedd57-5030-425a-8567-ceeda6aa0109\" (UID: \"84cedd57-5030-425a-8567-ceeda6aa0109\") " Jan 31 09:22:34 crc kubenswrapper[4732]: I0131 09:22:34.499952 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84cedd57-5030-425a-8567-ceeda6aa0109-combined-ca-bundle\") pod \"84cedd57-5030-425a-8567-ceeda6aa0109\" (UID: \"84cedd57-5030-425a-8567-ceeda6aa0109\") " Jan 31 09:22:34 crc kubenswrapper[4732]: I0131 09:22:34.500085 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/84cedd57-5030-425a-8567-ceeda6aa0109-ring-data-devices\") pod \"84cedd57-5030-425a-8567-ceeda6aa0109\" (UID: \"84cedd57-5030-425a-8567-ceeda6aa0109\") " Jan 31 09:22:34 crc kubenswrapper[4732]: I0131 09:22:34.501290 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84cedd57-5030-425a-8567-ceeda6aa0109-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "84cedd57-5030-425a-8567-ceeda6aa0109" (UID: "84cedd57-5030-425a-8567-ceeda6aa0109"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:22:34 crc kubenswrapper[4732]: I0131 09:22:34.502897 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84cedd57-5030-425a-8567-ceeda6aa0109-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "84cedd57-5030-425a-8567-ceeda6aa0109" (UID: "84cedd57-5030-425a-8567-ceeda6aa0109"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:22:34 crc kubenswrapper[4732]: I0131 09:22:34.508964 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84cedd57-5030-425a-8567-ceeda6aa0109-kube-api-access-9np9m" (OuterVolumeSpecName: "kube-api-access-9np9m") pod "84cedd57-5030-425a-8567-ceeda6aa0109" (UID: "84cedd57-5030-425a-8567-ceeda6aa0109"). InnerVolumeSpecName "kube-api-access-9np9m". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:22:34 crc kubenswrapper[4732]: I0131 09:22:34.526140 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84cedd57-5030-425a-8567-ceeda6aa0109-scripts" (OuterVolumeSpecName: "scripts") pod "84cedd57-5030-425a-8567-ceeda6aa0109" (UID: "84cedd57-5030-425a-8567-ceeda6aa0109"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:22:34 crc kubenswrapper[4732]: I0131 09:22:34.526236 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84cedd57-5030-425a-8567-ceeda6aa0109-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "84cedd57-5030-425a-8567-ceeda6aa0109" (UID: "84cedd57-5030-425a-8567-ceeda6aa0109"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:22:34 crc kubenswrapper[4732]: I0131 09:22:34.547644 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84cedd57-5030-425a-8567-ceeda6aa0109-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "84cedd57-5030-425a-8567-ceeda6aa0109" (UID: "84cedd57-5030-425a-8567-ceeda6aa0109"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:22:34 crc kubenswrapper[4732]: I0131 09:22:34.547728 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84cedd57-5030-425a-8567-ceeda6aa0109-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "84cedd57-5030-425a-8567-ceeda6aa0109" (UID: "84cedd57-5030-425a-8567-ceeda6aa0109"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:22:34 crc kubenswrapper[4732]: I0131 09:22:34.601481 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9np9m\" (UniqueName: \"kubernetes.io/projected/84cedd57-5030-425a-8567-ceeda6aa0109-kube-api-access-9np9m\") on node \"crc\" DevicePath \"\"" Jan 31 09:22:34 crc kubenswrapper[4732]: I0131 09:22:34.601514 4732 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/84cedd57-5030-425a-8567-ceeda6aa0109-dispersionconf\") on node \"crc\" DevicePath \"\"" Jan 31 09:22:34 crc kubenswrapper[4732]: I0131 09:22:34.601523 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84cedd57-5030-425a-8567-ceeda6aa0109-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:22:34 crc kubenswrapper[4732]: I0131 09:22:34.601532 4732 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/84cedd57-5030-425a-8567-ceeda6aa0109-ring-data-devices\") on node \"crc\" DevicePath \"\"" Jan 31 09:22:34 crc kubenswrapper[4732]: I0131 09:22:34.601541 4732 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/84cedd57-5030-425a-8567-ceeda6aa0109-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 31 09:22:34 crc kubenswrapper[4732]: I0131 09:22:34.601550 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/84cedd57-5030-425a-8567-ceeda6aa0109-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:22:34 crc kubenswrapper[4732]: I0131 09:22:34.601560 4732 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/84cedd57-5030-425a-8567-ceeda6aa0109-swiftconf\") on node \"crc\" DevicePath \"\"" Jan 31 09:22:35 crc kubenswrapper[4732]: I0131 09:22:35.162345 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-mkrwv" event={"ID":"84cedd57-5030-425a-8567-ceeda6aa0109","Type":"ContainerDied","Data":"96c45711338fec70c8940230429e8a7ce9d16f40c43935822502c794635844c0"} Jan 31 09:22:35 crc kubenswrapper[4732]: I0131 09:22:35.162402 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="96c45711338fec70c8940230429e8a7ce9d16f40c43935822502c794635844c0" Jan 31 09:22:35 crc kubenswrapper[4732]: I0131 09:22:35.162427 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-mkrwv" Jan 31 09:22:39 crc kubenswrapper[4732]: I0131 09:22:39.980222 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/bdedfde8-2a77-4328-8d12-1ed7e7c383d7-etc-swift\") pod \"swift-proxy-5c474fc7f4-zrlwm\" (UID: \"bdedfde8-2a77-4328-8d12-1ed7e7c383d7\") " pod="swift-kuttl-tests/swift-proxy-5c474fc7f4-zrlwm" Jan 31 09:22:39 crc kubenswrapper[4732]: I0131 09:22:39.989876 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/bdedfde8-2a77-4328-8d12-1ed7e7c383d7-etc-swift\") pod \"swift-proxy-5c474fc7f4-zrlwm\" (UID: \"bdedfde8-2a77-4328-8d12-1ed7e7c383d7\") " pod="swift-kuttl-tests/swift-proxy-5c474fc7f4-zrlwm" Jan 31 09:22:40 crc kubenswrapper[4732]: I0131 09:22:40.025304 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-proxy-5c474fc7f4-zrlwm" Jan 31 09:22:40 crc kubenswrapper[4732]: I0131 09:22:40.081843 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ba8f0576-6adb-407c-b8e0-e4b04f0d47e3-etc-swift\") pod \"swift-storage-0\" (UID: \"ba8f0576-6adb-407c-b8e0-e4b04f0d47e3\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:22:40 crc kubenswrapper[4732]: I0131 09:22:40.087946 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ba8f0576-6adb-407c-b8e0-e4b04f0d47e3-etc-swift\") pod \"swift-storage-0\" (UID: \"ba8f0576-6adb-407c-b8e0-e4b04f0d47e3\") " pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:22:40 crc kubenswrapper[4732]: I0131 09:22:40.100351 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:22:40 crc kubenswrapper[4732]: I0131 09:22:40.443584 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-proxy-5c474fc7f4-zrlwm"] Jan 31 09:22:40 crc kubenswrapper[4732]: I0131 09:22:40.553883 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-storage-0"] Jan 31 09:22:40 crc kubenswrapper[4732]: W0131 09:22:40.554378 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba8f0576_6adb_407c_b8e0_e4b04f0d47e3.slice/crio-db289e5d4324b07ffd624072a8984010d2c9a44d1db929557bee6f1a1928fda3 WatchSource:0}: Error finding container db289e5d4324b07ffd624072a8984010d2c9a44d1db929557bee6f1a1928fda3: Status 404 returned error can't find the container with id db289e5d4324b07ffd624072a8984010d2c9a44d1db929557bee6f1a1928fda3 Jan 31 09:22:41 crc kubenswrapper[4732]: I0131 09:22:41.239495 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-proxy-5c474fc7f4-zrlwm" event={"ID":"bdedfde8-2a77-4328-8d12-1ed7e7c383d7","Type":"ContainerStarted","Data":"474b94ec58984f199d9081036362611a027ce433aaf4a25491214d41cf2633fe"} Jan 31 09:22:41 crc kubenswrapper[4732]: I0131 09:22:41.239789 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-proxy-5c474fc7f4-zrlwm" event={"ID":"bdedfde8-2a77-4328-8d12-1ed7e7c383d7","Type":"ContainerStarted","Data":"ec8ba38d54836eb416cdf4f64a5064c1d48567987a1990cab21c63cec5dcea69"} Jan 31 09:22:41 crc kubenswrapper[4732]: I0131 09:22:41.239803 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-proxy-5c474fc7f4-zrlwm" event={"ID":"bdedfde8-2a77-4328-8d12-1ed7e7c383d7","Type":"ContainerStarted","Data":"1f6dfd5148a93a535a200ac9b68a844f393cc7142eb0dbb00e04c6a279302eea"} Jan 31 09:22:41 crc kubenswrapper[4732]: I0131 09:22:41.241713 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="swift-kuttl-tests/swift-proxy-5c474fc7f4-zrlwm" Jan 31 09:22:41 crc kubenswrapper[4732]: I0131 09:22:41.241742 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="swift-kuttl-tests/swift-proxy-5c474fc7f4-zrlwm" Jan 31 09:22:41 crc kubenswrapper[4732]: I0131 09:22:41.244007 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"ba8f0576-6adb-407c-b8e0-e4b04f0d47e3","Type":"ContainerStarted","Data":"13d7343d1751fb8940398ae0071750f1f6117ba7b59e7dfa6990eb52a895303e"} Jan 31 09:22:41 crc kubenswrapper[4732]: I0131 09:22:41.244050 
4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"ba8f0576-6adb-407c-b8e0-e4b04f0d47e3","Type":"ContainerStarted","Data":"3ebca277c4b203371cd9fe26b95bfe03d230345a47fb88a2eff217b2ba6b95b9"} Jan 31 09:22:41 crc kubenswrapper[4732]: I0131 09:22:41.244059 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"ba8f0576-6adb-407c-b8e0-e4b04f0d47e3","Type":"ContainerStarted","Data":"a26623b23e379dae3834fe0d0647fe0c758307c675c5178e21ed5e70d77db572"} Jan 31 09:22:41 crc kubenswrapper[4732]: I0131 09:22:41.244067 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"ba8f0576-6adb-407c-b8e0-e4b04f0d47e3","Type":"ContainerStarted","Data":"cd4637a4eac10985ea1fdfa77841cc57b4b13086ee7a639ec9095e4b5fe6443f"} Jan 31 09:22:41 crc kubenswrapper[4732]: I0131 09:22:41.244077 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"ba8f0576-6adb-407c-b8e0-e4b04f0d47e3","Type":"ContainerStarted","Data":"db289e5d4324b07ffd624072a8984010d2c9a44d1db929557bee6f1a1928fda3"} Jan 31 09:22:41 crc kubenswrapper[4732]: I0131 09:22:41.268863 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-proxy-5c474fc7f4-zrlwm" podStartSLOduration=17.268846254 podStartE2EDuration="17.268846254s" podCreationTimestamp="2026-01-31 09:22:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:22:41.264778188 +0000 UTC m=+1299.570654392" watchObservedRunningTime="2026-01-31 09:22:41.268846254 +0000 UTC m=+1299.574722458" Jan 31 09:22:42 crc kubenswrapper[4732]: I0131 09:22:42.259427 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"ba8f0576-6adb-407c-b8e0-e4b04f0d47e3","Type":"ContainerStarted","Data":"78c47a8e4097580d0d62fedad615c2e7de129377d57e4dde3ee9b8817b3ebe4f"} Jan 31 09:22:42 crc kubenswrapper[4732]: I0131 09:22:42.259684 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"ba8f0576-6adb-407c-b8e0-e4b04f0d47e3","Type":"ContainerStarted","Data":"a9463f09258cd249e7ac5eeabeac6f271d8661141768b16e79dc29de794bbfee"} Jan 31 09:22:42 crc kubenswrapper[4732]: I0131 09:22:42.259710 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"ba8f0576-6adb-407c-b8e0-e4b04f0d47e3","Type":"ContainerStarted","Data":"92b05acc8700ea245d97120676710a498d03141bf1d8600bf5a37e131c2283ee"} Jan 31 09:22:42 crc kubenswrapper[4732]: I0131 09:22:42.259721 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"ba8f0576-6adb-407c-b8e0-e4b04f0d47e3","Type":"ContainerStarted","Data":"c2c74e7d8a1ebd17ed649b720d8fb589be94684e535f53dd916a2699096a6a03"} Jan 31 09:22:42 crc kubenswrapper[4732]: I0131 09:22:42.259732 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"ba8f0576-6adb-407c-b8e0-e4b04f0d47e3","Type":"ContainerStarted","Data":"4b19ce1044930f4e1d8867e77cdf09526d8f2b3e3900dc1f91071adaf1162e32"} Jan 31 09:22:42 crc kubenswrapper[4732]: I0131 09:22:42.259743 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" 
event={"ID":"ba8f0576-6adb-407c-b8e0-e4b04f0d47e3","Type":"ContainerStarted","Data":"f3b6deecf746ea640b42712bfb03bb8b827bed6b84301b438ba02ffe4845a922"} Jan 31 09:22:43 crc kubenswrapper[4732]: I0131 09:22:43.274406 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"ba8f0576-6adb-407c-b8e0-e4b04f0d47e3","Type":"ContainerStarted","Data":"08aa6d73b6ce55a5fd4af30b26217cd60a1641aca38733b3a337dfbea21a313c"} Jan 31 09:22:43 crc kubenswrapper[4732]: I0131 09:22:43.275464 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"ba8f0576-6adb-407c-b8e0-e4b04f0d47e3","Type":"ContainerStarted","Data":"21e962a9a2a9342a7ea4a394d67dccb2bcba098409718fcc20c041b19f8e8aff"} Jan 31 09:22:43 crc kubenswrapper[4732]: I0131 09:22:43.275550 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"ba8f0576-6adb-407c-b8e0-e4b04f0d47e3","Type":"ContainerStarted","Data":"5274a5fdf883aabb8e1459ab78b419e45850c93bd25d640d9b574e423e14891c"} Jan 31 09:22:43 crc kubenswrapper[4732]: I0131 09:22:43.275615 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"ba8f0576-6adb-407c-b8e0-e4b04f0d47e3","Type":"ContainerStarted","Data":"d2fec506922e973332f3fc9bd8c7f3a11bbfe136d64d26f49002b37bbb05e717"} Jan 31 09:22:43 crc kubenswrapper[4732]: I0131 09:22:43.275685 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"ba8f0576-6adb-407c-b8e0-e4b04f0d47e3","Type":"ContainerStarted","Data":"17c4084b4095035a8bb4f76d7014f2798296fac02e063cfc7abffe90d8b7294f"} Jan 31 09:22:45 crc kubenswrapper[4732]: I0131 09:22:45.035786 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="swift-kuttl-tests/swift-proxy-5c474fc7f4-zrlwm" Jan 31 09:22:45 crc kubenswrapper[4732]: I0131 09:22:45.039108 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="swift-kuttl-tests/swift-proxy-5c474fc7f4-zrlwm" Jan 31 09:22:45 crc kubenswrapper[4732]: I0131 09:22:45.061813 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-storage-0" podStartSLOduration=22.061794319 podStartE2EDuration="22.061794319s" podCreationTimestamp="2026-01-31 09:22:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:22:43.330419252 +0000 UTC m=+1301.636295546" watchObservedRunningTime="2026-01-31 09:22:45.061794319 +0000 UTC m=+1303.367670523" Jan 31 09:22:46 crc kubenswrapper[4732]: I0131 09:22:46.578952 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-storage-0"] Jan 31 09:22:46 crc kubenswrapper[4732]: I0131 09:22:46.580795 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="ba8f0576-6adb-407c-b8e0-e4b04f0d47e3" containerName="account-server" containerID="cri-o://cd4637a4eac10985ea1fdfa77841cc57b4b13086ee7a639ec9095e4b5fe6443f" gracePeriod=30 Jan 31 09:22:46 crc kubenswrapper[4732]: I0131 09:22:46.580963 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="ba8f0576-6adb-407c-b8e0-e4b04f0d47e3" containerName="object-auditor" containerID="cri-o://08aa6d73b6ce55a5fd4af30b26217cd60a1641aca38733b3a337dfbea21a313c" gracePeriod=30 Jan 31 09:22:46 crc 
kubenswrapper[4732]: I0131 09:22:46.580997 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="ba8f0576-6adb-407c-b8e0-e4b04f0d47e3" containerName="object-expirer" containerID="cri-o://d2fec506922e973332f3fc9bd8c7f3a11bbfe136d64d26f49002b37bbb05e717" gracePeriod=30 Jan 31 09:22:46 crc kubenswrapper[4732]: I0131 09:22:46.580997 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="ba8f0576-6adb-407c-b8e0-e4b04f0d47e3" containerName="container-replicator" containerID="cri-o://4b19ce1044930f4e1d8867e77cdf09526d8f2b3e3900dc1f91071adaf1162e32" gracePeriod=30 Jan 31 09:22:46 crc kubenswrapper[4732]: I0131 09:22:46.581056 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="ba8f0576-6adb-407c-b8e0-e4b04f0d47e3" containerName="account-replicator" containerID="cri-o://a26623b23e379dae3834fe0d0647fe0c758307c675c5178e21ed5e70d77db572" gracePeriod=30 Jan 31 09:22:46 crc kubenswrapper[4732]: I0131 09:22:46.580902 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="ba8f0576-6adb-407c-b8e0-e4b04f0d47e3" containerName="container-auditor" containerID="cri-o://c2c74e7d8a1ebd17ed649b720d8fb589be94684e535f53dd916a2699096a6a03" gracePeriod=30 Jan 31 09:22:46 crc kubenswrapper[4732]: I0131 09:22:46.581088 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="ba8f0576-6adb-407c-b8e0-e4b04f0d47e3" containerName="object-updater" containerID="cri-o://17c4084b4095035a8bb4f76d7014f2798296fac02e063cfc7abffe90d8b7294f" gracePeriod=30 Jan 31 09:22:46 crc kubenswrapper[4732]: I0131 09:22:46.581136 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="ba8f0576-6adb-407c-b8e0-e4b04f0d47e3" containerName="rsync" containerID="cri-o://5274a5fdf883aabb8e1459ab78b419e45850c93bd25d640d9b574e423e14891c" gracePeriod=30 Jan 31 09:22:46 crc kubenswrapper[4732]: I0131 09:22:46.580943 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="ba8f0576-6adb-407c-b8e0-e4b04f0d47e3" containerName="account-auditor" containerID="cri-o://3ebca277c4b203371cd9fe26b95bfe03d230345a47fb88a2eff217b2ba6b95b9" gracePeriod=30 Jan 31 09:22:46 crc kubenswrapper[4732]: I0131 09:22:46.581163 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="ba8f0576-6adb-407c-b8e0-e4b04f0d47e3" containerName="object-replicator" containerID="cri-o://78c47a8e4097580d0d62fedad615c2e7de129377d57e4dde3ee9b8817b3ebe4f" gracePeriod=30 Jan 31 09:22:46 crc kubenswrapper[4732]: I0131 09:22:46.580987 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="ba8f0576-6adb-407c-b8e0-e4b04f0d47e3" containerName="container-updater" containerID="cri-o://92b05acc8700ea245d97120676710a498d03141bf1d8600bf5a37e131c2283ee" gracePeriod=30 Jan 31 09:22:46 crc kubenswrapper[4732]: I0131 09:22:46.580953 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="ba8f0576-6adb-407c-b8e0-e4b04f0d47e3" containerName="account-reaper" containerID="cri-o://13d7343d1751fb8940398ae0071750f1f6117ba7b59e7dfa6990eb52a895303e" 
gracePeriod=30 Jan 31 09:22:46 crc kubenswrapper[4732]: I0131 09:22:46.580955 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="ba8f0576-6adb-407c-b8e0-e4b04f0d47e3" containerName="container-server" containerID="cri-o://f3b6deecf746ea640b42712bfb03bb8b827bed6b84301b438ba02ffe4845a922" gracePeriod=30 Jan 31 09:22:46 crc kubenswrapper[4732]: I0131 09:22:46.580855 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="ba8f0576-6adb-407c-b8e0-e4b04f0d47e3" containerName="object-server" containerID="cri-o://a9463f09258cd249e7ac5eeabeac6f271d8661141768b16e79dc29de794bbfee" gracePeriod=30 Jan 31 09:22:46 crc kubenswrapper[4732]: I0131 09:22:46.581062 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="ba8f0576-6adb-407c-b8e0-e4b04f0d47e3" containerName="swift-recon-cron" containerID="cri-o://21e962a9a2a9342a7ea4a394d67dccb2bcba098409718fcc20c041b19f8e8aff" gracePeriod=30 Jan 31 09:22:46 crc kubenswrapper[4732]: I0131 09:22:46.611195 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-mkrwv"] Jan 31 09:22:46 crc kubenswrapper[4732]: I0131 09:22:46.637417 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-mkrwv"] Jan 31 09:22:46 crc kubenswrapper[4732]: I0131 09:22:46.651130 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-proxy-5c474fc7f4-zrlwm"] Jan 31 09:22:46 crc kubenswrapper[4732]: I0131 09:22:46.651360 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-proxy-5c474fc7f4-zrlwm" podUID="bdedfde8-2a77-4328-8d12-1ed7e7c383d7" containerName="proxy-httpd" containerID="cri-o://ec8ba38d54836eb416cdf4f64a5064c1d48567987a1990cab21c63cec5dcea69" gracePeriod=30 Jan 31 09:22:46 crc kubenswrapper[4732]: I0131 09:22:46.651486 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-proxy-5c474fc7f4-zrlwm" podUID="bdedfde8-2a77-4328-8d12-1ed7e7c383d7" containerName="proxy-server" containerID="cri-o://474b94ec58984f199d9081036362611a027ce433aaf4a25491214d41cf2633fe" gracePeriod=30 Jan 31 09:22:47 crc kubenswrapper[4732]: I0131 09:22:47.307873 4732 generic.go:334] "Generic (PLEG): container finished" podID="bdedfde8-2a77-4328-8d12-1ed7e7c383d7" containerID="ec8ba38d54836eb416cdf4f64a5064c1d48567987a1990cab21c63cec5dcea69" exitCode=0 Jan 31 09:22:47 crc kubenswrapper[4732]: I0131 09:22:47.307937 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-proxy-5c474fc7f4-zrlwm" event={"ID":"bdedfde8-2a77-4328-8d12-1ed7e7c383d7","Type":"ContainerDied","Data":"ec8ba38d54836eb416cdf4f64a5064c1d48567987a1990cab21c63cec5dcea69"} Jan 31 09:22:47 crc kubenswrapper[4732]: I0131 09:22:47.312947 4732 generic.go:334] "Generic (PLEG): container finished" podID="ba8f0576-6adb-407c-b8e0-e4b04f0d47e3" containerID="5274a5fdf883aabb8e1459ab78b419e45850c93bd25d640d9b574e423e14891c" exitCode=0 Jan 31 09:22:47 crc kubenswrapper[4732]: I0131 09:22:47.312976 4732 generic.go:334] "Generic (PLEG): container finished" podID="ba8f0576-6adb-407c-b8e0-e4b04f0d47e3" containerID="d2fec506922e973332f3fc9bd8c7f3a11bbfe136d64d26f49002b37bbb05e717" exitCode=0 Jan 31 09:22:47 crc kubenswrapper[4732]: I0131 09:22:47.312985 4732 generic.go:334] "Generic (PLEG): container finished" 
podID="ba8f0576-6adb-407c-b8e0-e4b04f0d47e3" containerID="17c4084b4095035a8bb4f76d7014f2798296fac02e063cfc7abffe90d8b7294f" exitCode=0 Jan 31 09:22:47 crc kubenswrapper[4732]: I0131 09:22:47.312993 4732 generic.go:334] "Generic (PLEG): container finished" podID="ba8f0576-6adb-407c-b8e0-e4b04f0d47e3" containerID="08aa6d73b6ce55a5fd4af30b26217cd60a1641aca38733b3a337dfbea21a313c" exitCode=0 Jan 31 09:22:47 crc kubenswrapper[4732]: I0131 09:22:47.313001 4732 generic.go:334] "Generic (PLEG): container finished" podID="ba8f0576-6adb-407c-b8e0-e4b04f0d47e3" containerID="78c47a8e4097580d0d62fedad615c2e7de129377d57e4dde3ee9b8817b3ebe4f" exitCode=0 Jan 31 09:22:47 crc kubenswrapper[4732]: I0131 09:22:47.313008 4732 generic.go:334] "Generic (PLEG): container finished" podID="ba8f0576-6adb-407c-b8e0-e4b04f0d47e3" containerID="a9463f09258cd249e7ac5eeabeac6f271d8661141768b16e79dc29de794bbfee" exitCode=0 Jan 31 09:22:47 crc kubenswrapper[4732]: I0131 09:22:47.313015 4732 generic.go:334] "Generic (PLEG): container finished" podID="ba8f0576-6adb-407c-b8e0-e4b04f0d47e3" containerID="92b05acc8700ea245d97120676710a498d03141bf1d8600bf5a37e131c2283ee" exitCode=0 Jan 31 09:22:47 crc kubenswrapper[4732]: I0131 09:22:47.313021 4732 generic.go:334] "Generic (PLEG): container finished" podID="ba8f0576-6adb-407c-b8e0-e4b04f0d47e3" containerID="c2c74e7d8a1ebd17ed649b720d8fb589be94684e535f53dd916a2699096a6a03" exitCode=0 Jan 31 09:22:47 crc kubenswrapper[4732]: I0131 09:22:47.313028 4732 generic.go:334] "Generic (PLEG): container finished" podID="ba8f0576-6adb-407c-b8e0-e4b04f0d47e3" containerID="4b19ce1044930f4e1d8867e77cdf09526d8f2b3e3900dc1f91071adaf1162e32" exitCode=0 Jan 31 09:22:47 crc kubenswrapper[4732]: I0131 09:22:47.313034 4732 generic.go:334] "Generic (PLEG): container finished" podID="ba8f0576-6adb-407c-b8e0-e4b04f0d47e3" containerID="f3b6deecf746ea640b42712bfb03bb8b827bed6b84301b438ba02ffe4845a922" exitCode=0 Jan 31 09:22:47 crc kubenswrapper[4732]: I0131 09:22:47.313040 4732 generic.go:334] "Generic (PLEG): container finished" podID="ba8f0576-6adb-407c-b8e0-e4b04f0d47e3" containerID="13d7343d1751fb8940398ae0071750f1f6117ba7b59e7dfa6990eb52a895303e" exitCode=0 Jan 31 09:22:47 crc kubenswrapper[4732]: I0131 09:22:47.313046 4732 generic.go:334] "Generic (PLEG): container finished" podID="ba8f0576-6adb-407c-b8e0-e4b04f0d47e3" containerID="3ebca277c4b203371cd9fe26b95bfe03d230345a47fb88a2eff217b2ba6b95b9" exitCode=0 Jan 31 09:22:47 crc kubenswrapper[4732]: I0131 09:22:47.313052 4732 generic.go:334] "Generic (PLEG): container finished" podID="ba8f0576-6adb-407c-b8e0-e4b04f0d47e3" containerID="a26623b23e379dae3834fe0d0647fe0c758307c675c5178e21ed5e70d77db572" exitCode=0 Jan 31 09:22:47 crc kubenswrapper[4732]: I0131 09:22:47.313059 4732 generic.go:334] "Generic (PLEG): container finished" podID="ba8f0576-6adb-407c-b8e0-e4b04f0d47e3" containerID="cd4637a4eac10985ea1fdfa77841cc57b4b13086ee7a639ec9095e4b5fe6443f" exitCode=0 Jan 31 09:22:47 crc kubenswrapper[4732]: I0131 09:22:47.313076 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"ba8f0576-6adb-407c-b8e0-e4b04f0d47e3","Type":"ContainerDied","Data":"5274a5fdf883aabb8e1459ab78b419e45850c93bd25d640d9b574e423e14891c"} Jan 31 09:22:47 crc kubenswrapper[4732]: I0131 09:22:47.313096 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" 
event={"ID":"ba8f0576-6adb-407c-b8e0-e4b04f0d47e3","Type":"ContainerDied","Data":"d2fec506922e973332f3fc9bd8c7f3a11bbfe136d64d26f49002b37bbb05e717"} Jan 31 09:22:47 crc kubenswrapper[4732]: I0131 09:22:47.313106 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"ba8f0576-6adb-407c-b8e0-e4b04f0d47e3","Type":"ContainerDied","Data":"17c4084b4095035a8bb4f76d7014f2798296fac02e063cfc7abffe90d8b7294f"} Jan 31 09:22:47 crc kubenswrapper[4732]: I0131 09:22:47.313115 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"ba8f0576-6adb-407c-b8e0-e4b04f0d47e3","Type":"ContainerDied","Data":"08aa6d73b6ce55a5fd4af30b26217cd60a1641aca38733b3a337dfbea21a313c"} Jan 31 09:22:47 crc kubenswrapper[4732]: I0131 09:22:47.313123 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"ba8f0576-6adb-407c-b8e0-e4b04f0d47e3","Type":"ContainerDied","Data":"78c47a8e4097580d0d62fedad615c2e7de129377d57e4dde3ee9b8817b3ebe4f"} Jan 31 09:22:47 crc kubenswrapper[4732]: I0131 09:22:47.313131 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"ba8f0576-6adb-407c-b8e0-e4b04f0d47e3","Type":"ContainerDied","Data":"a9463f09258cd249e7ac5eeabeac6f271d8661141768b16e79dc29de794bbfee"} Jan 31 09:22:47 crc kubenswrapper[4732]: I0131 09:22:47.313139 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"ba8f0576-6adb-407c-b8e0-e4b04f0d47e3","Type":"ContainerDied","Data":"92b05acc8700ea245d97120676710a498d03141bf1d8600bf5a37e131c2283ee"} Jan 31 09:22:47 crc kubenswrapper[4732]: I0131 09:22:47.313148 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"ba8f0576-6adb-407c-b8e0-e4b04f0d47e3","Type":"ContainerDied","Data":"c2c74e7d8a1ebd17ed649b720d8fb589be94684e535f53dd916a2699096a6a03"} Jan 31 09:22:47 crc kubenswrapper[4732]: I0131 09:22:47.313156 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"ba8f0576-6adb-407c-b8e0-e4b04f0d47e3","Type":"ContainerDied","Data":"4b19ce1044930f4e1d8867e77cdf09526d8f2b3e3900dc1f91071adaf1162e32"} Jan 31 09:22:47 crc kubenswrapper[4732]: I0131 09:22:47.313164 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"ba8f0576-6adb-407c-b8e0-e4b04f0d47e3","Type":"ContainerDied","Data":"f3b6deecf746ea640b42712bfb03bb8b827bed6b84301b438ba02ffe4845a922"} Jan 31 09:22:47 crc kubenswrapper[4732]: I0131 09:22:47.313172 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"ba8f0576-6adb-407c-b8e0-e4b04f0d47e3","Type":"ContainerDied","Data":"13d7343d1751fb8940398ae0071750f1f6117ba7b59e7dfa6990eb52a895303e"} Jan 31 09:22:47 crc kubenswrapper[4732]: I0131 09:22:47.313180 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"ba8f0576-6adb-407c-b8e0-e4b04f0d47e3","Type":"ContainerDied","Data":"3ebca277c4b203371cd9fe26b95bfe03d230345a47fb88a2eff217b2ba6b95b9"} Jan 31 09:22:47 crc kubenswrapper[4732]: I0131 09:22:47.313188 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"ba8f0576-6adb-407c-b8e0-e4b04f0d47e3","Type":"ContainerDied","Data":"a26623b23e379dae3834fe0d0647fe0c758307c675c5178e21ed5e70d77db572"} Jan 31 09:22:47 crc 
kubenswrapper[4732]: I0131 09:22:47.313196 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"ba8f0576-6adb-407c-b8e0-e4b04f0d47e3","Type":"ContainerDied","Data":"cd4637a4eac10985ea1fdfa77841cc57b4b13086ee7a639ec9095e4b5fe6443f"} Jan 31 09:22:47 crc kubenswrapper[4732]: I0131 09:22:47.497590 4732 patch_prober.go:28] interesting pod/machine-config-daemon-jnbt8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 09:22:47 crc kubenswrapper[4732]: I0131 09:22:47.497640 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" podUID="7d790207-d357-4b47-87bf-5b505e061820" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 09:22:47 crc kubenswrapper[4732]: I0131 09:22:47.588263 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-proxy-5c474fc7f4-zrlwm" Jan 31 09:22:47 crc kubenswrapper[4732]: I0131 09:22:47.709591 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bdedfde8-2a77-4328-8d12-1ed7e7c383d7-public-tls-certs\") pod \"bdedfde8-2a77-4328-8d12-1ed7e7c383d7\" (UID: \"bdedfde8-2a77-4328-8d12-1ed7e7c383d7\") " Jan 31 09:22:47 crc kubenswrapper[4732]: I0131 09:22:47.709638 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdedfde8-2a77-4328-8d12-1ed7e7c383d7-config-data\") pod \"bdedfde8-2a77-4328-8d12-1ed7e7c383d7\" (UID: \"bdedfde8-2a77-4328-8d12-1ed7e7c383d7\") " Jan 31 09:22:47 crc kubenswrapper[4732]: I0131 09:22:47.709677 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bdedfde8-2a77-4328-8d12-1ed7e7c383d7-run-httpd\") pod \"bdedfde8-2a77-4328-8d12-1ed7e7c383d7\" (UID: \"bdedfde8-2a77-4328-8d12-1ed7e7c383d7\") " Jan 31 09:22:47 crc kubenswrapper[4732]: I0131 09:22:47.710048 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bdedfde8-2a77-4328-8d12-1ed7e7c383d7-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "bdedfde8-2a77-4328-8d12-1ed7e7c383d7" (UID: "bdedfde8-2a77-4328-8d12-1ed7e7c383d7"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:22:47 crc kubenswrapper[4732]: I0131 09:22:47.710092 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmr65\" (UniqueName: \"kubernetes.io/projected/bdedfde8-2a77-4328-8d12-1ed7e7c383d7-kube-api-access-nmr65\") pod \"bdedfde8-2a77-4328-8d12-1ed7e7c383d7\" (UID: \"bdedfde8-2a77-4328-8d12-1ed7e7c383d7\") " Jan 31 09:22:47 crc kubenswrapper[4732]: I0131 09:22:47.710486 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdedfde8-2a77-4328-8d12-1ed7e7c383d7-combined-ca-bundle\") pod \"bdedfde8-2a77-4328-8d12-1ed7e7c383d7\" (UID: \"bdedfde8-2a77-4328-8d12-1ed7e7c383d7\") " Jan 31 09:22:47 crc kubenswrapper[4732]: I0131 09:22:47.710535 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/bdedfde8-2a77-4328-8d12-1ed7e7c383d7-etc-swift\") pod \"bdedfde8-2a77-4328-8d12-1ed7e7c383d7\" (UID: \"bdedfde8-2a77-4328-8d12-1ed7e7c383d7\") " Jan 31 09:22:47 crc kubenswrapper[4732]: I0131 09:22:47.710555 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bdedfde8-2a77-4328-8d12-1ed7e7c383d7-log-httpd\") pod \"bdedfde8-2a77-4328-8d12-1ed7e7c383d7\" (UID: \"bdedfde8-2a77-4328-8d12-1ed7e7c383d7\") " Jan 31 09:22:47 crc kubenswrapper[4732]: I0131 09:22:47.710585 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bdedfde8-2a77-4328-8d12-1ed7e7c383d7-internal-tls-certs\") pod \"bdedfde8-2a77-4328-8d12-1ed7e7c383d7\" (UID: \"bdedfde8-2a77-4328-8d12-1ed7e7c383d7\") " Jan 31 09:22:47 crc kubenswrapper[4732]: I0131 09:22:47.710896 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bdedfde8-2a77-4328-8d12-1ed7e7c383d7-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "bdedfde8-2a77-4328-8d12-1ed7e7c383d7" (UID: "bdedfde8-2a77-4328-8d12-1ed7e7c383d7"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:22:47 crc kubenswrapper[4732]: I0131 09:22:47.711246 4732 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bdedfde8-2a77-4328-8d12-1ed7e7c383d7-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 31 09:22:47 crc kubenswrapper[4732]: I0131 09:22:47.711262 4732 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bdedfde8-2a77-4328-8d12-1ed7e7c383d7-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 31 09:22:47 crc kubenswrapper[4732]: I0131 09:22:47.715420 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bdedfde8-2a77-4328-8d12-1ed7e7c383d7-kube-api-access-nmr65" (OuterVolumeSpecName: "kube-api-access-nmr65") pod "bdedfde8-2a77-4328-8d12-1ed7e7c383d7" (UID: "bdedfde8-2a77-4328-8d12-1ed7e7c383d7"). InnerVolumeSpecName "kube-api-access-nmr65". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:22:47 crc kubenswrapper[4732]: I0131 09:22:47.728855 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bdedfde8-2a77-4328-8d12-1ed7e7c383d7-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "bdedfde8-2a77-4328-8d12-1ed7e7c383d7" (UID: "bdedfde8-2a77-4328-8d12-1ed7e7c383d7"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:22:47 crc kubenswrapper[4732]: I0131 09:22:47.771913 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdedfde8-2a77-4328-8d12-1ed7e7c383d7-config-data" (OuterVolumeSpecName: "config-data") pod "bdedfde8-2a77-4328-8d12-1ed7e7c383d7" (UID: "bdedfde8-2a77-4328-8d12-1ed7e7c383d7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:22:47 crc kubenswrapper[4732]: I0131 09:22:47.775445 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdedfde8-2a77-4328-8d12-1ed7e7c383d7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bdedfde8-2a77-4328-8d12-1ed7e7c383d7" (UID: "bdedfde8-2a77-4328-8d12-1ed7e7c383d7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:22:47 crc kubenswrapper[4732]: I0131 09:22:47.778780 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdedfde8-2a77-4328-8d12-1ed7e7c383d7-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "bdedfde8-2a77-4328-8d12-1ed7e7c383d7" (UID: "bdedfde8-2a77-4328-8d12-1ed7e7c383d7"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:22:47 crc kubenswrapper[4732]: I0131 09:22:47.812102 4732 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/bdedfde8-2a77-4328-8d12-1ed7e7c383d7-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 31 09:22:47 crc kubenswrapper[4732]: I0131 09:22:47.812136 4732 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bdedfde8-2a77-4328-8d12-1ed7e7c383d7-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 31 09:22:47 crc kubenswrapper[4732]: I0131 09:22:47.812148 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdedfde8-2a77-4328-8d12-1ed7e7c383d7-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 09:22:47 crc kubenswrapper[4732]: I0131 09:22:47.812159 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nmr65\" (UniqueName: \"kubernetes.io/projected/bdedfde8-2a77-4328-8d12-1ed7e7c383d7-kube-api-access-nmr65\") on node \"crc\" DevicePath \"\"" Jan 31 09:22:47 crc kubenswrapper[4732]: I0131 09:22:47.812168 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdedfde8-2a77-4328-8d12-1ed7e7c383d7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:22:47 crc kubenswrapper[4732]: I0131 09:22:47.814090 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdedfde8-2a77-4328-8d12-1ed7e7c383d7-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "bdedfde8-2a77-4328-8d12-1ed7e7c383d7" (UID: "bdedfde8-2a77-4328-8d12-1ed7e7c383d7"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:22:47 crc kubenswrapper[4732]: I0131 09:22:47.913883 4732 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bdedfde8-2a77-4328-8d12-1ed7e7c383d7-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 31 09:22:48 crc kubenswrapper[4732]: I0131 09:22:48.325565 4732 generic.go:334] "Generic (PLEG): container finished" podID="bdedfde8-2a77-4328-8d12-1ed7e7c383d7" containerID="474b94ec58984f199d9081036362611a027ce433aaf4a25491214d41cf2633fe" exitCode=0 Jan 31 09:22:48 crc kubenswrapper[4732]: I0131 09:22:48.325607 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-proxy-5c474fc7f4-zrlwm" event={"ID":"bdedfde8-2a77-4328-8d12-1ed7e7c383d7","Type":"ContainerDied","Data":"474b94ec58984f199d9081036362611a027ce433aaf4a25491214d41cf2633fe"} Jan 31 09:22:48 crc kubenswrapper[4732]: I0131 09:22:48.325638 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-proxy-5c474fc7f4-zrlwm" event={"ID":"bdedfde8-2a77-4328-8d12-1ed7e7c383d7","Type":"ContainerDied","Data":"1f6dfd5148a93a535a200ac9b68a844f393cc7142eb0dbb00e04c6a279302eea"} Jan 31 09:22:48 crc kubenswrapper[4732]: I0131 09:22:48.325685 4732 scope.go:117] "RemoveContainer" containerID="474b94ec58984f199d9081036362611a027ce433aaf4a25491214d41cf2633fe" Jan 31 09:22:48 crc kubenswrapper[4732]: I0131 09:22:48.325870 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-proxy-5c474fc7f4-zrlwm" Jan 31 09:22:48 crc kubenswrapper[4732]: I0131 09:22:48.358543 4732 scope.go:117] "RemoveContainer" containerID="ec8ba38d54836eb416cdf4f64a5064c1d48567987a1990cab21c63cec5dcea69" Jan 31 09:22:48 crc kubenswrapper[4732]: I0131 09:22:48.361942 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-proxy-5c474fc7f4-zrlwm"] Jan 31 09:22:48 crc kubenswrapper[4732]: I0131 09:22:48.368596 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-proxy-5c474fc7f4-zrlwm"] Jan 31 09:22:48 crc kubenswrapper[4732]: I0131 09:22:48.377746 4732 scope.go:117] "RemoveContainer" containerID="474b94ec58984f199d9081036362611a027ce433aaf4a25491214d41cf2633fe" Jan 31 09:22:48 crc kubenswrapper[4732]: E0131 09:22:48.378326 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"474b94ec58984f199d9081036362611a027ce433aaf4a25491214d41cf2633fe\": container with ID starting with 474b94ec58984f199d9081036362611a027ce433aaf4a25491214d41cf2633fe not found: ID does not exist" containerID="474b94ec58984f199d9081036362611a027ce433aaf4a25491214d41cf2633fe" Jan 31 09:22:48 crc kubenswrapper[4732]: I0131 09:22:48.378393 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"474b94ec58984f199d9081036362611a027ce433aaf4a25491214d41cf2633fe"} err="failed to get container status \"474b94ec58984f199d9081036362611a027ce433aaf4a25491214d41cf2633fe\": rpc error: code = NotFound desc = could not find container \"474b94ec58984f199d9081036362611a027ce433aaf4a25491214d41cf2633fe\": container with ID starting with 474b94ec58984f199d9081036362611a027ce433aaf4a25491214d41cf2633fe not found: ID does not exist" Jan 31 09:22:48 crc kubenswrapper[4732]: I0131 09:22:48.378436 4732 scope.go:117] "RemoveContainer" containerID="ec8ba38d54836eb416cdf4f64a5064c1d48567987a1990cab21c63cec5dcea69" Jan 31 09:22:48 
crc kubenswrapper[4732]: E0131 09:22:48.378966 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec8ba38d54836eb416cdf4f64a5064c1d48567987a1990cab21c63cec5dcea69\": container with ID starting with ec8ba38d54836eb416cdf4f64a5064c1d48567987a1990cab21c63cec5dcea69 not found: ID does not exist" containerID="ec8ba38d54836eb416cdf4f64a5064c1d48567987a1990cab21c63cec5dcea69" Jan 31 09:22:48 crc kubenswrapper[4732]: I0131 09:22:48.378997 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec8ba38d54836eb416cdf4f64a5064c1d48567987a1990cab21c63cec5dcea69"} err="failed to get container status \"ec8ba38d54836eb416cdf4f64a5064c1d48567987a1990cab21c63cec5dcea69\": rpc error: code = NotFound desc = could not find container \"ec8ba38d54836eb416cdf4f64a5064c1d48567987a1990cab21c63cec5dcea69\": container with ID starting with ec8ba38d54836eb416cdf4f64a5064c1d48567987a1990cab21c63cec5dcea69 not found: ID does not exist" Jan 31 09:22:48 crc kubenswrapper[4732]: I0131 09:22:48.556412 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84cedd57-5030-425a-8567-ceeda6aa0109" path="/var/lib/kubelet/pods/84cedd57-5030-425a-8567-ceeda6aa0109/volumes" Jan 31 09:22:48 crc kubenswrapper[4732]: I0131 09:22:48.557545 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bdedfde8-2a77-4328-8d12-1ed7e7c383d7" path="/var/lib/kubelet/pods/bdedfde8-2a77-4328-8d12-1ed7e7c383d7/volumes" Jan 31 09:23:04 crc kubenswrapper[4732]: I0131 09:23:04.176241 4732 scope.go:117] "RemoveContainer" containerID="ee0c5fb66ad6b1e6b4f42d6ec9cbbd6de131ce1fe2d36272c309841e4eaa9b41" Jan 31 09:23:04 crc kubenswrapper[4732]: I0131 09:23:04.197838 4732 scope.go:117] "RemoveContainer" containerID="a84f6527d8e718a0c39029fc384c436e9dc135176afd05bca86ac5610704533f" Jan 31 09:23:04 crc kubenswrapper[4732]: I0131 09:23:04.215719 4732 scope.go:117] "RemoveContainer" containerID="1fa43d1eeb8c60e26dee8d78abfd035753a71d50bc43bc0b9b3bff5eca6e2540" Jan 31 09:23:16 crc kubenswrapper[4732]: I0131 09:23:16.962773 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:23:17 crc kubenswrapper[4732]: I0131 09:23:17.052554 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ba8f0576-6adb-407c-b8e0-e4b04f0d47e3\" (UID: \"ba8f0576-6adb-407c-b8e0-e4b04f0d47e3\") " Jan 31 09:23:17 crc kubenswrapper[4732]: I0131 09:23:17.052653 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba8f0576-6adb-407c-b8e0-e4b04f0d47e3-combined-ca-bundle\") pod \"ba8f0576-6adb-407c-b8e0-e4b04f0d47e3\" (UID: \"ba8f0576-6adb-407c-b8e0-e4b04f0d47e3\") " Jan 31 09:23:17 crc kubenswrapper[4732]: I0131 09:23:17.052787 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ba8f0576-6adb-407c-b8e0-e4b04f0d47e3-etc-swift\") pod \"ba8f0576-6adb-407c-b8e0-e4b04f0d47e3\" (UID: \"ba8f0576-6adb-407c-b8e0-e4b04f0d47e3\") " Jan 31 09:23:17 crc kubenswrapper[4732]: I0131 09:23:17.052826 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7sjw2\" (UniqueName: \"kubernetes.io/projected/ba8f0576-6adb-407c-b8e0-e4b04f0d47e3-kube-api-access-7sjw2\") pod \"ba8f0576-6adb-407c-b8e0-e4b04f0d47e3\" (UID: \"ba8f0576-6adb-407c-b8e0-e4b04f0d47e3\") " Jan 31 09:23:17 crc kubenswrapper[4732]: I0131 09:23:17.052855 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/ba8f0576-6adb-407c-b8e0-e4b04f0d47e3-cache\") pod \"ba8f0576-6adb-407c-b8e0-e4b04f0d47e3\" (UID: \"ba8f0576-6adb-407c-b8e0-e4b04f0d47e3\") " Jan 31 09:23:17 crc kubenswrapper[4732]: I0131 09:23:17.052929 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/ba8f0576-6adb-407c-b8e0-e4b04f0d47e3-lock\") pod \"ba8f0576-6adb-407c-b8e0-e4b04f0d47e3\" (UID: \"ba8f0576-6adb-407c-b8e0-e4b04f0d47e3\") " Jan 31 09:23:17 crc kubenswrapper[4732]: I0131 09:23:17.053766 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba8f0576-6adb-407c-b8e0-e4b04f0d47e3-cache" (OuterVolumeSpecName: "cache") pod "ba8f0576-6adb-407c-b8e0-e4b04f0d47e3" (UID: "ba8f0576-6adb-407c-b8e0-e4b04f0d47e3"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:23:17 crc kubenswrapper[4732]: I0131 09:23:17.053981 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba8f0576-6adb-407c-b8e0-e4b04f0d47e3-lock" (OuterVolumeSpecName: "lock") pod "ba8f0576-6adb-407c-b8e0-e4b04f0d47e3" (UID: "ba8f0576-6adb-407c-b8e0-e4b04f0d47e3"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:23:17 crc kubenswrapper[4732]: I0131 09:23:17.058641 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba8f0576-6adb-407c-b8e0-e4b04f0d47e3-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "ba8f0576-6adb-407c-b8e0-e4b04f0d47e3" (UID: "ba8f0576-6adb-407c-b8e0-e4b04f0d47e3"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:23:17 crc kubenswrapper[4732]: I0131 09:23:17.059120 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba8f0576-6adb-407c-b8e0-e4b04f0d47e3-kube-api-access-7sjw2" (OuterVolumeSpecName: "kube-api-access-7sjw2") pod "ba8f0576-6adb-407c-b8e0-e4b04f0d47e3" (UID: "ba8f0576-6adb-407c-b8e0-e4b04f0d47e3"). InnerVolumeSpecName "kube-api-access-7sjw2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:23:17 crc kubenswrapper[4732]: I0131 09:23:17.062748 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "swift") pod "ba8f0576-6adb-407c-b8e0-e4b04f0d47e3" (UID: "ba8f0576-6adb-407c-b8e0-e4b04f0d47e3"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 09:23:17 crc kubenswrapper[4732]: I0131 09:23:17.155224 4732 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ba8f0576-6adb-407c-b8e0-e4b04f0d47e3-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 31 09:23:17 crc kubenswrapper[4732]: I0131 09:23:17.155262 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7sjw2\" (UniqueName: \"kubernetes.io/projected/ba8f0576-6adb-407c-b8e0-e4b04f0d47e3-kube-api-access-7sjw2\") on node \"crc\" DevicePath \"\"" Jan 31 09:23:17 crc kubenswrapper[4732]: I0131 09:23:17.155273 4732 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/ba8f0576-6adb-407c-b8e0-e4b04f0d47e3-cache\") on node \"crc\" DevicePath \"\"" Jan 31 09:23:17 crc kubenswrapper[4732]: I0131 09:23:17.155282 4732 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/ba8f0576-6adb-407c-b8e0-e4b04f0d47e3-lock\") on node \"crc\" DevicePath \"\"" Jan 31 09:23:17 crc kubenswrapper[4732]: I0131 09:23:17.155310 4732 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Jan 31 09:23:17 crc kubenswrapper[4732]: I0131 09:23:17.168902 4732 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Jan 31 09:23:17 crc kubenswrapper[4732]: I0131 09:23:17.256542 4732 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Jan 31 09:23:17 crc kubenswrapper[4732]: I0131 09:23:17.280462 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba8f0576-6adb-407c-b8e0-e4b04f0d47e3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ba8f0576-6adb-407c-b8e0-e4b04f0d47e3" (UID: "ba8f0576-6adb-407c-b8e0-e4b04f0d47e3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:23:17 crc kubenswrapper[4732]: I0131 09:23:17.358075 4732 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba8f0576-6adb-407c-b8e0-e4b04f0d47e3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 09:23:17 crc kubenswrapper[4732]: I0131 09:23:17.498304 4732 patch_prober.go:28] interesting pod/machine-config-daemon-jnbt8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 09:23:17 crc kubenswrapper[4732]: I0131 09:23:17.498378 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" podUID="7d790207-d357-4b47-87bf-5b505e061820" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 09:23:17 crc kubenswrapper[4732]: I0131 09:23:17.498429 4732 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" Jan 31 09:23:17 crc kubenswrapper[4732]: I0131 09:23:17.499149 4732 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"777b6bb11b5556f90e1c2a08822928a50217112bcd9efce47de0d5e1a98e3392"} pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 31 09:23:17 crc kubenswrapper[4732]: I0131 09:23:17.499206 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" podUID="7d790207-d357-4b47-87bf-5b505e061820" containerName="machine-config-daemon" containerID="cri-o://777b6bb11b5556f90e1c2a08822928a50217112bcd9efce47de0d5e1a98e3392" gracePeriod=600 Jan 31 09:23:17 crc kubenswrapper[4732]: I0131 09:23:17.583529 4732 generic.go:334] "Generic (PLEG): container finished" podID="ba8f0576-6adb-407c-b8e0-e4b04f0d47e3" containerID="21e962a9a2a9342a7ea4a394d67dccb2bcba098409718fcc20c041b19f8e8aff" exitCode=137 Jan 31 09:23:17 crc kubenswrapper[4732]: I0131 09:23:17.583573 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"ba8f0576-6adb-407c-b8e0-e4b04f0d47e3","Type":"ContainerDied","Data":"21e962a9a2a9342a7ea4a394d67dccb2bcba098409718fcc20c041b19f8e8aff"} Jan 31 09:23:17 crc kubenswrapper[4732]: I0131 09:23:17.583603 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"ba8f0576-6adb-407c-b8e0-e4b04f0d47e3","Type":"ContainerDied","Data":"db289e5d4324b07ffd624072a8984010d2c9a44d1db929557bee6f1a1928fda3"} Jan 31 09:23:17 crc kubenswrapper[4732]: I0131 09:23:17.583624 4732 scope.go:117] "RemoveContainer" containerID="21e962a9a2a9342a7ea4a394d67dccb2bcba098409718fcc20c041b19f8e8aff" Jan 31 09:23:17 crc kubenswrapper[4732]: I0131 09:23:17.583825 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-storage-0" Jan 31 09:23:17 crc kubenswrapper[4732]: I0131 09:23:17.600040 4732 scope.go:117] "RemoveContainer" containerID="5274a5fdf883aabb8e1459ab78b419e45850c93bd25d640d9b574e423e14891c" Jan 31 09:23:17 crc kubenswrapper[4732]: I0131 09:23:17.624620 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-storage-0"] Jan 31 09:23:17 crc kubenswrapper[4732]: I0131 09:23:17.633370 4732 scope.go:117] "RemoveContainer" containerID="d2fec506922e973332f3fc9bd8c7f3a11bbfe136d64d26f49002b37bbb05e717" Jan 31 09:23:17 crc kubenswrapper[4732]: I0131 09:23:17.638894 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-storage-0"] Jan 31 09:23:17 crc kubenswrapper[4732]: I0131 09:23:17.651185 4732 scope.go:117] "RemoveContainer" containerID="17c4084b4095035a8bb4f76d7014f2798296fac02e063cfc7abffe90d8b7294f" Jan 31 09:23:17 crc kubenswrapper[4732]: I0131 09:23:17.672287 4732 scope.go:117] "RemoveContainer" containerID="08aa6d73b6ce55a5fd4af30b26217cd60a1641aca38733b3a337dfbea21a313c" Jan 31 09:23:17 crc kubenswrapper[4732]: I0131 09:23:17.734459 4732 scope.go:117] "RemoveContainer" containerID="78c47a8e4097580d0d62fedad615c2e7de129377d57e4dde3ee9b8817b3ebe4f" Jan 31 09:23:17 crc kubenswrapper[4732]: I0131 09:23:17.752441 4732 scope.go:117] "RemoveContainer" containerID="a9463f09258cd249e7ac5eeabeac6f271d8661141768b16e79dc29de794bbfee" Jan 31 09:23:17 crc kubenswrapper[4732]: I0131 09:23:17.782778 4732 scope.go:117] "RemoveContainer" containerID="92b05acc8700ea245d97120676710a498d03141bf1d8600bf5a37e131c2283ee" Jan 31 09:23:17 crc kubenswrapper[4732]: I0131 09:23:17.799511 4732 scope.go:117] "RemoveContainer" containerID="c2c74e7d8a1ebd17ed649b720d8fb589be94684e535f53dd916a2699096a6a03" Jan 31 09:23:17 crc kubenswrapper[4732]: I0131 09:23:17.814884 4732 scope.go:117] "RemoveContainer" containerID="4b19ce1044930f4e1d8867e77cdf09526d8f2b3e3900dc1f91071adaf1162e32" Jan 31 09:23:17 crc kubenswrapper[4732]: I0131 09:23:17.833229 4732 scope.go:117] "RemoveContainer" containerID="f3b6deecf746ea640b42712bfb03bb8b827bed6b84301b438ba02ffe4845a922" Jan 31 09:23:17 crc kubenswrapper[4732]: I0131 09:23:17.847151 4732 scope.go:117] "RemoveContainer" containerID="13d7343d1751fb8940398ae0071750f1f6117ba7b59e7dfa6990eb52a895303e" Jan 31 09:23:17 crc kubenswrapper[4732]: I0131 09:23:17.868081 4732 scope.go:117] "RemoveContainer" containerID="3ebca277c4b203371cd9fe26b95bfe03d230345a47fb88a2eff217b2ba6b95b9" Jan 31 09:23:17 crc kubenswrapper[4732]: I0131 09:23:17.887445 4732 scope.go:117] "RemoveContainer" containerID="a26623b23e379dae3834fe0d0647fe0c758307c675c5178e21ed5e70d77db572" Jan 31 09:23:17 crc kubenswrapper[4732]: I0131 09:23:17.903694 4732 scope.go:117] "RemoveContainer" containerID="cd4637a4eac10985ea1fdfa77841cc57b4b13086ee7a639ec9095e4b5fe6443f" Jan 31 09:23:17 crc kubenswrapper[4732]: I0131 09:23:17.923616 4732 scope.go:117] "RemoveContainer" containerID="21e962a9a2a9342a7ea4a394d67dccb2bcba098409718fcc20c041b19f8e8aff" Jan 31 09:23:17 crc kubenswrapper[4732]: E0131 09:23:17.923995 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21e962a9a2a9342a7ea4a394d67dccb2bcba098409718fcc20c041b19f8e8aff\": container with ID starting with 21e962a9a2a9342a7ea4a394d67dccb2bcba098409718fcc20c041b19f8e8aff not found: ID does not exist" containerID="21e962a9a2a9342a7ea4a394d67dccb2bcba098409718fcc20c041b19f8e8aff" Jan 31 09:23:17 crc 
kubenswrapper[4732]: I0131 09:23:17.924026 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21e962a9a2a9342a7ea4a394d67dccb2bcba098409718fcc20c041b19f8e8aff"} err="failed to get container status \"21e962a9a2a9342a7ea4a394d67dccb2bcba098409718fcc20c041b19f8e8aff\": rpc error: code = NotFound desc = could not find container \"21e962a9a2a9342a7ea4a394d67dccb2bcba098409718fcc20c041b19f8e8aff\": container with ID starting with 21e962a9a2a9342a7ea4a394d67dccb2bcba098409718fcc20c041b19f8e8aff not found: ID does not exist" Jan 31 09:23:17 crc kubenswrapper[4732]: I0131 09:23:17.924048 4732 scope.go:117] "RemoveContainer" containerID="5274a5fdf883aabb8e1459ab78b419e45850c93bd25d640d9b574e423e14891c" Jan 31 09:23:17 crc kubenswrapper[4732]: E0131 09:23:17.924262 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5274a5fdf883aabb8e1459ab78b419e45850c93bd25d640d9b574e423e14891c\": container with ID starting with 5274a5fdf883aabb8e1459ab78b419e45850c93bd25d640d9b574e423e14891c not found: ID does not exist" containerID="5274a5fdf883aabb8e1459ab78b419e45850c93bd25d640d9b574e423e14891c" Jan 31 09:23:17 crc kubenswrapper[4732]: I0131 09:23:17.924293 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5274a5fdf883aabb8e1459ab78b419e45850c93bd25d640d9b574e423e14891c"} err="failed to get container status \"5274a5fdf883aabb8e1459ab78b419e45850c93bd25d640d9b574e423e14891c\": rpc error: code = NotFound desc = could not find container \"5274a5fdf883aabb8e1459ab78b419e45850c93bd25d640d9b574e423e14891c\": container with ID starting with 5274a5fdf883aabb8e1459ab78b419e45850c93bd25d640d9b574e423e14891c not found: ID does not exist" Jan 31 09:23:17 crc kubenswrapper[4732]: I0131 09:23:17.924312 4732 scope.go:117] "RemoveContainer" containerID="d2fec506922e973332f3fc9bd8c7f3a11bbfe136d64d26f49002b37bbb05e717" Jan 31 09:23:17 crc kubenswrapper[4732]: E0131 09:23:17.924517 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2fec506922e973332f3fc9bd8c7f3a11bbfe136d64d26f49002b37bbb05e717\": container with ID starting with d2fec506922e973332f3fc9bd8c7f3a11bbfe136d64d26f49002b37bbb05e717 not found: ID does not exist" containerID="d2fec506922e973332f3fc9bd8c7f3a11bbfe136d64d26f49002b37bbb05e717" Jan 31 09:23:17 crc kubenswrapper[4732]: I0131 09:23:17.924548 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2fec506922e973332f3fc9bd8c7f3a11bbfe136d64d26f49002b37bbb05e717"} err="failed to get container status \"d2fec506922e973332f3fc9bd8c7f3a11bbfe136d64d26f49002b37bbb05e717\": rpc error: code = NotFound desc = could not find container \"d2fec506922e973332f3fc9bd8c7f3a11bbfe136d64d26f49002b37bbb05e717\": container with ID starting with d2fec506922e973332f3fc9bd8c7f3a11bbfe136d64d26f49002b37bbb05e717 not found: ID does not exist" Jan 31 09:23:17 crc kubenswrapper[4732]: I0131 09:23:17.924570 4732 scope.go:117] "RemoveContainer" containerID="17c4084b4095035a8bb4f76d7014f2798296fac02e063cfc7abffe90d8b7294f" Jan 31 09:23:17 crc kubenswrapper[4732]: E0131 09:23:17.924761 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17c4084b4095035a8bb4f76d7014f2798296fac02e063cfc7abffe90d8b7294f\": container with ID starting with 
17c4084b4095035a8bb4f76d7014f2798296fac02e063cfc7abffe90d8b7294f not found: ID does not exist" containerID="17c4084b4095035a8bb4f76d7014f2798296fac02e063cfc7abffe90d8b7294f" Jan 31 09:23:17 crc kubenswrapper[4732]: I0131 09:23:17.924782 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17c4084b4095035a8bb4f76d7014f2798296fac02e063cfc7abffe90d8b7294f"} err="failed to get container status \"17c4084b4095035a8bb4f76d7014f2798296fac02e063cfc7abffe90d8b7294f\": rpc error: code = NotFound desc = could not find container \"17c4084b4095035a8bb4f76d7014f2798296fac02e063cfc7abffe90d8b7294f\": container with ID starting with 17c4084b4095035a8bb4f76d7014f2798296fac02e063cfc7abffe90d8b7294f not found: ID does not exist" Jan 31 09:23:17 crc kubenswrapper[4732]: I0131 09:23:17.924795 4732 scope.go:117] "RemoveContainer" containerID="08aa6d73b6ce55a5fd4af30b26217cd60a1641aca38733b3a337dfbea21a313c" Jan 31 09:23:17 crc kubenswrapper[4732]: E0131 09:23:17.924962 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08aa6d73b6ce55a5fd4af30b26217cd60a1641aca38733b3a337dfbea21a313c\": container with ID starting with 08aa6d73b6ce55a5fd4af30b26217cd60a1641aca38733b3a337dfbea21a313c not found: ID does not exist" containerID="08aa6d73b6ce55a5fd4af30b26217cd60a1641aca38733b3a337dfbea21a313c" Jan 31 09:23:17 crc kubenswrapper[4732]: I0131 09:23:17.924984 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08aa6d73b6ce55a5fd4af30b26217cd60a1641aca38733b3a337dfbea21a313c"} err="failed to get container status \"08aa6d73b6ce55a5fd4af30b26217cd60a1641aca38733b3a337dfbea21a313c\": rpc error: code = NotFound desc = could not find container \"08aa6d73b6ce55a5fd4af30b26217cd60a1641aca38733b3a337dfbea21a313c\": container with ID starting with 08aa6d73b6ce55a5fd4af30b26217cd60a1641aca38733b3a337dfbea21a313c not found: ID does not exist" Jan 31 09:23:17 crc kubenswrapper[4732]: I0131 09:23:17.924996 4732 scope.go:117] "RemoveContainer" containerID="78c47a8e4097580d0d62fedad615c2e7de129377d57e4dde3ee9b8817b3ebe4f" Jan 31 09:23:17 crc kubenswrapper[4732]: E0131 09:23:17.925148 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78c47a8e4097580d0d62fedad615c2e7de129377d57e4dde3ee9b8817b3ebe4f\": container with ID starting with 78c47a8e4097580d0d62fedad615c2e7de129377d57e4dde3ee9b8817b3ebe4f not found: ID does not exist" containerID="78c47a8e4097580d0d62fedad615c2e7de129377d57e4dde3ee9b8817b3ebe4f" Jan 31 09:23:17 crc kubenswrapper[4732]: I0131 09:23:17.925168 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78c47a8e4097580d0d62fedad615c2e7de129377d57e4dde3ee9b8817b3ebe4f"} err="failed to get container status \"78c47a8e4097580d0d62fedad615c2e7de129377d57e4dde3ee9b8817b3ebe4f\": rpc error: code = NotFound desc = could not find container \"78c47a8e4097580d0d62fedad615c2e7de129377d57e4dde3ee9b8817b3ebe4f\": container with ID starting with 78c47a8e4097580d0d62fedad615c2e7de129377d57e4dde3ee9b8817b3ebe4f not found: ID does not exist" Jan 31 09:23:17 crc kubenswrapper[4732]: I0131 09:23:17.925179 4732 scope.go:117] "RemoveContainer" containerID="a9463f09258cd249e7ac5eeabeac6f271d8661141768b16e79dc29de794bbfee" Jan 31 09:23:17 crc kubenswrapper[4732]: E0131 09:23:17.925520 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc 
error: code = NotFound desc = could not find container \"a9463f09258cd249e7ac5eeabeac6f271d8661141768b16e79dc29de794bbfee\": container with ID starting with a9463f09258cd249e7ac5eeabeac6f271d8661141768b16e79dc29de794bbfee not found: ID does not exist" containerID="a9463f09258cd249e7ac5eeabeac6f271d8661141768b16e79dc29de794bbfee" Jan 31 09:23:17 crc kubenswrapper[4732]: I0131 09:23:17.925542 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9463f09258cd249e7ac5eeabeac6f271d8661141768b16e79dc29de794bbfee"} err="failed to get container status \"a9463f09258cd249e7ac5eeabeac6f271d8661141768b16e79dc29de794bbfee\": rpc error: code = NotFound desc = could not find container \"a9463f09258cd249e7ac5eeabeac6f271d8661141768b16e79dc29de794bbfee\": container with ID starting with a9463f09258cd249e7ac5eeabeac6f271d8661141768b16e79dc29de794bbfee not found: ID does not exist" Jan 31 09:23:17 crc kubenswrapper[4732]: I0131 09:23:17.925555 4732 scope.go:117] "RemoveContainer" containerID="92b05acc8700ea245d97120676710a498d03141bf1d8600bf5a37e131c2283ee" Jan 31 09:23:17 crc kubenswrapper[4732]: E0131 09:23:17.925756 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92b05acc8700ea245d97120676710a498d03141bf1d8600bf5a37e131c2283ee\": container with ID starting with 92b05acc8700ea245d97120676710a498d03141bf1d8600bf5a37e131c2283ee not found: ID does not exist" containerID="92b05acc8700ea245d97120676710a498d03141bf1d8600bf5a37e131c2283ee" Jan 31 09:23:17 crc kubenswrapper[4732]: I0131 09:23:17.925776 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92b05acc8700ea245d97120676710a498d03141bf1d8600bf5a37e131c2283ee"} err="failed to get container status \"92b05acc8700ea245d97120676710a498d03141bf1d8600bf5a37e131c2283ee\": rpc error: code = NotFound desc = could not find container \"92b05acc8700ea245d97120676710a498d03141bf1d8600bf5a37e131c2283ee\": container with ID starting with 92b05acc8700ea245d97120676710a498d03141bf1d8600bf5a37e131c2283ee not found: ID does not exist" Jan 31 09:23:17 crc kubenswrapper[4732]: I0131 09:23:17.925788 4732 scope.go:117] "RemoveContainer" containerID="c2c74e7d8a1ebd17ed649b720d8fb589be94684e535f53dd916a2699096a6a03" Jan 31 09:23:17 crc kubenswrapper[4732]: E0131 09:23:17.925954 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2c74e7d8a1ebd17ed649b720d8fb589be94684e535f53dd916a2699096a6a03\": container with ID starting with c2c74e7d8a1ebd17ed649b720d8fb589be94684e535f53dd916a2699096a6a03 not found: ID does not exist" containerID="c2c74e7d8a1ebd17ed649b720d8fb589be94684e535f53dd916a2699096a6a03" Jan 31 09:23:17 crc kubenswrapper[4732]: I0131 09:23:17.925975 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2c74e7d8a1ebd17ed649b720d8fb589be94684e535f53dd916a2699096a6a03"} err="failed to get container status \"c2c74e7d8a1ebd17ed649b720d8fb589be94684e535f53dd916a2699096a6a03\": rpc error: code = NotFound desc = could not find container \"c2c74e7d8a1ebd17ed649b720d8fb589be94684e535f53dd916a2699096a6a03\": container with ID starting with c2c74e7d8a1ebd17ed649b720d8fb589be94684e535f53dd916a2699096a6a03 not found: ID does not exist" Jan 31 09:23:17 crc kubenswrapper[4732]: I0131 09:23:17.925993 4732 scope.go:117] "RemoveContainer" 
containerID="4b19ce1044930f4e1d8867e77cdf09526d8f2b3e3900dc1f91071adaf1162e32" Jan 31 09:23:17 crc kubenswrapper[4732]: E0131 09:23:17.926161 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b19ce1044930f4e1d8867e77cdf09526d8f2b3e3900dc1f91071adaf1162e32\": container with ID starting with 4b19ce1044930f4e1d8867e77cdf09526d8f2b3e3900dc1f91071adaf1162e32 not found: ID does not exist" containerID="4b19ce1044930f4e1d8867e77cdf09526d8f2b3e3900dc1f91071adaf1162e32" Jan 31 09:23:17 crc kubenswrapper[4732]: I0131 09:23:17.926184 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b19ce1044930f4e1d8867e77cdf09526d8f2b3e3900dc1f91071adaf1162e32"} err="failed to get container status \"4b19ce1044930f4e1d8867e77cdf09526d8f2b3e3900dc1f91071adaf1162e32\": rpc error: code = NotFound desc = could not find container \"4b19ce1044930f4e1d8867e77cdf09526d8f2b3e3900dc1f91071adaf1162e32\": container with ID starting with 4b19ce1044930f4e1d8867e77cdf09526d8f2b3e3900dc1f91071adaf1162e32 not found: ID does not exist" Jan 31 09:23:17 crc kubenswrapper[4732]: I0131 09:23:17.926199 4732 scope.go:117] "RemoveContainer" containerID="f3b6deecf746ea640b42712bfb03bb8b827bed6b84301b438ba02ffe4845a922" Jan 31 09:23:17 crc kubenswrapper[4732]: E0131 09:23:17.926369 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3b6deecf746ea640b42712bfb03bb8b827bed6b84301b438ba02ffe4845a922\": container with ID starting with f3b6deecf746ea640b42712bfb03bb8b827bed6b84301b438ba02ffe4845a922 not found: ID does not exist" containerID="f3b6deecf746ea640b42712bfb03bb8b827bed6b84301b438ba02ffe4845a922" Jan 31 09:23:17 crc kubenswrapper[4732]: I0131 09:23:17.926389 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3b6deecf746ea640b42712bfb03bb8b827bed6b84301b438ba02ffe4845a922"} err="failed to get container status \"f3b6deecf746ea640b42712bfb03bb8b827bed6b84301b438ba02ffe4845a922\": rpc error: code = NotFound desc = could not find container \"f3b6deecf746ea640b42712bfb03bb8b827bed6b84301b438ba02ffe4845a922\": container with ID starting with f3b6deecf746ea640b42712bfb03bb8b827bed6b84301b438ba02ffe4845a922 not found: ID does not exist" Jan 31 09:23:17 crc kubenswrapper[4732]: I0131 09:23:17.926404 4732 scope.go:117] "RemoveContainer" containerID="13d7343d1751fb8940398ae0071750f1f6117ba7b59e7dfa6990eb52a895303e" Jan 31 09:23:17 crc kubenswrapper[4732]: E0131 09:23:17.926580 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13d7343d1751fb8940398ae0071750f1f6117ba7b59e7dfa6990eb52a895303e\": container with ID starting with 13d7343d1751fb8940398ae0071750f1f6117ba7b59e7dfa6990eb52a895303e not found: ID does not exist" containerID="13d7343d1751fb8940398ae0071750f1f6117ba7b59e7dfa6990eb52a895303e" Jan 31 09:23:17 crc kubenswrapper[4732]: I0131 09:23:17.926599 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13d7343d1751fb8940398ae0071750f1f6117ba7b59e7dfa6990eb52a895303e"} err="failed to get container status \"13d7343d1751fb8940398ae0071750f1f6117ba7b59e7dfa6990eb52a895303e\": rpc error: code = NotFound desc = could not find container \"13d7343d1751fb8940398ae0071750f1f6117ba7b59e7dfa6990eb52a895303e\": container with ID starting with 
13d7343d1751fb8940398ae0071750f1f6117ba7b59e7dfa6990eb52a895303e not found: ID does not exist" Jan 31 09:23:17 crc kubenswrapper[4732]: I0131 09:23:17.926611 4732 scope.go:117] "RemoveContainer" containerID="3ebca277c4b203371cd9fe26b95bfe03d230345a47fb88a2eff217b2ba6b95b9" Jan 31 09:23:17 crc kubenswrapper[4732]: E0131 09:23:17.926777 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ebca277c4b203371cd9fe26b95bfe03d230345a47fb88a2eff217b2ba6b95b9\": container with ID starting with 3ebca277c4b203371cd9fe26b95bfe03d230345a47fb88a2eff217b2ba6b95b9 not found: ID does not exist" containerID="3ebca277c4b203371cd9fe26b95bfe03d230345a47fb88a2eff217b2ba6b95b9" Jan 31 09:23:17 crc kubenswrapper[4732]: I0131 09:23:17.926799 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ebca277c4b203371cd9fe26b95bfe03d230345a47fb88a2eff217b2ba6b95b9"} err="failed to get container status \"3ebca277c4b203371cd9fe26b95bfe03d230345a47fb88a2eff217b2ba6b95b9\": rpc error: code = NotFound desc = could not find container \"3ebca277c4b203371cd9fe26b95bfe03d230345a47fb88a2eff217b2ba6b95b9\": container with ID starting with 3ebca277c4b203371cd9fe26b95bfe03d230345a47fb88a2eff217b2ba6b95b9 not found: ID does not exist" Jan 31 09:23:17 crc kubenswrapper[4732]: I0131 09:23:17.926813 4732 scope.go:117] "RemoveContainer" containerID="a26623b23e379dae3834fe0d0647fe0c758307c675c5178e21ed5e70d77db572" Jan 31 09:23:17 crc kubenswrapper[4732]: E0131 09:23:17.927006 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a26623b23e379dae3834fe0d0647fe0c758307c675c5178e21ed5e70d77db572\": container with ID starting with a26623b23e379dae3834fe0d0647fe0c758307c675c5178e21ed5e70d77db572 not found: ID does not exist" containerID="a26623b23e379dae3834fe0d0647fe0c758307c675c5178e21ed5e70d77db572" Jan 31 09:23:17 crc kubenswrapper[4732]: I0131 09:23:17.927023 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a26623b23e379dae3834fe0d0647fe0c758307c675c5178e21ed5e70d77db572"} err="failed to get container status \"a26623b23e379dae3834fe0d0647fe0c758307c675c5178e21ed5e70d77db572\": rpc error: code = NotFound desc = could not find container \"a26623b23e379dae3834fe0d0647fe0c758307c675c5178e21ed5e70d77db572\": container with ID starting with a26623b23e379dae3834fe0d0647fe0c758307c675c5178e21ed5e70d77db572 not found: ID does not exist" Jan 31 09:23:17 crc kubenswrapper[4732]: I0131 09:23:17.927037 4732 scope.go:117] "RemoveContainer" containerID="cd4637a4eac10985ea1fdfa77841cc57b4b13086ee7a639ec9095e4b5fe6443f" Jan 31 09:23:17 crc kubenswrapper[4732]: E0131 09:23:17.927216 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd4637a4eac10985ea1fdfa77841cc57b4b13086ee7a639ec9095e4b5fe6443f\": container with ID starting with cd4637a4eac10985ea1fdfa77841cc57b4b13086ee7a639ec9095e4b5fe6443f not found: ID does not exist" containerID="cd4637a4eac10985ea1fdfa77841cc57b4b13086ee7a639ec9095e4b5fe6443f" Jan 31 09:23:17 crc kubenswrapper[4732]: I0131 09:23:17.927249 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd4637a4eac10985ea1fdfa77841cc57b4b13086ee7a639ec9095e4b5fe6443f"} err="failed to get container status \"cd4637a4eac10985ea1fdfa77841cc57b4b13086ee7a639ec9095e4b5fe6443f\": rpc 
error: code = NotFound desc = could not find container \"cd4637a4eac10985ea1fdfa77841cc57b4b13086ee7a639ec9095e4b5fe6443f\": container with ID starting with cd4637a4eac10985ea1fdfa77841cc57b4b13086ee7a639ec9095e4b5fe6443f not found: ID does not exist" Jan 31 09:23:18 crc kubenswrapper[4732]: I0131 09:23:18.555105 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba8f0576-6adb-407c-b8e0-e4b04f0d47e3" path="/var/lib/kubelet/pods/ba8f0576-6adb-407c-b8e0-e4b04f0d47e3/volumes" Jan 31 09:23:18 crc kubenswrapper[4732]: I0131 09:23:18.597515 4732 generic.go:334] "Generic (PLEG): container finished" podID="7d790207-d357-4b47-87bf-5b505e061820" containerID="777b6bb11b5556f90e1c2a08822928a50217112bcd9efce47de0d5e1a98e3392" exitCode=0 Jan 31 09:23:18 crc kubenswrapper[4732]: I0131 09:23:18.597564 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" event={"ID":"7d790207-d357-4b47-87bf-5b505e061820","Type":"ContainerDied","Data":"777b6bb11b5556f90e1c2a08822928a50217112bcd9efce47de0d5e1a98e3392"} Jan 31 09:23:18 crc kubenswrapper[4732]: I0131 09:23:18.597599 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" event={"ID":"7d790207-d357-4b47-87bf-5b505e061820","Type":"ContainerStarted","Data":"3dcf08eedc212c9c2a071b251d94a53241ef18f3503ef5e5c6a6b83f842c1e77"} Jan 31 09:23:18 crc kubenswrapper[4732]: I0131 09:23:18.597620 4732 scope.go:117] "RemoveContainer" containerID="99271a603de3a603b9be8d8f0bb791de0f202646de27403a5e3efc59790f637f" Jan 31 09:23:23 crc kubenswrapper[4732]: E0131 09:23:23.067677 4732 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.129.56.231:50112->38.129.56.231:32957: write tcp 38.129.56.231:50112->38.129.56.231:32957: write: broken pipe Jan 31 09:23:23 crc kubenswrapper[4732]: I0131 09:23:23.547574 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/barbican-db-sync-482fl"] Jan 31 09:23:23 crc kubenswrapper[4732]: I0131 09:23:23.560123 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/barbican-db-sync-482fl"] Jan 31 09:23:23 crc kubenswrapper[4732]: I0131 09:23:23.602446 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/barbicanf734-account-delete-g264z"] Jan 31 09:23:23 crc kubenswrapper[4732]: E0131 09:23:23.602717 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba8f0576-6adb-407c-b8e0-e4b04f0d47e3" containerName="container-auditor" Jan 31 09:23:23 crc kubenswrapper[4732]: I0131 09:23:23.602728 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba8f0576-6adb-407c-b8e0-e4b04f0d47e3" containerName="container-auditor" Jan 31 09:23:23 crc kubenswrapper[4732]: E0131 09:23:23.602746 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba8f0576-6adb-407c-b8e0-e4b04f0d47e3" containerName="rsync" Jan 31 09:23:23 crc kubenswrapper[4732]: I0131 09:23:23.602756 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba8f0576-6adb-407c-b8e0-e4b04f0d47e3" containerName="rsync" Jan 31 09:23:23 crc kubenswrapper[4732]: E0131 09:23:23.602766 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba8f0576-6adb-407c-b8e0-e4b04f0d47e3" containerName="account-reaper" Jan 31 09:23:23 crc kubenswrapper[4732]: I0131 09:23:23.602773 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba8f0576-6adb-407c-b8e0-e4b04f0d47e3" containerName="account-reaper" Jan 31 
09:23:23 crc kubenswrapper[4732]: E0131 09:23:23.602781 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba8f0576-6adb-407c-b8e0-e4b04f0d47e3" containerName="object-updater" Jan 31 09:23:23 crc kubenswrapper[4732]: I0131 09:23:23.602788 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba8f0576-6adb-407c-b8e0-e4b04f0d47e3" containerName="object-updater" Jan 31 09:23:23 crc kubenswrapper[4732]: E0131 09:23:23.602799 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba8f0576-6adb-407c-b8e0-e4b04f0d47e3" containerName="container-replicator" Jan 31 09:23:23 crc kubenswrapper[4732]: I0131 09:23:23.602807 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba8f0576-6adb-407c-b8e0-e4b04f0d47e3" containerName="container-replicator" Jan 31 09:23:23 crc kubenswrapper[4732]: E0131 09:23:23.602817 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba8f0576-6adb-407c-b8e0-e4b04f0d47e3" containerName="object-server" Jan 31 09:23:23 crc kubenswrapper[4732]: I0131 09:23:23.602822 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba8f0576-6adb-407c-b8e0-e4b04f0d47e3" containerName="object-server" Jan 31 09:23:23 crc kubenswrapper[4732]: E0131 09:23:23.602832 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba8f0576-6adb-407c-b8e0-e4b04f0d47e3" containerName="object-expirer" Jan 31 09:23:23 crc kubenswrapper[4732]: I0131 09:23:23.602838 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba8f0576-6adb-407c-b8e0-e4b04f0d47e3" containerName="object-expirer" Jan 31 09:23:23 crc kubenswrapper[4732]: E0131 09:23:23.602845 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdedfde8-2a77-4328-8d12-1ed7e7c383d7" containerName="proxy-httpd" Jan 31 09:23:23 crc kubenswrapper[4732]: I0131 09:23:23.602852 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdedfde8-2a77-4328-8d12-1ed7e7c383d7" containerName="proxy-httpd" Jan 31 09:23:23 crc kubenswrapper[4732]: E0131 09:23:23.602864 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba8f0576-6adb-407c-b8e0-e4b04f0d47e3" containerName="container-updater" Jan 31 09:23:23 crc kubenswrapper[4732]: I0131 09:23:23.602869 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba8f0576-6adb-407c-b8e0-e4b04f0d47e3" containerName="container-updater" Jan 31 09:23:23 crc kubenswrapper[4732]: E0131 09:23:23.602882 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba8f0576-6adb-407c-b8e0-e4b04f0d47e3" containerName="account-server" Jan 31 09:23:23 crc kubenswrapper[4732]: I0131 09:23:23.602887 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba8f0576-6adb-407c-b8e0-e4b04f0d47e3" containerName="account-server" Jan 31 09:23:23 crc kubenswrapper[4732]: E0131 09:23:23.602897 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba8f0576-6adb-407c-b8e0-e4b04f0d47e3" containerName="account-replicator" Jan 31 09:23:23 crc kubenswrapper[4732]: I0131 09:23:23.602903 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba8f0576-6adb-407c-b8e0-e4b04f0d47e3" containerName="account-replicator" Jan 31 09:23:23 crc kubenswrapper[4732]: E0131 09:23:23.602909 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba8f0576-6adb-407c-b8e0-e4b04f0d47e3" containerName="account-auditor" Jan 31 09:23:23 crc kubenswrapper[4732]: I0131 09:23:23.602915 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba8f0576-6adb-407c-b8e0-e4b04f0d47e3" 
containerName="account-auditor" Jan 31 09:23:23 crc kubenswrapper[4732]: E0131 09:23:23.602926 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba8f0576-6adb-407c-b8e0-e4b04f0d47e3" containerName="object-auditor" Jan 31 09:23:23 crc kubenswrapper[4732]: I0131 09:23:23.602931 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba8f0576-6adb-407c-b8e0-e4b04f0d47e3" containerName="object-auditor" Jan 31 09:23:23 crc kubenswrapper[4732]: E0131 09:23:23.602940 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdedfde8-2a77-4328-8d12-1ed7e7c383d7" containerName="proxy-server" Jan 31 09:23:23 crc kubenswrapper[4732]: I0131 09:23:23.602946 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdedfde8-2a77-4328-8d12-1ed7e7c383d7" containerName="proxy-server" Jan 31 09:23:23 crc kubenswrapper[4732]: E0131 09:23:23.602953 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba8f0576-6adb-407c-b8e0-e4b04f0d47e3" containerName="object-replicator" Jan 31 09:23:23 crc kubenswrapper[4732]: I0131 09:23:23.602959 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba8f0576-6adb-407c-b8e0-e4b04f0d47e3" containerName="object-replicator" Jan 31 09:23:23 crc kubenswrapper[4732]: E0131 09:23:23.602968 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba8f0576-6adb-407c-b8e0-e4b04f0d47e3" containerName="swift-recon-cron" Jan 31 09:23:23 crc kubenswrapper[4732]: I0131 09:23:23.602974 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba8f0576-6adb-407c-b8e0-e4b04f0d47e3" containerName="swift-recon-cron" Jan 31 09:23:23 crc kubenswrapper[4732]: E0131 09:23:23.602984 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba8f0576-6adb-407c-b8e0-e4b04f0d47e3" containerName="container-server" Jan 31 09:23:23 crc kubenswrapper[4732]: I0131 09:23:23.602990 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba8f0576-6adb-407c-b8e0-e4b04f0d47e3" containerName="container-server" Jan 31 09:23:23 crc kubenswrapper[4732]: E0131 09:23:23.602997 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84cedd57-5030-425a-8567-ceeda6aa0109" containerName="swift-ring-rebalance" Jan 31 09:23:23 crc kubenswrapper[4732]: I0131 09:23:23.603003 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="84cedd57-5030-425a-8567-ceeda6aa0109" containerName="swift-ring-rebalance" Jan 31 09:23:23 crc kubenswrapper[4732]: I0131 09:23:23.603113 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba8f0576-6adb-407c-b8e0-e4b04f0d47e3" containerName="container-server" Jan 31 09:23:23 crc kubenswrapper[4732]: I0131 09:23:23.603127 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="84cedd57-5030-425a-8567-ceeda6aa0109" containerName="swift-ring-rebalance" Jan 31 09:23:23 crc kubenswrapper[4732]: I0131 09:23:23.603134 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba8f0576-6adb-407c-b8e0-e4b04f0d47e3" containerName="account-replicator" Jan 31 09:23:23 crc kubenswrapper[4732]: I0131 09:23:23.603141 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba8f0576-6adb-407c-b8e0-e4b04f0d47e3" containerName="account-server" Jan 31 09:23:23 crc kubenswrapper[4732]: I0131 09:23:23.603153 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="bdedfde8-2a77-4328-8d12-1ed7e7c383d7" containerName="proxy-httpd" Jan 31 09:23:23 crc kubenswrapper[4732]: I0131 09:23:23.603161 4732 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="bdedfde8-2a77-4328-8d12-1ed7e7c383d7" containerName="proxy-server" Jan 31 09:23:23 crc kubenswrapper[4732]: I0131 09:23:23.603171 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba8f0576-6adb-407c-b8e0-e4b04f0d47e3" containerName="account-auditor" Jan 31 09:23:23 crc kubenswrapper[4732]: I0131 09:23:23.603178 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba8f0576-6adb-407c-b8e0-e4b04f0d47e3" containerName="object-server" Jan 31 09:23:23 crc kubenswrapper[4732]: I0131 09:23:23.603186 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba8f0576-6adb-407c-b8e0-e4b04f0d47e3" containerName="container-replicator" Jan 31 09:23:23 crc kubenswrapper[4732]: I0131 09:23:23.603193 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba8f0576-6adb-407c-b8e0-e4b04f0d47e3" containerName="container-auditor" Jan 31 09:23:23 crc kubenswrapper[4732]: I0131 09:23:23.603200 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba8f0576-6adb-407c-b8e0-e4b04f0d47e3" containerName="account-reaper" Jan 31 09:23:23 crc kubenswrapper[4732]: I0131 09:23:23.603207 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba8f0576-6adb-407c-b8e0-e4b04f0d47e3" containerName="object-updater" Jan 31 09:23:23 crc kubenswrapper[4732]: I0131 09:23:23.603215 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba8f0576-6adb-407c-b8e0-e4b04f0d47e3" containerName="object-auditor" Jan 31 09:23:23 crc kubenswrapper[4732]: I0131 09:23:23.603223 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba8f0576-6adb-407c-b8e0-e4b04f0d47e3" containerName="container-updater" Jan 31 09:23:23 crc kubenswrapper[4732]: I0131 09:23:23.603230 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba8f0576-6adb-407c-b8e0-e4b04f0d47e3" containerName="rsync" Jan 31 09:23:23 crc kubenswrapper[4732]: I0131 09:23:23.603238 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba8f0576-6adb-407c-b8e0-e4b04f0d47e3" containerName="object-replicator" Jan 31 09:23:23 crc kubenswrapper[4732]: I0131 09:23:23.603245 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba8f0576-6adb-407c-b8e0-e4b04f0d47e3" containerName="object-expirer" Jan 31 09:23:23 crc kubenswrapper[4732]: I0131 09:23:23.603255 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba8f0576-6adb-407c-b8e0-e4b04f0d47e3" containerName="swift-recon-cron" Jan 31 09:23:23 crc kubenswrapper[4732]: I0131 09:23:23.603714 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/barbicanf734-account-delete-g264z" Jan 31 09:23:23 crc kubenswrapper[4732]: I0131 09:23:23.611220 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/barbicanf734-account-delete-g264z"] Jan 31 09:23:23 crc kubenswrapper[4732]: I0131 09:23:23.622646 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/barbican-worker-597f49d4f-8nh24"] Jan 31 09:23:23 crc kubenswrapper[4732]: I0131 09:23:23.622950 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/barbican-worker-597f49d4f-8nh24" podUID="d3b1cc40-9985-45d8-bb06-0676ff188c6c" containerName="barbican-worker-log" containerID="cri-o://61654a37d88891f02e083badeb8b5cfe93229dadee0e8b7d4de9237b19518962" gracePeriod=30 Jan 31 09:23:23 crc kubenswrapper[4732]: I0131 09:23:23.623087 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/barbican-worker-597f49d4f-8nh24" podUID="d3b1cc40-9985-45d8-bb06-0676ff188c6c" containerName="barbican-worker" containerID="cri-o://9bf2a11eb61aee168266496fbc63cf87fd0863d3de9bd60d727904eb9b9f54bf" gracePeriod=30 Jan 31 09:23:23 crc kubenswrapper[4732]: I0131 09:23:23.665645 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5d550c8-968a-4962-9e23-c0c22911913d-operator-scripts\") pod \"barbicanf734-account-delete-g264z\" (UID: \"e5d550c8-968a-4962-9e23-c0c22911913d\") " pod="swift-kuttl-tests/barbicanf734-account-delete-g264z" Jan 31 09:23:23 crc kubenswrapper[4732]: I0131 09:23:23.666251 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nv56f\" (UniqueName: \"kubernetes.io/projected/e5d550c8-968a-4962-9e23-c0c22911913d-kube-api-access-nv56f\") pod \"barbicanf734-account-delete-g264z\" (UID: \"e5d550c8-968a-4962-9e23-c0c22911913d\") " pod="swift-kuttl-tests/barbicanf734-account-delete-g264z" Jan 31 09:23:23 crc kubenswrapper[4732]: I0131 09:23:23.669481 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/barbican-api-5bb4486f46-rmsgq"] Jan 31 09:23:23 crc kubenswrapper[4732]: I0131 09:23:23.669828 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/barbican-api-5bb4486f46-rmsgq" podUID="4981d9a9-898f-49ff-809d-58c7ca3bd2a3" containerName="barbican-api-log" containerID="cri-o://2d6ada7bb6e0aa8ec000e73dc78802992d92ff017f59a57893472c54d628fcd1" gracePeriod=30 Jan 31 09:23:23 crc kubenswrapper[4732]: I0131 09:23:23.670148 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/barbican-api-5bb4486f46-rmsgq" podUID="4981d9a9-898f-49ff-809d-58c7ca3bd2a3" containerName="barbican-api" containerID="cri-o://341aa4e395d7929014e446a8184c4ab222c0434acdf71dbf1596f5fc5c67f4a8" gracePeriod=30 Jan 31 09:23:23 crc kubenswrapper[4732]: I0131 09:23:23.689204 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/barbican-keystone-listener-5f5b7fdb46-c8wb8"] Jan 31 09:23:23 crc kubenswrapper[4732]: I0131 09:23:23.689481 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/barbican-keystone-listener-5f5b7fdb46-c8wb8" podUID="7728f3b2-7258-444d-982b-10d416bb61f0" containerName="barbican-keystone-listener-log" containerID="cri-o://d4d07af4bd1a6391d336e9eb08f85f38b781a53ac9cbe6e99069d22ae367b8eb" 
gracePeriod=30 Jan 31 09:23:23 crc kubenswrapper[4732]: I0131 09:23:23.690028 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/barbican-keystone-listener-5f5b7fdb46-c8wb8" podUID="7728f3b2-7258-444d-982b-10d416bb61f0" containerName="barbican-keystone-listener" containerID="cri-o://e43f0104f4b5c35da143cfe8b1821aeb80b3326882a44cc53de6938465a118a2" gracePeriod=30 Jan 31 09:23:23 crc kubenswrapper[4732]: I0131 09:23:23.769340 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nv56f\" (UniqueName: \"kubernetes.io/projected/e5d550c8-968a-4962-9e23-c0c22911913d-kube-api-access-nv56f\") pod \"barbicanf734-account-delete-g264z\" (UID: \"e5d550c8-968a-4962-9e23-c0c22911913d\") " pod="swift-kuttl-tests/barbicanf734-account-delete-g264z" Jan 31 09:23:23 crc kubenswrapper[4732]: I0131 09:23:23.769476 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5d550c8-968a-4962-9e23-c0c22911913d-operator-scripts\") pod \"barbicanf734-account-delete-g264z\" (UID: \"e5d550c8-968a-4962-9e23-c0c22911913d\") " pod="swift-kuttl-tests/barbicanf734-account-delete-g264z" Jan 31 09:23:23 crc kubenswrapper[4732]: I0131 09:23:23.771140 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5d550c8-968a-4962-9e23-c0c22911913d-operator-scripts\") pod \"barbicanf734-account-delete-g264z\" (UID: \"e5d550c8-968a-4962-9e23-c0c22911913d\") " pod="swift-kuttl-tests/barbicanf734-account-delete-g264z" Jan 31 09:23:23 crc kubenswrapper[4732]: I0131 09:23:23.809388 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nv56f\" (UniqueName: \"kubernetes.io/projected/e5d550c8-968a-4962-9e23-c0c22911913d-kube-api-access-nv56f\") pod \"barbicanf734-account-delete-g264z\" (UID: \"e5d550c8-968a-4962-9e23-c0c22911913d\") " pod="swift-kuttl-tests/barbicanf734-account-delete-g264z" Jan 31 09:23:23 crc kubenswrapper[4732]: I0131 09:23:23.932903 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/barbicanf734-account-delete-g264z" Jan 31 09:23:24 crc kubenswrapper[4732]: I0131 09:23:24.332473 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/barbicanf734-account-delete-g264z"] Jan 31 09:23:24 crc kubenswrapper[4732]: I0131 09:23:24.550088 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52cae0eb-413d-4365-a717-8039a3e3b99f" path="/var/lib/kubelet/pods/52cae0eb-413d-4365-a717-8039a3e3b99f/volumes" Jan 31 09:23:24 crc kubenswrapper[4732]: I0131 09:23:24.653277 4732 generic.go:334] "Generic (PLEG): container finished" podID="4981d9a9-898f-49ff-809d-58c7ca3bd2a3" containerID="2d6ada7bb6e0aa8ec000e73dc78802992d92ff017f59a57893472c54d628fcd1" exitCode=143 Jan 31 09:23:24 crc kubenswrapper[4732]: I0131 09:23:24.653350 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/barbican-api-5bb4486f46-rmsgq" event={"ID":"4981d9a9-898f-49ff-809d-58c7ca3bd2a3","Type":"ContainerDied","Data":"2d6ada7bb6e0aa8ec000e73dc78802992d92ff017f59a57893472c54d628fcd1"} Jan 31 09:23:24 crc kubenswrapper[4732]: I0131 09:23:24.654527 4732 generic.go:334] "Generic (PLEG): container finished" podID="e5d550c8-968a-4962-9e23-c0c22911913d" containerID="def72532b530c78803f0cccd8c5a2d65a616e430cf1d92043ad3623133222585" exitCode=0 Jan 31 09:23:24 crc kubenswrapper[4732]: I0131 09:23:24.654598 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/barbicanf734-account-delete-g264z" event={"ID":"e5d550c8-968a-4962-9e23-c0c22911913d","Type":"ContainerDied","Data":"def72532b530c78803f0cccd8c5a2d65a616e430cf1d92043ad3623133222585"} Jan 31 09:23:24 crc kubenswrapper[4732]: I0131 09:23:24.654629 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/barbicanf734-account-delete-g264z" event={"ID":"e5d550c8-968a-4962-9e23-c0c22911913d","Type":"ContainerStarted","Data":"146893e4283a7dc87ac374f210d6e9d776b02509f4ed67945e0b2dcbb18bfb2f"} Jan 31 09:23:24 crc kubenswrapper[4732]: I0131 09:23:24.656643 4732 generic.go:334] "Generic (PLEG): container finished" podID="d3b1cc40-9985-45d8-bb06-0676ff188c6c" containerID="61654a37d88891f02e083badeb8b5cfe93229dadee0e8b7d4de9237b19518962" exitCode=143 Jan 31 09:23:24 crc kubenswrapper[4732]: I0131 09:23:24.656690 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/barbican-worker-597f49d4f-8nh24" event={"ID":"d3b1cc40-9985-45d8-bb06-0676ff188c6c","Type":"ContainerDied","Data":"61654a37d88891f02e083badeb8b5cfe93229dadee0e8b7d4de9237b19518962"} Jan 31 09:23:24 crc kubenswrapper[4732]: I0131 09:23:24.658098 4732 generic.go:334] "Generic (PLEG): container finished" podID="7728f3b2-7258-444d-982b-10d416bb61f0" containerID="d4d07af4bd1a6391d336e9eb08f85f38b781a53ac9cbe6e99069d22ae367b8eb" exitCode=143 Jan 31 09:23:24 crc kubenswrapper[4732]: I0131 09:23:24.658123 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/barbican-keystone-listener-5f5b7fdb46-c8wb8" event={"ID":"7728f3b2-7258-444d-982b-10d416bb61f0","Type":"ContainerDied","Data":"d4d07af4bd1a6391d336e9eb08f85f38b781a53ac9cbe6e99069d22ae367b8eb"} Jan 31 09:23:25 crc kubenswrapper[4732]: I0131 09:23:25.015339 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/keystone-bootstrap-r757b"] Jan 31 09:23:25 crc kubenswrapper[4732]: I0131 09:23:25.022169 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/keystone-db-sync-vj5z4"] Jan 31 09:23:25 crc kubenswrapper[4732]: I0131 
09:23:25.032063 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/keystone-db-sync-vj5z4"] Jan 31 09:23:25 crc kubenswrapper[4732]: I0131 09:23:25.050439 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/keystone-bootstrap-r757b"] Jan 31 09:23:25 crc kubenswrapper[4732]: I0131 09:23:25.063335 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/keystone-7959cb4f8b-4vxsg"] Jan 31 09:23:25 crc kubenswrapper[4732]: I0131 09:23:25.063530 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/keystone-7959cb4f8b-4vxsg" podUID="1ee84530-efd7-4d83-9aa2-fb9b8b178496" containerName="keystone-api" containerID="cri-o://2242e86748f2123e7c2371fd3207440c5237f2bf59a11e74432d9fe62c372932" gracePeriod=30 Jan 31 09:23:25 crc kubenswrapper[4732]: I0131 09:23:25.080894 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/keystone5b86-account-delete-qkt4p"] Jan 31 09:23:25 crc kubenswrapper[4732]: I0131 09:23:25.081693 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/keystone5b86-account-delete-qkt4p" Jan 31 09:23:25 crc kubenswrapper[4732]: I0131 09:23:25.087061 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssz6z\" (UniqueName: \"kubernetes.io/projected/1654407d-7276-4839-839d-1244759c4ad2-kube-api-access-ssz6z\") pod \"keystone5b86-account-delete-qkt4p\" (UID: \"1654407d-7276-4839-839d-1244759c4ad2\") " pod="swift-kuttl-tests/keystone5b86-account-delete-qkt4p" Jan 31 09:23:25 crc kubenswrapper[4732]: I0131 09:23:25.087108 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1654407d-7276-4839-839d-1244759c4ad2-operator-scripts\") pod \"keystone5b86-account-delete-qkt4p\" (UID: \"1654407d-7276-4839-839d-1244759c4ad2\") " pod="swift-kuttl-tests/keystone5b86-account-delete-qkt4p" Jan 31 09:23:25 crc kubenswrapper[4732]: I0131 09:23:25.093141 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/keystone5b86-account-delete-qkt4p"] Jan 31 09:23:25 crc kubenswrapper[4732]: I0131 09:23:25.187995 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ssz6z\" (UniqueName: \"kubernetes.io/projected/1654407d-7276-4839-839d-1244759c4ad2-kube-api-access-ssz6z\") pod \"keystone5b86-account-delete-qkt4p\" (UID: \"1654407d-7276-4839-839d-1244759c4ad2\") " pod="swift-kuttl-tests/keystone5b86-account-delete-qkt4p" Jan 31 09:23:25 crc kubenswrapper[4732]: I0131 09:23:25.188045 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1654407d-7276-4839-839d-1244759c4ad2-operator-scripts\") pod \"keystone5b86-account-delete-qkt4p\" (UID: \"1654407d-7276-4839-839d-1244759c4ad2\") " pod="swift-kuttl-tests/keystone5b86-account-delete-qkt4p" Jan 31 09:23:25 crc kubenswrapper[4732]: I0131 09:23:25.188805 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1654407d-7276-4839-839d-1244759c4ad2-operator-scripts\") pod \"keystone5b86-account-delete-qkt4p\" (UID: \"1654407d-7276-4839-839d-1244759c4ad2\") " pod="swift-kuttl-tests/keystone5b86-account-delete-qkt4p" Jan 31 09:23:25 crc kubenswrapper[4732]: I0131 09:23:25.204945 4732 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/barbican-keystone-listener-5f5b7fdb46-c8wb8" Jan 31 09:23:25 crc kubenswrapper[4732]: I0131 09:23:25.209483 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssz6z\" (UniqueName: \"kubernetes.io/projected/1654407d-7276-4839-839d-1244759c4ad2-kube-api-access-ssz6z\") pod \"keystone5b86-account-delete-qkt4p\" (UID: \"1654407d-7276-4839-839d-1244759c4ad2\") " pod="swift-kuttl-tests/keystone5b86-account-delete-qkt4p" Jan 31 09:23:25 crc kubenswrapper[4732]: I0131 09:23:25.389375 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7728f3b2-7258-444d-982b-10d416bb61f0-logs\") pod \"7728f3b2-7258-444d-982b-10d416bb61f0\" (UID: \"7728f3b2-7258-444d-982b-10d416bb61f0\") " Jan 31 09:23:25 crc kubenswrapper[4732]: I0131 09:23:25.389480 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7728f3b2-7258-444d-982b-10d416bb61f0-config-data-custom\") pod \"7728f3b2-7258-444d-982b-10d416bb61f0\" (UID: \"7728f3b2-7258-444d-982b-10d416bb61f0\") " Jan 31 09:23:25 crc kubenswrapper[4732]: I0131 09:23:25.389553 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6bkhk\" (UniqueName: \"kubernetes.io/projected/7728f3b2-7258-444d-982b-10d416bb61f0-kube-api-access-6bkhk\") pod \"7728f3b2-7258-444d-982b-10d416bb61f0\" (UID: \"7728f3b2-7258-444d-982b-10d416bb61f0\") " Jan 31 09:23:25 crc kubenswrapper[4732]: I0131 09:23:25.389577 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7728f3b2-7258-444d-982b-10d416bb61f0-config-data\") pod \"7728f3b2-7258-444d-982b-10d416bb61f0\" (UID: \"7728f3b2-7258-444d-982b-10d416bb61f0\") " Jan 31 09:23:25 crc kubenswrapper[4732]: I0131 09:23:25.390015 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7728f3b2-7258-444d-982b-10d416bb61f0-logs" (OuterVolumeSpecName: "logs") pod "7728f3b2-7258-444d-982b-10d416bb61f0" (UID: "7728f3b2-7258-444d-982b-10d416bb61f0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:23:25 crc kubenswrapper[4732]: I0131 09:23:25.392839 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7728f3b2-7258-444d-982b-10d416bb61f0-kube-api-access-6bkhk" (OuterVolumeSpecName: "kube-api-access-6bkhk") pod "7728f3b2-7258-444d-982b-10d416bb61f0" (UID: "7728f3b2-7258-444d-982b-10d416bb61f0"). InnerVolumeSpecName "kube-api-access-6bkhk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:23:25 crc kubenswrapper[4732]: I0131 09:23:25.393261 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7728f3b2-7258-444d-982b-10d416bb61f0-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7728f3b2-7258-444d-982b-10d416bb61f0" (UID: "7728f3b2-7258-444d-982b-10d416bb61f0"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:23:25 crc kubenswrapper[4732]: I0131 09:23:25.405865 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/keystone5b86-account-delete-qkt4p" Jan 31 09:23:25 crc kubenswrapper[4732]: I0131 09:23:25.442429 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7728f3b2-7258-444d-982b-10d416bb61f0-config-data" (OuterVolumeSpecName: "config-data") pod "7728f3b2-7258-444d-982b-10d416bb61f0" (UID: "7728f3b2-7258-444d-982b-10d416bb61f0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:23:25 crc kubenswrapper[4732]: I0131 09:23:25.490746 4732 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7728f3b2-7258-444d-982b-10d416bb61f0-logs\") on node \"crc\" DevicePath \"\"" Jan 31 09:23:25 crc kubenswrapper[4732]: I0131 09:23:25.490769 4732 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7728f3b2-7258-444d-982b-10d416bb61f0-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 31 09:23:25 crc kubenswrapper[4732]: I0131 09:23:25.490778 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6bkhk\" (UniqueName: \"kubernetes.io/projected/7728f3b2-7258-444d-982b-10d416bb61f0-kube-api-access-6bkhk\") on node \"crc\" DevicePath \"\"" Jan 31 09:23:25 crc kubenswrapper[4732]: I0131 09:23:25.490786 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7728f3b2-7258-444d-982b-10d416bb61f0-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 09:23:25 crc kubenswrapper[4732]: I0131 09:23:25.657876 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/keystone5b86-account-delete-qkt4p"] Jan 31 09:23:25 crc kubenswrapper[4732]: W0131 09:23:25.662186 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1654407d_7276_4839_839d_1244759c4ad2.slice/crio-7d6e2907463735b69f40b7370fe0069cdb743cf23d1f9aeb0278bee0ffa6f8b0 WatchSource:0}: Error finding container 7d6e2907463735b69f40b7370fe0069cdb743cf23d1f9aeb0278bee0ffa6f8b0: Status 404 returned error can't find the container with id 7d6e2907463735b69f40b7370fe0069cdb743cf23d1f9aeb0278bee0ffa6f8b0 Jan 31 09:23:25 crc kubenswrapper[4732]: I0131 09:23:25.666213 4732 generic.go:334] "Generic (PLEG): container finished" podID="7728f3b2-7258-444d-982b-10d416bb61f0" containerID="e43f0104f4b5c35da143cfe8b1821aeb80b3326882a44cc53de6938465a118a2" exitCode=0 Jan 31 09:23:25 crc kubenswrapper[4732]: I0131 09:23:25.666289 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/barbican-keystone-listener-5f5b7fdb46-c8wb8" event={"ID":"7728f3b2-7258-444d-982b-10d416bb61f0","Type":"ContainerDied","Data":"e43f0104f4b5c35da143cfe8b1821aeb80b3326882a44cc53de6938465a118a2"} Jan 31 09:23:25 crc kubenswrapper[4732]: I0131 09:23:25.666328 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/barbican-keystone-listener-5f5b7fdb46-c8wb8" event={"ID":"7728f3b2-7258-444d-982b-10d416bb61f0","Type":"ContainerDied","Data":"5117dba79725a077cd6ab2eb2327b8a7f198da1d9f337e252448199e446feb3a"} Jan 31 09:23:25 crc kubenswrapper[4732]: I0131 09:23:25.666348 4732 scope.go:117] "RemoveContainer" containerID="e43f0104f4b5c35da143cfe8b1821aeb80b3326882a44cc53de6938465a118a2" Jan 31 09:23:25 crc kubenswrapper[4732]: I0131 09:23:25.666424 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/barbican-keystone-listener-5f5b7fdb46-c8wb8" Jan 31 09:23:25 crc kubenswrapper[4732]: I0131 09:23:25.691027 4732 scope.go:117] "RemoveContainer" containerID="d4d07af4bd1a6391d336e9eb08f85f38b781a53ac9cbe6e99069d22ae367b8eb" Jan 31 09:23:25 crc kubenswrapper[4732]: I0131 09:23:25.705367 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/barbican-keystone-listener-5f5b7fdb46-c8wb8"] Jan 31 09:23:25 crc kubenswrapper[4732]: I0131 09:23:25.712723 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/barbican-keystone-listener-5f5b7fdb46-c8wb8"] Jan 31 09:23:25 crc kubenswrapper[4732]: I0131 09:23:25.726306 4732 scope.go:117] "RemoveContainer" containerID="e43f0104f4b5c35da143cfe8b1821aeb80b3326882a44cc53de6938465a118a2" Jan 31 09:23:25 crc kubenswrapper[4732]: E0131 09:23:25.726830 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e43f0104f4b5c35da143cfe8b1821aeb80b3326882a44cc53de6938465a118a2\": container with ID starting with e43f0104f4b5c35da143cfe8b1821aeb80b3326882a44cc53de6938465a118a2 not found: ID does not exist" containerID="e43f0104f4b5c35da143cfe8b1821aeb80b3326882a44cc53de6938465a118a2" Jan 31 09:23:25 crc kubenswrapper[4732]: I0131 09:23:25.726877 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e43f0104f4b5c35da143cfe8b1821aeb80b3326882a44cc53de6938465a118a2"} err="failed to get container status \"e43f0104f4b5c35da143cfe8b1821aeb80b3326882a44cc53de6938465a118a2\": rpc error: code = NotFound desc = could not find container \"e43f0104f4b5c35da143cfe8b1821aeb80b3326882a44cc53de6938465a118a2\": container with ID starting with e43f0104f4b5c35da143cfe8b1821aeb80b3326882a44cc53de6938465a118a2 not found: ID does not exist" Jan 31 09:23:25 crc kubenswrapper[4732]: I0131 09:23:25.726904 4732 scope.go:117] "RemoveContainer" containerID="d4d07af4bd1a6391d336e9eb08f85f38b781a53ac9cbe6e99069d22ae367b8eb" Jan 31 09:23:25 crc kubenswrapper[4732]: E0131 09:23:25.730172 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4d07af4bd1a6391d336e9eb08f85f38b781a53ac9cbe6e99069d22ae367b8eb\": container with ID starting with d4d07af4bd1a6391d336e9eb08f85f38b781a53ac9cbe6e99069d22ae367b8eb not found: ID does not exist" containerID="d4d07af4bd1a6391d336e9eb08f85f38b781a53ac9cbe6e99069d22ae367b8eb" Jan 31 09:23:25 crc kubenswrapper[4732]: I0131 09:23:25.730234 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4d07af4bd1a6391d336e9eb08f85f38b781a53ac9cbe6e99069d22ae367b8eb"} err="failed to get container status \"d4d07af4bd1a6391d336e9eb08f85f38b781a53ac9cbe6e99069d22ae367b8eb\": rpc error: code = NotFound desc = could not find container \"d4d07af4bd1a6391d336e9eb08f85f38b781a53ac9cbe6e99069d22ae367b8eb\": container with ID starting with d4d07af4bd1a6391d336e9eb08f85f38b781a53ac9cbe6e99069d22ae367b8eb not found: ID does not exist" Jan 31 09:23:25 crc kubenswrapper[4732]: I0131 09:23:25.810414 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/root-account-create-update-pjmfd"] Jan 31 09:23:25 crc kubenswrapper[4732]: I0131 09:23:25.821407 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/root-account-create-update-pjmfd"] Jan 31 09:23:25 crc kubenswrapper[4732]: I0131 09:23:25.860815 4732 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/root-account-create-update-x22v8"] Jan 31 09:23:25 crc kubenswrapper[4732]: E0131 09:23:25.861199 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7728f3b2-7258-444d-982b-10d416bb61f0" containerName="barbican-keystone-listener" Jan 31 09:23:25 crc kubenswrapper[4732]: I0131 09:23:25.861214 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="7728f3b2-7258-444d-982b-10d416bb61f0" containerName="barbican-keystone-listener" Jan 31 09:23:25 crc kubenswrapper[4732]: E0131 09:23:25.861230 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7728f3b2-7258-444d-982b-10d416bb61f0" containerName="barbican-keystone-listener-log" Jan 31 09:23:25 crc kubenswrapper[4732]: I0131 09:23:25.861238 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="7728f3b2-7258-444d-982b-10d416bb61f0" containerName="barbican-keystone-listener-log" Jan 31 09:23:25 crc kubenswrapper[4732]: I0131 09:23:25.861410 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="7728f3b2-7258-444d-982b-10d416bb61f0" containerName="barbican-keystone-listener-log" Jan 31 09:23:25 crc kubenswrapper[4732]: I0131 09:23:25.861424 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="7728f3b2-7258-444d-982b-10d416bb61f0" containerName="barbican-keystone-listener" Jan 31 09:23:25 crc kubenswrapper[4732]: I0131 09:23:25.862009 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/root-account-create-update-x22v8" Jan 31 09:23:25 crc kubenswrapper[4732]: I0131 09:23:25.864012 4732 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"openstack-mariadb-root-db-secret" Jan 31 09:23:25 crc kubenswrapper[4732]: I0131 09:23:25.869732 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/openstack-galera-2"] Jan 31 09:23:25 crc kubenswrapper[4732]: I0131 09:23:25.878143 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/openstack-galera-1"] Jan 31 09:23:25 crc kubenswrapper[4732]: I0131 09:23:25.896259 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/root-account-create-update-x22v8"] Jan 31 09:23:25 crc kubenswrapper[4732]: I0131 09:23:25.901591 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/openstack-galera-0"] Jan 31 09:23:25 crc kubenswrapper[4732]: I0131 09:23:25.933486 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/root-account-create-update-x22v8"] Jan 31 09:23:25 crc kubenswrapper[4732]: E0131 09:23:25.933968 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-4wpw4 operator-scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="swift-kuttl-tests/root-account-create-update-x22v8" podUID="17f70688-a1f8-4465-821b-48f5381ff96c" Jan 31 09:23:25 crc kubenswrapper[4732]: I0131 09:23:25.979855 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/barbicanf734-account-delete-g264z" Jan 31 09:23:25 crc kubenswrapper[4732]: I0131 09:23:25.997364 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wpw4\" (UniqueName: \"kubernetes.io/projected/17f70688-a1f8-4465-821b-48f5381ff96c-kube-api-access-4wpw4\") pod \"root-account-create-update-x22v8\" (UID: \"17f70688-a1f8-4465-821b-48f5381ff96c\") " pod="swift-kuttl-tests/root-account-create-update-x22v8" Jan 31 09:23:25 crc kubenswrapper[4732]: I0131 09:23:25.997423 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/17f70688-a1f8-4465-821b-48f5381ff96c-operator-scripts\") pod \"root-account-create-update-x22v8\" (UID: \"17f70688-a1f8-4465-821b-48f5381ff96c\") " pod="swift-kuttl-tests/root-account-create-update-x22v8" Jan 31 09:23:26 crc kubenswrapper[4732]: I0131 09:23:26.050362 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/openstack-galera-2" podUID="f7eb0179-b292-4a09-a07d-3d9bfe7978f3" containerName="galera" containerID="cri-o://bf4dd15bf1d89a707a1cc67b69c360511a62db3a9d76bcd538b729b470197a50" gracePeriod=30 Jan 31 09:23:26 crc kubenswrapper[4732]: I0131 09:23:26.098502 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nv56f\" (UniqueName: \"kubernetes.io/projected/e5d550c8-968a-4962-9e23-c0c22911913d-kube-api-access-nv56f\") pod \"e5d550c8-968a-4962-9e23-c0c22911913d\" (UID: \"e5d550c8-968a-4962-9e23-c0c22911913d\") " Jan 31 09:23:26 crc kubenswrapper[4732]: I0131 09:23:26.098692 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5d550c8-968a-4962-9e23-c0c22911913d-operator-scripts\") pod \"e5d550c8-968a-4962-9e23-c0c22911913d\" (UID: \"e5d550c8-968a-4962-9e23-c0c22911913d\") " Jan 31 09:23:26 crc kubenswrapper[4732]: I0131 09:23:26.099024 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wpw4\" (UniqueName: \"kubernetes.io/projected/17f70688-a1f8-4465-821b-48f5381ff96c-kube-api-access-4wpw4\") pod \"root-account-create-update-x22v8\" (UID: \"17f70688-a1f8-4465-821b-48f5381ff96c\") " pod="swift-kuttl-tests/root-account-create-update-x22v8" Jan 31 09:23:26 crc kubenswrapper[4732]: I0131 09:23:26.099065 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/17f70688-a1f8-4465-821b-48f5381ff96c-operator-scripts\") pod \"root-account-create-update-x22v8\" (UID: \"17f70688-a1f8-4465-821b-48f5381ff96c\") " pod="swift-kuttl-tests/root-account-create-update-x22v8" Jan 31 09:23:26 crc kubenswrapper[4732]: E0131 09:23:26.099188 4732 configmap.go:193] Couldn't get configMap swift-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 31 09:23:26 crc kubenswrapper[4732]: E0131 09:23:26.099258 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/17f70688-a1f8-4465-821b-48f5381ff96c-operator-scripts podName:17f70688-a1f8-4465-821b-48f5381ff96c nodeName:}" failed. No retries permitted until 2026-01-31 09:23:26.599240079 +0000 UTC m=+1344.905116283 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/17f70688-a1f8-4465-821b-48f5381ff96c-operator-scripts") pod "root-account-create-update-x22v8" (UID: "17f70688-a1f8-4465-821b-48f5381ff96c") : configmap "openstack-scripts" not found Jan 31 09:23:26 crc kubenswrapper[4732]: I0131 09:23:26.099281 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5d550c8-968a-4962-9e23-c0c22911913d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e5d550c8-968a-4962-9e23-c0c22911913d" (UID: "e5d550c8-968a-4962-9e23-c0c22911913d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:23:26 crc kubenswrapper[4732]: E0131 09:23:26.105046 4732 projected.go:194] Error preparing data for projected volume kube-api-access-4wpw4 for pod swift-kuttl-tests/root-account-create-update-x22v8: failed to fetch token: serviceaccounts "galera-openstack" not found Jan 31 09:23:26 crc kubenswrapper[4732]: E0131 09:23:26.105129 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/17f70688-a1f8-4465-821b-48f5381ff96c-kube-api-access-4wpw4 podName:17f70688-a1f8-4465-821b-48f5381ff96c nodeName:}" failed. No retries permitted until 2026-01-31 09:23:26.605106761 +0000 UTC m=+1344.910983025 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-4wpw4" (UniqueName: "kubernetes.io/projected/17f70688-a1f8-4465-821b-48f5381ff96c-kube-api-access-4wpw4") pod "root-account-create-update-x22v8" (UID: "17f70688-a1f8-4465-821b-48f5381ff96c") : failed to fetch token: serviceaccounts "galera-openstack" not found Jan 31 09:23:26 crc kubenswrapper[4732]: I0131 09:23:26.110906 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5d550c8-968a-4962-9e23-c0c22911913d-kube-api-access-nv56f" (OuterVolumeSpecName: "kube-api-access-nv56f") pod "e5d550c8-968a-4962-9e23-c0c22911913d" (UID: "e5d550c8-968a-4962-9e23-c0c22911913d"). InnerVolumeSpecName "kube-api-access-nv56f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:23:26 crc kubenswrapper[4732]: I0131 09:23:26.200819 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nv56f\" (UniqueName: \"kubernetes.io/projected/e5d550c8-968a-4962-9e23-c0c22911913d-kube-api-access-nv56f\") on node \"crc\" DevicePath \"\"" Jan 31 09:23:26 crc kubenswrapper[4732]: I0131 09:23:26.200850 4732 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5d550c8-968a-4962-9e23-c0c22911913d-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:23:26 crc kubenswrapper[4732]: I0131 09:23:26.410523 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/memcached-0"] Jan 31 09:23:26 crc kubenswrapper[4732]: I0131 09:23:26.410776 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/memcached-0" podUID="c0d4fa62-a33c-4ab2-a446-697994c1541e" containerName="memcached" containerID="cri-o://3bead2528c2ca5acd2f850812b4f5c0dc70486b1bf295a4a4a607b3a99c2b59e" gracePeriod=30 Jan 31 09:23:26 crc kubenswrapper[4732]: I0131 09:23:26.549577 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1568a5da-d308-4b7e-94b6-99c846371cb8" path="/var/lib/kubelet/pods/1568a5da-d308-4b7e-94b6-99c846371cb8/volumes" Jan 31 09:23:26 crc kubenswrapper[4732]: I0131 09:23:26.550496 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51d8a630-8f89-44aa-9f24-2f1b279cccfd" path="/var/lib/kubelet/pods/51d8a630-8f89-44aa-9f24-2f1b279cccfd/volumes" Jan 31 09:23:26 crc kubenswrapper[4732]: I0131 09:23:26.551005 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65a47d3d-88f0-48b4-b672-9b224ead785f" path="/var/lib/kubelet/pods/65a47d3d-88f0-48b4-b672-9b224ead785f/volumes" Jan 31 09:23:26 crc kubenswrapper[4732]: I0131 09:23:26.551981 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7728f3b2-7258-444d-982b-10d416bb61f0" path="/var/lib/kubelet/pods/7728f3b2-7258-444d-982b-10d416bb61f0/volumes" Jan 31 09:23:26 crc kubenswrapper[4732]: I0131 09:23:26.605984 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wpw4\" (UniqueName: \"kubernetes.io/projected/17f70688-a1f8-4465-821b-48f5381ff96c-kube-api-access-4wpw4\") pod \"root-account-create-update-x22v8\" (UID: \"17f70688-a1f8-4465-821b-48f5381ff96c\") " pod="swift-kuttl-tests/root-account-create-update-x22v8" Jan 31 09:23:26 crc kubenswrapper[4732]: I0131 09:23:26.606044 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/17f70688-a1f8-4465-821b-48f5381ff96c-operator-scripts\") pod \"root-account-create-update-x22v8\" (UID: \"17f70688-a1f8-4465-821b-48f5381ff96c\") " pod="swift-kuttl-tests/root-account-create-update-x22v8" Jan 31 09:23:26 crc kubenswrapper[4732]: E0131 09:23:26.606181 4732 configmap.go:193] Couldn't get configMap swift-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 31 09:23:26 crc kubenswrapper[4732]: E0131 09:23:26.606236 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/17f70688-a1f8-4465-821b-48f5381ff96c-operator-scripts podName:17f70688-a1f8-4465-821b-48f5381ff96c nodeName:}" failed. No retries permitted until 2026-01-31 09:23:27.606217918 +0000 UTC m=+1345.912094122 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/17f70688-a1f8-4465-821b-48f5381ff96c-operator-scripts") pod "root-account-create-update-x22v8" (UID: "17f70688-a1f8-4465-821b-48f5381ff96c") : configmap "openstack-scripts" not found Jan 31 09:23:26 crc kubenswrapper[4732]: E0131 09:23:26.609082 4732 projected.go:194] Error preparing data for projected volume kube-api-access-4wpw4 for pod swift-kuttl-tests/root-account-create-update-x22v8: failed to fetch token: serviceaccounts "galera-openstack" not found Jan 31 09:23:26 crc kubenswrapper[4732]: E0131 09:23:26.609135 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/17f70688-a1f8-4465-821b-48f5381ff96c-kube-api-access-4wpw4 podName:17f70688-a1f8-4465-821b-48f5381ff96c nodeName:}" failed. No retries permitted until 2026-01-31 09:23:27.609122918 +0000 UTC m=+1345.914999122 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-4wpw4" (UniqueName: "kubernetes.io/projected/17f70688-a1f8-4465-821b-48f5381ff96c-kube-api-access-4wpw4") pod "root-account-create-update-x22v8" (UID: "17f70688-a1f8-4465-821b-48f5381ff96c") : failed to fetch token: serviceaccounts "galera-openstack" not found Jan 31 09:23:26 crc kubenswrapper[4732]: I0131 09:23:26.680153 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/keystone5b86-account-delete-qkt4p" event={"ID":"1654407d-7276-4839-839d-1244759c4ad2","Type":"ContainerStarted","Data":"79fac5d2960d6f9193efd13767f68129541f3654fafcc49472d1b08351983069"} Jan 31 09:23:26 crc kubenswrapper[4732]: I0131 09:23:26.680197 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/keystone5b86-account-delete-qkt4p" event={"ID":"1654407d-7276-4839-839d-1244759c4ad2","Type":"ContainerStarted","Data":"7d6e2907463735b69f40b7370fe0069cdb743cf23d1f9aeb0278bee0ffa6f8b0"} Jan 31 09:23:26 crc kubenswrapper[4732]: I0131 09:23:26.680535 4732 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="swift-kuttl-tests/keystone5b86-account-delete-qkt4p" secret="" err="secret \"galera-openstack-dockercfg-4btb4\" not found" Jan 31 09:23:26 crc kubenswrapper[4732]: I0131 09:23:26.686007 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/barbicanf734-account-delete-g264z" event={"ID":"e5d550c8-968a-4962-9e23-c0c22911913d","Type":"ContainerDied","Data":"146893e4283a7dc87ac374f210d6e9d776b02509f4ed67945e0b2dcbb18bfb2f"} Jan 31 09:23:26 crc kubenswrapper[4732]: I0131 09:23:26.686050 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="146893e4283a7dc87ac374f210d6e9d776b02509f4ed67945e0b2dcbb18bfb2f" Jan 31 09:23:26 crc kubenswrapper[4732]: I0131 09:23:26.686116 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/barbicanf734-account-delete-g264z" Jan 31 09:23:26 crc kubenswrapper[4732]: I0131 09:23:26.688120 4732 generic.go:334] "Generic (PLEG): container finished" podID="f7eb0179-b292-4a09-a07d-3d9bfe7978f3" containerID="bf4dd15bf1d89a707a1cc67b69c360511a62db3a9d76bcd538b729b470197a50" exitCode=0 Jan 31 09:23:26 crc kubenswrapper[4732]: I0131 09:23:26.688194 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/openstack-galera-2" event={"ID":"f7eb0179-b292-4a09-a07d-3d9bfe7978f3","Type":"ContainerDied","Data":"bf4dd15bf1d89a707a1cc67b69c360511a62db3a9d76bcd538b729b470197a50"} Jan 31 09:23:26 crc kubenswrapper[4732]: I0131 09:23:26.689221 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/root-account-create-update-x22v8" Jan 31 09:23:26 crc kubenswrapper[4732]: I0131 09:23:26.699263 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/keystone5b86-account-delete-qkt4p" podStartSLOduration=1.6992430600000001 podStartE2EDuration="1.69924306s" podCreationTimestamp="2026-01-31 09:23:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:23:26.694269356 +0000 UTC m=+1345.000145560" watchObservedRunningTime="2026-01-31 09:23:26.69924306 +0000 UTC m=+1345.005119274" Jan 31 09:23:26 crc kubenswrapper[4732]: I0131 09:23:26.699686 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/root-account-create-update-x22v8" Jan 31 09:23:26 crc kubenswrapper[4732]: I0131 09:23:26.785093 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/rabbitmq-server-0"] Jan 31 09:23:26 crc kubenswrapper[4732]: E0131 09:23:26.810155 4732 configmap.go:193] Couldn't get configMap swift-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 31 09:23:26 crc kubenswrapper[4732]: E0131 09:23:26.810225 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1654407d-7276-4839-839d-1244759c4ad2-operator-scripts podName:1654407d-7276-4839-839d-1244759c4ad2 nodeName:}" failed. No retries permitted until 2026-01-31 09:23:27.310209338 +0000 UTC m=+1345.616085542 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/1654407d-7276-4839-839d-1244759c4ad2-operator-scripts") pod "keystone5b86-account-delete-qkt4p" (UID: "1654407d-7276-4839-839d-1244759c4ad2") : configmap "openstack-scripts" not found Jan 31 09:23:26 crc kubenswrapper[4732]: I0131 09:23:26.936577 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/openstack-galera-2" Jan 31 09:23:27 crc kubenswrapper[4732]: I0131 09:23:27.011428 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"f7eb0179-b292-4a09-a07d-3d9bfe7978f3\" (UID: \"f7eb0179-b292-4a09-a07d-3d9bfe7978f3\") " Jan 31 09:23:27 crc kubenswrapper[4732]: I0131 09:23:27.011868 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7eb0179-b292-4a09-a07d-3d9bfe7978f3-operator-scripts\") pod \"f7eb0179-b292-4a09-a07d-3d9bfe7978f3\" (UID: \"f7eb0179-b292-4a09-a07d-3d9bfe7978f3\") " Jan 31 09:23:27 crc kubenswrapper[4732]: I0131 09:23:27.011995 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gjncs\" (UniqueName: \"kubernetes.io/projected/f7eb0179-b292-4a09-a07d-3d9bfe7978f3-kube-api-access-gjncs\") pod \"f7eb0179-b292-4a09-a07d-3d9bfe7978f3\" (UID: \"f7eb0179-b292-4a09-a07d-3d9bfe7978f3\") " Jan 31 09:23:27 crc kubenswrapper[4732]: I0131 09:23:27.012046 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f7eb0179-b292-4a09-a07d-3d9bfe7978f3-config-data-generated\") pod \"f7eb0179-b292-4a09-a07d-3d9bfe7978f3\" (UID: \"f7eb0179-b292-4a09-a07d-3d9bfe7978f3\") " Jan 31 09:23:27 crc kubenswrapper[4732]: I0131 09:23:27.012097 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f7eb0179-b292-4a09-a07d-3d9bfe7978f3-kolla-config\") pod \"f7eb0179-b292-4a09-a07d-3d9bfe7978f3\" (UID: \"f7eb0179-b292-4a09-a07d-3d9bfe7978f3\") " Jan 31 09:23:27 crc kubenswrapper[4732]: I0131 09:23:27.012126 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f7eb0179-b292-4a09-a07d-3d9bfe7978f3-config-data-default\") pod \"f7eb0179-b292-4a09-a07d-3d9bfe7978f3\" (UID: \"f7eb0179-b292-4a09-a07d-3d9bfe7978f3\") " Jan 31 09:23:27 crc kubenswrapper[4732]: I0131 09:23:27.012850 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7eb0179-b292-4a09-a07d-3d9bfe7978f3-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "f7eb0179-b292-4a09-a07d-3d9bfe7978f3" (UID: "f7eb0179-b292-4a09-a07d-3d9bfe7978f3"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:23:27 crc kubenswrapper[4732]: I0131 09:23:27.012977 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7eb0179-b292-4a09-a07d-3d9bfe7978f3-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "f7eb0179-b292-4a09-a07d-3d9bfe7978f3" (UID: "f7eb0179-b292-4a09-a07d-3d9bfe7978f3"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:23:27 crc kubenswrapper[4732]: I0131 09:23:27.013388 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7eb0179-b292-4a09-a07d-3d9bfe7978f3-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "f7eb0179-b292-4a09-a07d-3d9bfe7978f3" (UID: "f7eb0179-b292-4a09-a07d-3d9bfe7978f3"). InnerVolumeSpecName "kolla-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:23:27 crc kubenswrapper[4732]: I0131 09:23:27.013791 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7eb0179-b292-4a09-a07d-3d9bfe7978f3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f7eb0179-b292-4a09-a07d-3d9bfe7978f3" (UID: "f7eb0179-b292-4a09-a07d-3d9bfe7978f3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:23:27 crc kubenswrapper[4732]: I0131 09:23:27.018571 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7eb0179-b292-4a09-a07d-3d9bfe7978f3-kube-api-access-gjncs" (OuterVolumeSpecName: "kube-api-access-gjncs") pod "f7eb0179-b292-4a09-a07d-3d9bfe7978f3" (UID: "f7eb0179-b292-4a09-a07d-3d9bfe7978f3"). InnerVolumeSpecName "kube-api-access-gjncs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:23:27 crc kubenswrapper[4732]: I0131 09:23:27.022484 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "mysql-db") pod "f7eb0179-b292-4a09-a07d-3d9bfe7978f3" (UID: "f7eb0179-b292-4a09-a07d-3d9bfe7978f3"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 09:23:27 crc kubenswrapper[4732]: I0131 09:23:27.113893 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gjncs\" (UniqueName: \"kubernetes.io/projected/f7eb0179-b292-4a09-a07d-3d9bfe7978f3-kube-api-access-gjncs\") on node \"crc\" DevicePath \"\"" Jan 31 09:23:27 crc kubenswrapper[4732]: I0131 09:23:27.113920 4732 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f7eb0179-b292-4a09-a07d-3d9bfe7978f3-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 31 09:23:27 crc kubenswrapper[4732]: I0131 09:23:27.113929 4732 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f7eb0179-b292-4a09-a07d-3d9bfe7978f3-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:23:27 crc kubenswrapper[4732]: I0131 09:23:27.113953 4732 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f7eb0179-b292-4a09-a07d-3d9bfe7978f3-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 31 09:23:27 crc kubenswrapper[4732]: I0131 09:23:27.113979 4732 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Jan 31 09:23:27 crc kubenswrapper[4732]: I0131 09:23:27.113991 4732 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7eb0179-b292-4a09-a07d-3d9bfe7978f3-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:23:27 crc kubenswrapper[4732]: I0131 09:23:27.127523 4732 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Jan 31 09:23:27 crc kubenswrapper[4732]: I0131 09:23:27.162169 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/rabbitmq-server-0"] Jan 31 09:23:27 crc kubenswrapper[4732]: I0131 09:23:27.221508 4732 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Jan 31 09:23:27 crc kubenswrapper[4732]: E0131 09:23:27.322455 4732 configmap.go:193] Couldn't get configMap swift-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 31 09:23:27 crc kubenswrapper[4732]: E0131 09:23:27.322520 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1654407d-7276-4839-839d-1244759c4ad2-operator-scripts podName:1654407d-7276-4839-839d-1244759c4ad2 nodeName:}" failed. No retries permitted until 2026-01-31 09:23:28.322504962 +0000 UTC m=+1346.628381166 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/1654407d-7276-4839-839d-1244759c4ad2-operator-scripts") pod "keystone5b86-account-delete-qkt4p" (UID: "1654407d-7276-4839-839d-1244759c4ad2") : configmap "openstack-scripts" not found Jan 31 09:23:27 crc kubenswrapper[4732]: I0131 09:23:27.330176 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/barbican-worker-597f49d4f-8nh24" Jan 31 09:23:27 crc kubenswrapper[4732]: I0131 09:23:27.423892 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d3b1cc40-9985-45d8-bb06-0676ff188c6c-config-data-custom\") pod \"d3b1cc40-9985-45d8-bb06-0676ff188c6c\" (UID: \"d3b1cc40-9985-45d8-bb06-0676ff188c6c\") " Jan 31 09:23:27 crc kubenswrapper[4732]: I0131 09:23:27.423970 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3b1cc40-9985-45d8-bb06-0676ff188c6c-logs\") pod \"d3b1cc40-9985-45d8-bb06-0676ff188c6c\" (UID: \"d3b1cc40-9985-45d8-bb06-0676ff188c6c\") " Jan 31 09:23:27 crc kubenswrapper[4732]: I0131 09:23:27.424002 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5c684\" (UniqueName: \"kubernetes.io/projected/d3b1cc40-9985-45d8-bb06-0676ff188c6c-kube-api-access-5c684\") pod \"d3b1cc40-9985-45d8-bb06-0676ff188c6c\" (UID: \"d3b1cc40-9985-45d8-bb06-0676ff188c6c\") " Jan 31 09:23:27 crc kubenswrapper[4732]: I0131 09:23:27.424097 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3b1cc40-9985-45d8-bb06-0676ff188c6c-config-data\") pod \"d3b1cc40-9985-45d8-bb06-0676ff188c6c\" (UID: \"d3b1cc40-9985-45d8-bb06-0676ff188c6c\") " Jan 31 09:23:27 crc kubenswrapper[4732]: I0131 09:23:27.425439 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3b1cc40-9985-45d8-bb06-0676ff188c6c-logs" (OuterVolumeSpecName: "logs") pod "d3b1cc40-9985-45d8-bb06-0676ff188c6c" (UID: "d3b1cc40-9985-45d8-bb06-0676ff188c6c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:23:27 crc kubenswrapper[4732]: I0131 09:23:27.429655 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3b1cc40-9985-45d8-bb06-0676ff188c6c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "d3b1cc40-9985-45d8-bb06-0676ff188c6c" (UID: "d3b1cc40-9985-45d8-bb06-0676ff188c6c"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:23:27 crc kubenswrapper[4732]: I0131 09:23:27.429887 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3b1cc40-9985-45d8-bb06-0676ff188c6c-kube-api-access-5c684" (OuterVolumeSpecName: "kube-api-access-5c684") pod "d3b1cc40-9985-45d8-bb06-0676ff188c6c" (UID: "d3b1cc40-9985-45d8-bb06-0676ff188c6c"). InnerVolumeSpecName "kube-api-access-5c684". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:23:27 crc kubenswrapper[4732]: I0131 09:23:27.457896 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3b1cc40-9985-45d8-bb06-0676ff188c6c-config-data" (OuterVolumeSpecName: "config-data") pod "d3b1cc40-9985-45d8-bb06-0676ff188c6c" (UID: "d3b1cc40-9985-45d8-bb06-0676ff188c6c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:23:27 crc kubenswrapper[4732]: I0131 09:23:27.464438 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/barbican-api-5bb4486f46-rmsgq" Jan 31 09:23:27 crc kubenswrapper[4732]: I0131 09:23:27.527066 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3b1cc40-9985-45d8-bb06-0676ff188c6c-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 09:23:27 crc kubenswrapper[4732]: I0131 09:23:27.527109 4732 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d3b1cc40-9985-45d8-bb06-0676ff188c6c-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 31 09:23:27 crc kubenswrapper[4732]: I0131 09:23:27.527123 4732 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3b1cc40-9985-45d8-bb06-0676ff188c6c-logs\") on node \"crc\" DevicePath \"\"" Jan 31 09:23:27 crc kubenswrapper[4732]: I0131 09:23:27.527134 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5c684\" (UniqueName: \"kubernetes.io/projected/d3b1cc40-9985-45d8-bb06-0676ff188c6c-kube-api-access-5c684\") on node \"crc\" DevicePath \"\"" Jan 31 09:23:27 crc kubenswrapper[4732]: I0131 09:23:27.628097 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4981d9a9-898f-49ff-809d-58c7ca3bd2a3-config-data\") pod \"4981d9a9-898f-49ff-809d-58c7ca3bd2a3\" (UID: \"4981d9a9-898f-49ff-809d-58c7ca3bd2a3\") " Jan 31 09:23:27 crc kubenswrapper[4732]: I0131 09:23:27.628142 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4981d9a9-898f-49ff-809d-58c7ca3bd2a3-config-data-custom\") pod \"4981d9a9-898f-49ff-809d-58c7ca3bd2a3\" (UID: \"4981d9a9-898f-49ff-809d-58c7ca3bd2a3\") " Jan 31 09:23:27 crc kubenswrapper[4732]: I0131 09:23:27.628172 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5lp6p\" (UniqueName: \"kubernetes.io/projected/4981d9a9-898f-49ff-809d-58c7ca3bd2a3-kube-api-access-5lp6p\") pod \"4981d9a9-898f-49ff-809d-58c7ca3bd2a3\" (UID: \"4981d9a9-898f-49ff-809d-58c7ca3bd2a3\") " Jan 31 09:23:27 crc kubenswrapper[4732]: I0131 09:23:27.628217 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4981d9a9-898f-49ff-809d-58c7ca3bd2a3-logs\") pod 
\"4981d9a9-898f-49ff-809d-58c7ca3bd2a3\" (UID: \"4981d9a9-898f-49ff-809d-58c7ca3bd2a3\") " Jan 31 09:23:27 crc kubenswrapper[4732]: I0131 09:23:27.628438 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wpw4\" (UniqueName: \"kubernetes.io/projected/17f70688-a1f8-4465-821b-48f5381ff96c-kube-api-access-4wpw4\") pod \"root-account-create-update-x22v8\" (UID: \"17f70688-a1f8-4465-821b-48f5381ff96c\") " pod="swift-kuttl-tests/root-account-create-update-x22v8" Jan 31 09:23:27 crc kubenswrapper[4732]: I0131 09:23:27.628473 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/17f70688-a1f8-4465-821b-48f5381ff96c-operator-scripts\") pod \"root-account-create-update-x22v8\" (UID: \"17f70688-a1f8-4465-821b-48f5381ff96c\") " pod="swift-kuttl-tests/root-account-create-update-x22v8" Jan 31 09:23:27 crc kubenswrapper[4732]: E0131 09:23:27.628651 4732 configmap.go:193] Couldn't get configMap swift-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 31 09:23:27 crc kubenswrapper[4732]: E0131 09:23:27.628732 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/17f70688-a1f8-4465-821b-48f5381ff96c-operator-scripts podName:17f70688-a1f8-4465-821b-48f5381ff96c nodeName:}" failed. No retries permitted until 2026-01-31 09:23:29.62871158 +0000 UTC m=+1347.934587784 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/17f70688-a1f8-4465-821b-48f5381ff96c-operator-scripts") pod "root-account-create-update-x22v8" (UID: "17f70688-a1f8-4465-821b-48f5381ff96c") : configmap "openstack-scripts" not found Jan 31 09:23:27 crc kubenswrapper[4732]: I0131 09:23:27.628947 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4981d9a9-898f-49ff-809d-58c7ca3bd2a3-logs" (OuterVolumeSpecName: "logs") pod "4981d9a9-898f-49ff-809d-58c7ca3bd2a3" (UID: "4981d9a9-898f-49ff-809d-58c7ca3bd2a3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:23:27 crc kubenswrapper[4732]: I0131 09:23:27.631214 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4981d9a9-898f-49ff-809d-58c7ca3bd2a3-kube-api-access-5lp6p" (OuterVolumeSpecName: "kube-api-access-5lp6p") pod "4981d9a9-898f-49ff-809d-58c7ca3bd2a3" (UID: "4981d9a9-898f-49ff-809d-58c7ca3bd2a3"). InnerVolumeSpecName "kube-api-access-5lp6p". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:23:27 crc kubenswrapper[4732]: I0131 09:23:27.632113 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4981d9a9-898f-49ff-809d-58c7ca3bd2a3-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "4981d9a9-898f-49ff-809d-58c7ca3bd2a3" (UID: "4981d9a9-898f-49ff-809d-58c7ca3bd2a3"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:23:27 crc kubenswrapper[4732]: E0131 09:23:27.634144 4732 projected.go:194] Error preparing data for projected volume kube-api-access-4wpw4 for pod swift-kuttl-tests/root-account-create-update-x22v8: failed to fetch token: serviceaccounts "galera-openstack" not found Jan 31 09:23:27 crc kubenswrapper[4732]: E0131 09:23:27.634195 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/17f70688-a1f8-4465-821b-48f5381ff96c-kube-api-access-4wpw4 podName:17f70688-a1f8-4465-821b-48f5381ff96c nodeName:}" failed. No retries permitted until 2026-01-31 09:23:29.634177339 +0000 UTC m=+1347.940053543 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-4wpw4" (UniqueName: "kubernetes.io/projected/17f70688-a1f8-4465-821b-48f5381ff96c-kube-api-access-4wpw4") pod "root-account-create-update-x22v8" (UID: "17f70688-a1f8-4465-821b-48f5381ff96c") : failed to fetch token: serviceaccounts "galera-openstack" not found Jan 31 09:23:27 crc kubenswrapper[4732]: I0131 09:23:27.662864 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4981d9a9-898f-49ff-809d-58c7ca3bd2a3-config-data" (OuterVolumeSpecName: "config-data") pod "4981d9a9-898f-49ff-809d-58c7ca3bd2a3" (UID: "4981d9a9-898f-49ff-809d-58c7ca3bd2a3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:23:27 crc kubenswrapper[4732]: I0131 09:23:27.709231 4732 generic.go:334] "Generic (PLEG): container finished" podID="c0d4fa62-a33c-4ab2-a446-697994c1541e" containerID="3bead2528c2ca5acd2f850812b4f5c0dc70486b1bf295a4a4a607b3a99c2b59e" exitCode=0 Jan 31 09:23:27 crc kubenswrapper[4732]: I0131 09:23:27.709303 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/memcached-0" event={"ID":"c0d4fa62-a33c-4ab2-a446-697994c1541e","Type":"ContainerDied","Data":"3bead2528c2ca5acd2f850812b4f5c0dc70486b1bf295a4a4a607b3a99c2b59e"} Jan 31 09:23:27 crc kubenswrapper[4732]: I0131 09:23:27.712974 4732 generic.go:334] "Generic (PLEG): container finished" podID="4981d9a9-898f-49ff-809d-58c7ca3bd2a3" containerID="341aa4e395d7929014e446a8184c4ab222c0434acdf71dbf1596f5fc5c67f4a8" exitCode=0 Jan 31 09:23:27 crc kubenswrapper[4732]: I0131 09:23:27.713011 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/barbican-api-5bb4486f46-rmsgq" event={"ID":"4981d9a9-898f-49ff-809d-58c7ca3bd2a3","Type":"ContainerDied","Data":"341aa4e395d7929014e446a8184c4ab222c0434acdf71dbf1596f5fc5c67f4a8"} Jan 31 09:23:27 crc kubenswrapper[4732]: I0131 09:23:27.713046 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/barbican-api-5bb4486f46-rmsgq" event={"ID":"4981d9a9-898f-49ff-809d-58c7ca3bd2a3","Type":"ContainerDied","Data":"dc8a708608424d3770f138159a52c830a7b371ad68ddd44083bf847d118f3337"} Jan 31 09:23:27 crc kubenswrapper[4732]: I0131 09:23:27.713065 4732 scope.go:117] "RemoveContainer" containerID="341aa4e395d7929014e446a8184c4ab222c0434acdf71dbf1596f5fc5c67f4a8" Jan 31 09:23:27 crc kubenswrapper[4732]: I0131 09:23:27.713086 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/barbican-api-5bb4486f46-rmsgq" Jan 31 09:23:27 crc kubenswrapper[4732]: I0131 09:23:27.729885 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4981d9a9-898f-49ff-809d-58c7ca3bd2a3-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 09:23:27 crc kubenswrapper[4732]: I0131 09:23:27.729916 4732 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4981d9a9-898f-49ff-809d-58c7ca3bd2a3-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 31 09:23:27 crc kubenswrapper[4732]: I0131 09:23:27.729930 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5lp6p\" (UniqueName: \"kubernetes.io/projected/4981d9a9-898f-49ff-809d-58c7ca3bd2a3-kube-api-access-5lp6p\") on node \"crc\" DevicePath \"\"" Jan 31 09:23:27 crc kubenswrapper[4732]: I0131 09:23:27.729942 4732 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4981d9a9-898f-49ff-809d-58c7ca3bd2a3-logs\") on node \"crc\" DevicePath \"\"" Jan 31 09:23:27 crc kubenswrapper[4732]: I0131 09:23:27.739878 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/openstack-galera-2" Jan 31 09:23:27 crc kubenswrapper[4732]: I0131 09:23:27.740383 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/openstack-galera-2" event={"ID":"f7eb0179-b292-4a09-a07d-3d9bfe7978f3","Type":"ContainerDied","Data":"a51f6b605ece49b9fb83731b6d6e01366d3ebfefee0aa6fe01b529f67148475c"} Jan 31 09:23:27 crc kubenswrapper[4732]: I0131 09:23:27.748246 4732 scope.go:117] "RemoveContainer" containerID="2d6ada7bb6e0aa8ec000e73dc78802992d92ff017f59a57893472c54d628fcd1" Jan 31 09:23:27 crc kubenswrapper[4732]: I0131 09:23:27.754451 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/barbican-api-5bb4486f46-rmsgq"] Jan 31 09:23:27 crc kubenswrapper[4732]: I0131 09:23:27.755572 4732 generic.go:334] "Generic (PLEG): container finished" podID="d3b1cc40-9985-45d8-bb06-0676ff188c6c" containerID="9bf2a11eb61aee168266496fbc63cf87fd0863d3de9bd60d727904eb9b9f54bf" exitCode=0 Jan 31 09:23:27 crc kubenswrapper[4732]: I0131 09:23:27.756035 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/barbican-worker-597f49d4f-8nh24" event={"ID":"d3b1cc40-9985-45d8-bb06-0676ff188c6c","Type":"ContainerDied","Data":"9bf2a11eb61aee168266496fbc63cf87fd0863d3de9bd60d727904eb9b9f54bf"} Jan 31 09:23:27 crc kubenswrapper[4732]: I0131 09:23:27.756081 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/barbican-worker-597f49d4f-8nh24" event={"ID":"d3b1cc40-9985-45d8-bb06-0676ff188c6c","Type":"ContainerDied","Data":"349666c65ccee57c71b989263a2d6e9c2979da105d5b7e91fe8eeec62ef91646"} Jan 31 09:23:27 crc kubenswrapper[4732]: I0131 09:23:27.756082 4732 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="swift-kuttl-tests/keystone5b86-account-delete-qkt4p" secret="" err="secret \"galera-openstack-dockercfg-4btb4\" not found" Jan 31 09:23:27 crc kubenswrapper[4732]: I0131 09:23:27.756134 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/barbican-worker-597f49d4f-8nh24" Jan 31 09:23:27 crc kubenswrapper[4732]: I0131 09:23:27.756335 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/root-account-create-update-x22v8"
Jan 31 09:23:27 crc kubenswrapper[4732]: I0131 09:23:27.762372 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/barbican-api-5bb4486f46-rmsgq"]
Jan 31 09:23:27 crc kubenswrapper[4732]: I0131 09:23:27.772984 4732 scope.go:117] "RemoveContainer" containerID="341aa4e395d7929014e446a8184c4ab222c0434acdf71dbf1596f5fc5c67f4a8"
Jan 31 09:23:27 crc kubenswrapper[4732]: E0131 09:23:27.773452 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"341aa4e395d7929014e446a8184c4ab222c0434acdf71dbf1596f5fc5c67f4a8\": container with ID starting with 341aa4e395d7929014e446a8184c4ab222c0434acdf71dbf1596f5fc5c67f4a8 not found: ID does not exist" containerID="341aa4e395d7929014e446a8184c4ab222c0434acdf71dbf1596f5fc5c67f4a8"
Jan 31 09:23:27 crc kubenswrapper[4732]: I0131 09:23:27.773513 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"341aa4e395d7929014e446a8184c4ab222c0434acdf71dbf1596f5fc5c67f4a8"} err="failed to get container status \"341aa4e395d7929014e446a8184c4ab222c0434acdf71dbf1596f5fc5c67f4a8\": rpc error: code = NotFound desc = could not find container \"341aa4e395d7929014e446a8184c4ab222c0434acdf71dbf1596f5fc5c67f4a8\": container with ID starting with 341aa4e395d7929014e446a8184c4ab222c0434acdf71dbf1596f5fc5c67f4a8 not found: ID does not exist"
Jan 31 09:23:27 crc kubenswrapper[4732]: I0131 09:23:27.773546 4732 scope.go:117] "RemoveContainer" containerID="2d6ada7bb6e0aa8ec000e73dc78802992d92ff017f59a57893472c54d628fcd1"
Jan 31 09:23:27 crc kubenswrapper[4732]: E0131 09:23:27.774424 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d6ada7bb6e0aa8ec000e73dc78802992d92ff017f59a57893472c54d628fcd1\": container with ID starting with 2d6ada7bb6e0aa8ec000e73dc78802992d92ff017f59a57893472c54d628fcd1 not found: ID does not exist" containerID="2d6ada7bb6e0aa8ec000e73dc78802992d92ff017f59a57893472c54d628fcd1"
Jan 31 09:23:27 crc kubenswrapper[4732]: I0131 09:23:27.774459 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d6ada7bb6e0aa8ec000e73dc78802992d92ff017f59a57893472c54d628fcd1"} err="failed to get container status \"2d6ada7bb6e0aa8ec000e73dc78802992d92ff017f59a57893472c54d628fcd1\": rpc error: code = NotFound desc = could not find container \"2d6ada7bb6e0aa8ec000e73dc78802992d92ff017f59a57893472c54d628fcd1\": container with ID starting with 2d6ada7bb6e0aa8ec000e73dc78802992d92ff017f59a57893472c54d628fcd1 not found: ID does not exist"
Jan 31 09:23:27 crc kubenswrapper[4732]: I0131 09:23:27.774480 4732 scope.go:117] "RemoveContainer" containerID="bf4dd15bf1d89a707a1cc67b69c360511a62db3a9d76bcd538b729b470197a50"
Jan 31 09:23:27 crc kubenswrapper[4732]: I0131 09:23:27.813113 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/openstack-galera-2"]
Jan 31 09:23:27 crc kubenswrapper[4732]: I0131 09:23:27.820294 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/openstack-galera-2"]
Jan 31 09:23:27 crc kubenswrapper[4732]: I0131 09:23:27.825409 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/rabbitmq-server-0" podUID="dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd" containerName="rabbitmq" containerID="cri-o://5671dbb66e09093ec1e8398957e4f56f878e7982d9afc91bf35fc2570b73092d" gracePeriod=604800
Jan 31 09:23:27 crc kubenswrapper[4732]: I0131 09:23:27.830845 4732 scope.go:117] "RemoveContainer" containerID="e5b6220dc9c37c2ae0ded221270ed7b6d934a81efa02b526750d6847a367f930"
Jan 31 09:23:27 crc kubenswrapper[4732]: I0131 09:23:27.835803 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/barbican-worker-597f49d4f-8nh24"]
Jan 31 09:23:27 crc kubenswrapper[4732]: I0131 09:23:27.841689 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/barbican-worker-597f49d4f-8nh24"]
Jan 31 09:23:27 crc kubenswrapper[4732]: I0131 09:23:27.860203 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/root-account-create-update-x22v8"]
Jan 31 09:23:27 crc kubenswrapper[4732]: I0131 09:23:27.867949 4732 scope.go:117] "RemoveContainer" containerID="9bf2a11eb61aee168266496fbc63cf87fd0863d3de9bd60d727904eb9b9f54bf"
Jan 31 09:23:27 crc kubenswrapper[4732]: I0131 09:23:27.869381 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/root-account-create-update-x22v8"]
Jan 31 09:23:27 crc kubenswrapper[4732]: I0131 09:23:27.956757 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/memcached-0"
Jan 31 09:23:27 crc kubenswrapper[4732]: I0131 09:23:27.977547 4732 scope.go:117] "RemoveContainer" containerID="61654a37d88891f02e083badeb8b5cfe93229dadee0e8b7d4de9237b19518962"
Jan 31 09:23:28 crc kubenswrapper[4732]: I0131 09:23:28.042002 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvmz6\" (UniqueName: \"kubernetes.io/projected/c0d4fa62-a33c-4ab2-a446-697994c1541e-kube-api-access-rvmz6\") pod \"c0d4fa62-a33c-4ab2-a446-697994c1541e\" (UID: \"c0d4fa62-a33c-4ab2-a446-697994c1541e\") "
Jan 31 09:23:28 crc kubenswrapper[4732]: I0131 09:23:28.042037 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c0d4fa62-a33c-4ab2-a446-697994c1541e-kolla-config\") pod \"c0d4fa62-a33c-4ab2-a446-697994c1541e\" (UID: \"c0d4fa62-a33c-4ab2-a446-697994c1541e\") "
Jan 31 09:23:28 crc kubenswrapper[4732]: I0131 09:23:28.042064 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c0d4fa62-a33c-4ab2-a446-697994c1541e-config-data\") pod \"c0d4fa62-a33c-4ab2-a446-697994c1541e\" (UID: \"c0d4fa62-a33c-4ab2-a446-697994c1541e\") "
Jan 31 09:23:28 crc kubenswrapper[4732]: I0131 09:23:28.042319 4732 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/17f70688-a1f8-4465-821b-48f5381ff96c-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 31 09:23:28 crc kubenswrapper[4732]: I0131 09:23:28.042329 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wpw4\" (UniqueName: \"kubernetes.io/projected/17f70688-a1f8-4465-821b-48f5381ff96c-kube-api-access-4wpw4\") on node \"crc\" DevicePath \"\""
Jan 31 09:23:28 crc kubenswrapper[4732]: I0131 09:23:28.042758 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0d4fa62-a33c-4ab2-a446-697994c1541e-config-data" (OuterVolumeSpecName: "config-data") pod "c0d4fa62-a33c-4ab2-a446-697994c1541e" (UID: "c0d4fa62-a33c-4ab2-a446-697994c1541e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 31 09:23:28 crc kubenswrapper[4732]: I0131 09:23:28.043420 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0d4fa62-a33c-4ab2-a446-697994c1541e-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "c0d4fa62-a33c-4ab2-a446-697994c1541e" (UID: "c0d4fa62-a33c-4ab2-a446-697994c1541e"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 31 09:23:28 crc kubenswrapper[4732]: I0131 09:23:28.045342 4732 scope.go:117] "RemoveContainer" containerID="9bf2a11eb61aee168266496fbc63cf87fd0863d3de9bd60d727904eb9b9f54bf"
Jan 31 09:23:28 crc kubenswrapper[4732]: E0131 09:23:28.048974 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9bf2a11eb61aee168266496fbc63cf87fd0863d3de9bd60d727904eb9b9f54bf\": container with ID starting with 9bf2a11eb61aee168266496fbc63cf87fd0863d3de9bd60d727904eb9b9f54bf not found: ID does not exist" containerID="9bf2a11eb61aee168266496fbc63cf87fd0863d3de9bd60d727904eb9b9f54bf"
Jan 31 09:23:28 crc kubenswrapper[4732]: I0131 09:23:28.049010 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9bf2a11eb61aee168266496fbc63cf87fd0863d3de9bd60d727904eb9b9f54bf"} err="failed to get container status \"9bf2a11eb61aee168266496fbc63cf87fd0863d3de9bd60d727904eb9b9f54bf\": rpc error: code = NotFound desc = could not find container \"9bf2a11eb61aee168266496fbc63cf87fd0863d3de9bd60d727904eb9b9f54bf\": container with ID starting with 9bf2a11eb61aee168266496fbc63cf87fd0863d3de9bd60d727904eb9b9f54bf not found: ID does not exist"
Jan 31 09:23:28 crc kubenswrapper[4732]: I0131 09:23:28.049034 4732 scope.go:117] "RemoveContainer" containerID="61654a37d88891f02e083badeb8b5cfe93229dadee0e8b7d4de9237b19518962"
Jan 31 09:23:28 crc kubenswrapper[4732]: E0131 09:23:28.050624 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61654a37d88891f02e083badeb8b5cfe93229dadee0e8b7d4de9237b19518962\": container with ID starting with 61654a37d88891f02e083badeb8b5cfe93229dadee0e8b7d4de9237b19518962 not found: ID does not exist" containerID="61654a37d88891f02e083badeb8b5cfe93229dadee0e8b7d4de9237b19518962"
Jan 31 09:23:28 crc kubenswrapper[4732]: I0131 09:23:28.050644 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61654a37d88891f02e083badeb8b5cfe93229dadee0e8b7d4de9237b19518962"} err="failed to get container status \"61654a37d88891f02e083badeb8b5cfe93229dadee0e8b7d4de9237b19518962\": rpc error: code = NotFound desc = could not find container \"61654a37d88891f02e083badeb8b5cfe93229dadee0e8b7d4de9237b19518962\": container with ID starting with 61654a37d88891f02e083badeb8b5cfe93229dadee0e8b7d4de9237b19518962 not found: ID does not exist"
Jan 31 09:23:28 crc kubenswrapper[4732]: I0131 09:23:28.062057 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0d4fa62-a33c-4ab2-a446-697994c1541e-kube-api-access-rvmz6" (OuterVolumeSpecName: "kube-api-access-rvmz6") pod "c0d4fa62-a33c-4ab2-a446-697994c1541e" (UID: "c0d4fa62-a33c-4ab2-a446-697994c1541e"). InnerVolumeSpecName "kube-api-access-rvmz6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 09:23:28 crc kubenswrapper[4732]: I0131 09:23:28.095396 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/openstack-galera-1" podUID="0682a582-79d6-4286-9a43-e4a258dde73f" containerName="galera" containerID="cri-o://ee4de6ba1b0a29ba82881c9260150c9bf01cda625efde0b6a945316636e5517e" gracePeriod=28
Jan 31 09:23:28 crc kubenswrapper[4732]: I0131 09:23:28.143717 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvmz6\" (UniqueName: \"kubernetes.io/projected/c0d4fa62-a33c-4ab2-a446-697994c1541e-kube-api-access-rvmz6\") on node \"crc\" DevicePath \"\""
Jan 31 09:23:28 crc kubenswrapper[4732]: I0131 09:23:28.143748 4732 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c0d4fa62-a33c-4ab2-a446-697994c1541e-kolla-config\") on node \"crc\" DevicePath \"\""
Jan 31 09:23:28 crc kubenswrapper[4732]: I0131 09:23:28.143759 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c0d4fa62-a33c-4ab2-a446-697994c1541e-config-data\") on node \"crc\" DevicePath \"\""
Jan 31 09:23:28 crc kubenswrapper[4732]: E0131 09:23:28.346779 4732 configmap.go:193] Couldn't get configMap swift-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found
Jan 31 09:23:28 crc kubenswrapper[4732]: E0131 09:23:28.346843 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1654407d-7276-4839-839d-1244759c4ad2-operator-scripts podName:1654407d-7276-4839-839d-1244759c4ad2 nodeName:}" failed. No retries permitted until 2026-01-31 09:23:30.346827901 +0000 UTC m=+1348.652704105 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/1654407d-7276-4839-839d-1244759c4ad2-operator-scripts") pod "keystone5b86-account-delete-qkt4p" (UID: "1654407d-7276-4839-839d-1244759c4ad2") : configmap "openstack-scripts" not found
Jan 31 09:23:28 crc kubenswrapper[4732]: I0131 09:23:28.432837 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/keystone-7959cb4f8b-4vxsg"
Jan 31 09:23:28 crc kubenswrapper[4732]: I0131 09:23:28.551607 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17f70688-a1f8-4465-821b-48f5381ff96c" path="/var/lib/kubelet/pods/17f70688-a1f8-4465-821b-48f5381ff96c/volumes"
Jan 31 09:23:28 crc kubenswrapper[4732]: I0131 09:23:28.551962 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4981d9a9-898f-49ff-809d-58c7ca3bd2a3" path="/var/lib/kubelet/pods/4981d9a9-898f-49ff-809d-58c7ca3bd2a3/volumes"
Jan 31 09:23:28 crc kubenswrapper[4732]: I0131 09:23:28.552514 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3b1cc40-9985-45d8-bb06-0676ff188c6c" path="/var/lib/kubelet/pods/d3b1cc40-9985-45d8-bb06-0676ff188c6c/volumes"
Jan 31 09:23:28 crc kubenswrapper[4732]: I0131 09:23:28.553657 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7eb0179-b292-4a09-a07d-3d9bfe7978f3" path="/var/lib/kubelet/pods/f7eb0179-b292-4a09-a07d-3d9bfe7978f3/volumes"
Jan 31 09:23:28 crc kubenswrapper[4732]: I0131 09:23:28.556054 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ee84530-efd7-4d83-9aa2-fb9b8b178496-config-data\") pod \"1ee84530-efd7-4d83-9aa2-fb9b8b178496\" (UID: \"1ee84530-efd7-4d83-9aa2-fb9b8b178496\") "
Jan 31 09:23:28 crc kubenswrapper[4732]: I0131 09:23:28.556086 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1ee84530-efd7-4d83-9aa2-fb9b8b178496-fernet-keys\") pod \"1ee84530-efd7-4d83-9aa2-fb9b8b178496\" (UID: \"1ee84530-efd7-4d83-9aa2-fb9b8b178496\") "
Jan 31 09:23:28 crc kubenswrapper[4732]: I0131 09:23:28.556103 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ee84530-efd7-4d83-9aa2-fb9b8b178496-scripts\") pod \"1ee84530-efd7-4d83-9aa2-fb9b8b178496\" (UID: \"1ee84530-efd7-4d83-9aa2-fb9b8b178496\") "
Jan 31 09:23:28 crc kubenswrapper[4732]: I0131 09:23:28.556265 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1ee84530-efd7-4d83-9aa2-fb9b8b178496-credential-keys\") pod \"1ee84530-efd7-4d83-9aa2-fb9b8b178496\" (UID: \"1ee84530-efd7-4d83-9aa2-fb9b8b178496\") "
Jan 31 09:23:28 crc kubenswrapper[4732]: I0131 09:23:28.556307 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jsj4z\" (UniqueName: \"kubernetes.io/projected/1ee84530-efd7-4d83-9aa2-fb9b8b178496-kube-api-access-jsj4z\") pod \"1ee84530-efd7-4d83-9aa2-fb9b8b178496\" (UID: \"1ee84530-efd7-4d83-9aa2-fb9b8b178496\") "
Jan 31 09:23:28 crc kubenswrapper[4732]: I0131 09:23:28.560887 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ee84530-efd7-4d83-9aa2-fb9b8b178496-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "1ee84530-efd7-4d83-9aa2-fb9b8b178496" (UID: "1ee84530-efd7-4d83-9aa2-fb9b8b178496"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 09:23:28 crc kubenswrapper[4732]: I0131 09:23:28.561242 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ee84530-efd7-4d83-9aa2-fb9b8b178496-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "1ee84530-efd7-4d83-9aa2-fb9b8b178496" (UID: "1ee84530-efd7-4d83-9aa2-fb9b8b178496"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 09:23:28 crc kubenswrapper[4732]: I0131 09:23:28.561307 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ee84530-efd7-4d83-9aa2-fb9b8b178496-kube-api-access-jsj4z" (OuterVolumeSpecName: "kube-api-access-jsj4z") pod "1ee84530-efd7-4d83-9aa2-fb9b8b178496" (UID: "1ee84530-efd7-4d83-9aa2-fb9b8b178496"). InnerVolumeSpecName "kube-api-access-jsj4z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 09:23:28 crc kubenswrapper[4732]: I0131 09:23:28.561726 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ee84530-efd7-4d83-9aa2-fb9b8b178496-scripts" (OuterVolumeSpecName: "scripts") pod "1ee84530-efd7-4d83-9aa2-fb9b8b178496" (UID: "1ee84530-efd7-4d83-9aa2-fb9b8b178496"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 09:23:28 crc kubenswrapper[4732]: I0131 09:23:28.575968 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ee84530-efd7-4d83-9aa2-fb9b8b178496-config-data" (OuterVolumeSpecName: "config-data") pod "1ee84530-efd7-4d83-9aa2-fb9b8b178496" (UID: "1ee84530-efd7-4d83-9aa2-fb9b8b178496"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 09:23:28 crc kubenswrapper[4732]: I0131 09:23:28.610641 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/barbican-db-create-hffnv"]
Jan 31 09:23:28 crc kubenswrapper[4732]: I0131 09:23:28.618566 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/barbican-db-create-hffnv"]
Jan 31 09:23:28 crc kubenswrapper[4732]: I0131 09:23:28.627997 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/barbican-f734-account-create-update-wkhs4"]
Jan 31 09:23:28 crc kubenswrapper[4732]: I0131 09:23:28.633501 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/barbicanf734-account-delete-g264z"]
Jan 31 09:23:28 crc kubenswrapper[4732]: I0131 09:23:28.639572 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/barbican-f734-account-create-update-wkhs4"]
Jan 31 09:23:28 crc kubenswrapper[4732]: I0131 09:23:28.644271 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/barbicanf734-account-delete-g264z"]
Jan 31 09:23:28 crc kubenswrapper[4732]: I0131 09:23:28.657783 4732 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1ee84530-efd7-4d83-9aa2-fb9b8b178496-credential-keys\") on node \"crc\" DevicePath \"\""
Jan 31 09:23:28 crc kubenswrapper[4732]: I0131 09:23:28.657831 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jsj4z\" (UniqueName: \"kubernetes.io/projected/1ee84530-efd7-4d83-9aa2-fb9b8b178496-kube-api-access-jsj4z\") on node \"crc\" DevicePath \"\""
Jan 31 09:23:28 crc kubenswrapper[4732]: I0131 09:23:28.657847 4732 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ee84530-efd7-4d83-9aa2-fb9b8b178496-config-data\") on node \"crc\" DevicePath \"\""
Jan 31 09:23:28 crc kubenswrapper[4732]: I0131 09:23:28.657859 4732 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1ee84530-efd7-4d83-9aa2-fb9b8b178496-fernet-keys\") on node \"crc\" DevicePath \"\""
Jan 31 09:23:28 crc kubenswrapper[4732]: I0131 09:23:28.657870 4732 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ee84530-efd7-4d83-9aa2-fb9b8b178496-scripts\") on node \"crc\" DevicePath \"\""
Jan 31 09:23:28 crc kubenswrapper[4732]: I0131 09:23:28.781523 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/memcached-0" event={"ID":"c0d4fa62-a33c-4ab2-a446-697994c1541e","Type":"ContainerDied","Data":"d82037df40928405007453003da8d4a3924f1437dbd107f02f6b845354809fb0"}
Jan 31 09:23:28 crc kubenswrapper[4732]: I0131 09:23:28.781592 4732 scope.go:117] "RemoveContainer" containerID="3bead2528c2ca5acd2f850812b4f5c0dc70486b1bf295a4a4a607b3a99c2b59e"
Jan 31 09:23:28 crc kubenswrapper[4732]: I0131 09:23:28.781545 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/memcached-0"
Jan 31 09:23:28 crc kubenswrapper[4732]: I0131 09:23:28.782820 4732 generic.go:334] "Generic (PLEG): container finished" podID="1ee84530-efd7-4d83-9aa2-fb9b8b178496" containerID="2242e86748f2123e7c2371fd3207440c5237f2bf59a11e74432d9fe62c372932" exitCode=0
Jan 31 09:23:28 crc kubenswrapper[4732]: I0131 09:23:28.782853 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/keystone-7959cb4f8b-4vxsg" event={"ID":"1ee84530-efd7-4d83-9aa2-fb9b8b178496","Type":"ContainerDied","Data":"2242e86748f2123e7c2371fd3207440c5237f2bf59a11e74432d9fe62c372932"}
Jan 31 09:23:28 crc kubenswrapper[4732]: I0131 09:23:28.782996 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/keystone-7959cb4f8b-4vxsg" event={"ID":"1ee84530-efd7-4d83-9aa2-fb9b8b178496","Type":"ContainerDied","Data":"dab5aaa05d738eb5bfa3b09e12be7ced9a61a97b3de7389937699c76857d4ec7"}
Jan 31 09:23:28 crc kubenswrapper[4732]: I0131 09:23:28.782873 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/keystone-7959cb4f8b-4vxsg"
Jan 31 09:23:28 crc kubenswrapper[4732]: I0131 09:23:28.803573 4732 scope.go:117] "RemoveContainer" containerID="2242e86748f2123e7c2371fd3207440c5237f2bf59a11e74432d9fe62c372932"
Jan 31 09:23:28 crc kubenswrapper[4732]: I0131 09:23:28.825742 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/memcached-0"]
Jan 31 09:23:28 crc kubenswrapper[4732]: I0131 09:23:28.828149 4732 scope.go:117] "RemoveContainer" containerID="2242e86748f2123e7c2371fd3207440c5237f2bf59a11e74432d9fe62c372932"
Jan 31 09:23:28 crc kubenswrapper[4732]: E0131 09:23:28.828627 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2242e86748f2123e7c2371fd3207440c5237f2bf59a11e74432d9fe62c372932\": container with ID starting with 2242e86748f2123e7c2371fd3207440c5237f2bf59a11e74432d9fe62c372932 not found: ID does not exist" containerID="2242e86748f2123e7c2371fd3207440c5237f2bf59a11e74432d9fe62c372932"
Jan 31 09:23:28 crc kubenswrapper[4732]: I0131 09:23:28.828966 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2242e86748f2123e7c2371fd3207440c5237f2bf59a11e74432d9fe62c372932"} err="failed to get container status \"2242e86748f2123e7c2371fd3207440c5237f2bf59a11e74432d9fe62c372932\": rpc error: code = NotFound desc = could not find container \"2242e86748f2123e7c2371fd3207440c5237f2bf59a11e74432d9fe62c372932\": container with ID starting with 2242e86748f2123e7c2371fd3207440c5237f2bf59a11e74432d9fe62c372932 not found: ID does not exist"
Jan 31 09:23:28 crc kubenswrapper[4732]: I0131 09:23:28.830713 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/memcached-0"]
Jan 31 09:23:28 crc kubenswrapper[4732]: I0131 09:23:28.838519 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/keystone-7959cb4f8b-4vxsg"]
Jan 31 09:23:28 crc kubenswrapper[4732]: I0131 09:23:28.844762 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/keystone-7959cb4f8b-4vxsg"]
Jan 31 09:23:28 crc kubenswrapper[4732]: I0131 09:23:28.940150 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/swift-operator-controller-manager-6786c8c4f8-x5zqc"]
Jan 31 09:23:28 crc kubenswrapper[4732]: I0131 09:23:28.940458 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/swift-operator-controller-manager-6786c8c4f8-x5zqc" podUID="17f93e90-1e9a-439c-a130-487ebf54ad10" containerName="manager" containerID="cri-o://dc0300becefa39e137a97b82074b1aa44c6492acb1b746336056c96e36e23697" gracePeriod=10
Jan 31 09:23:29 crc kubenswrapper[4732]: I0131 09:23:29.218504 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/swift-operator-index-zhkcf"]
Jan 31 09:23:29 crc kubenswrapper[4732]: I0131 09:23:29.219197 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/swift-operator-index-zhkcf" podUID="24dbbc5e-c4aa-4f6a-b6e6-52b3013443cf" containerName="registry-server" containerID="cri-o://27e7fcfe2e3b7ddaff4e021897bbeb8cb28014e689190c68e97e8b5269f0181f" gracePeriod=30
Jan 31 09:23:29 crc kubenswrapper[4732]: I0131 09:23:29.261551 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/0e62aba39cc8fdb97a45279afc017a13ec54a5c105520c9343a09281d2km4rv"]
Jan 31 09:23:29 crc kubenswrapper[4732]: I0131 09:23:29.268928 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/0e62aba39cc8fdb97a45279afc017a13ec54a5c105520c9343a09281d2km4rv"]
Jan 31 09:23:29 crc kubenswrapper[4732]: I0131 09:23:29.400251 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-6786c8c4f8-x5zqc"
Jan 31 09:23:29 crc kubenswrapper[4732]: I0131 09:23:29.425421 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/rabbitmq-server-0"
Jan 31 09:23:29 crc kubenswrapper[4732]: I0131 09:23:29.570531 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-017be409-2d02-48b0-bb67-3403111bd6b9\") pod \"dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd\" (UID: \"dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd\") "
Jan 31 09:23:29 crc kubenswrapper[4732]: I0131 09:23:29.570577 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd-pod-info\") pod \"dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd\" (UID: \"dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd\") "
Jan 31 09:23:29 crc kubenswrapper[4732]: I0131 09:23:29.570600 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd-erlang-cookie-secret\") pod \"dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd\" (UID: \"dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd\") "
Jan 31 09:23:29 crc kubenswrapper[4732]: I0131 09:23:29.570683 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxmtn\" (UniqueName: \"kubernetes.io/projected/dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd-kube-api-access-xxmtn\") pod \"dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd\" (UID: \"dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd\") "
Jan 31 09:23:29 crc kubenswrapper[4732]: I0131 09:23:29.570702 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd-rabbitmq-confd\") pod \"dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd\" (UID: \"dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd\") "
Jan 31 09:23:29 crc kubenswrapper[4732]: I0131 09:23:29.570720 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/17f93e90-1e9a-439c-a130-487ebf54ad10-webhook-cert\") pod \"17f93e90-1e9a-439c-a130-487ebf54ad10\" (UID: \"17f93e90-1e9a-439c-a130-487ebf54ad10\") "
Jan 31 09:23:29 crc kubenswrapper[4732]: I0131 09:23:29.570749 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd-rabbitmq-erlang-cookie\") pod \"dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd\" (UID: \"dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd\") "
Jan 31 09:23:29 crc kubenswrapper[4732]: I0131 09:23:29.570788 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd-plugins-conf\") pod \"dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd\" (UID: \"dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd\") "
Jan 31 09:23:29 crc kubenswrapper[4732]: I0131 09:23:29.570805 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/17f93e90-1e9a-439c-a130-487ebf54ad10-apiservice-cert\") pod \"17f93e90-1e9a-439c-a130-487ebf54ad10\" (UID: \"17f93e90-1e9a-439c-a130-487ebf54ad10\") "
Jan 31 09:23:29 crc kubenswrapper[4732]: I0131 09:23:29.570827 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd-rabbitmq-plugins\") pod \"dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd\" (UID: \"dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd\") "
Jan 31 09:23:29 crc kubenswrapper[4732]: I0131 09:23:29.570851 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28txj\" (UniqueName: \"kubernetes.io/projected/17f93e90-1e9a-439c-a130-487ebf54ad10-kube-api-access-28txj\") pod \"17f93e90-1e9a-439c-a130-487ebf54ad10\" (UID: \"17f93e90-1e9a-439c-a130-487ebf54ad10\") "
Jan 31 09:23:29 crc kubenswrapper[4732]: I0131 09:23:29.571445 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd" (UID: "dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 31 09:23:29 crc kubenswrapper[4732]: I0131 09:23:29.572449 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd" (UID: "dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 31 09:23:29 crc kubenswrapper[4732]: I0131 09:23:29.572458 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd" (UID: "dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 31 09:23:29 crc kubenswrapper[4732]: I0131 09:23:29.577204 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd" (UID: "dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 09:23:29 crc kubenswrapper[4732]: I0131 09:23:29.577218 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17f93e90-1e9a-439c-a130-487ebf54ad10-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "17f93e90-1e9a-439c-a130-487ebf54ad10" (UID: "17f93e90-1e9a-439c-a130-487ebf54ad10"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 09:23:29 crc kubenswrapper[4732]: I0131 09:23:29.577260 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd-kube-api-access-xxmtn" (OuterVolumeSpecName: "kube-api-access-xxmtn") pod "dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd" (UID: "dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd"). InnerVolumeSpecName "kube-api-access-xxmtn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 09:23:29 crc kubenswrapper[4732]: I0131 09:23:29.577297 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17f93e90-1e9a-439c-a130-487ebf54ad10-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "17f93e90-1e9a-439c-a130-487ebf54ad10" (UID: "17f93e90-1e9a-439c-a130-487ebf54ad10"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 09:23:29 crc kubenswrapper[4732]: I0131 09:23:29.577317 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17f93e90-1e9a-439c-a130-487ebf54ad10-kube-api-access-28txj" (OuterVolumeSpecName: "kube-api-access-28txj") pod "17f93e90-1e9a-439c-a130-487ebf54ad10" (UID: "17f93e90-1e9a-439c-a130-487ebf54ad10"). InnerVolumeSpecName "kube-api-access-28txj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 09:23:29 crc kubenswrapper[4732]: I0131 09:23:29.578171 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd-pod-info" (OuterVolumeSpecName: "pod-info") pod "dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd" (UID: "dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Jan 31 09:23:29 crc kubenswrapper[4732]: E0131 09:23:29.578302 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-017be409-2d02-48b0-bb67-3403111bd6b9 podName:dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd nodeName:}" failed. No retries permitted until 2026-01-31 09:23:30.078287858 +0000 UTC m=+1348.384164062 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "persistence" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-017be409-2d02-48b0-bb67-3403111bd6b9") pod "dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd" (UID: "dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd") : kubernetes.io/csi: Unmounter.TearDownAt failed: rpc error: code = Unknown desc = check target path: could not get consistent content of /proc/mounts after 3 attempts
Jan 31 09:23:29 crc kubenswrapper[4732]: I0131 09:23:29.579624 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-index-zhkcf"
Jan 31 09:23:29 crc kubenswrapper[4732]: I0131 09:23:29.632513 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd" (UID: "dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 09:23:29 crc kubenswrapper[4732]: I0131 09:23:29.672630 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59577\" (UniqueName: \"kubernetes.io/projected/24dbbc5e-c4aa-4f6a-b6e6-52b3013443cf-kube-api-access-59577\") pod \"24dbbc5e-c4aa-4f6a-b6e6-52b3013443cf\" (UID: \"24dbbc5e-c4aa-4f6a-b6e6-52b3013443cf\") "
Jan 31 09:23:29 crc kubenswrapper[4732]: I0131 09:23:29.673385 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-28txj\" (UniqueName: \"kubernetes.io/projected/17f93e90-1e9a-439c-a130-487ebf54ad10-kube-api-access-28txj\") on node \"crc\" DevicePath \"\""
Jan 31 09:23:29 crc kubenswrapper[4732]: I0131 09:23:29.674135 4732 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd-pod-info\") on node \"crc\" DevicePath \"\""
Jan 31 09:23:29 crc kubenswrapper[4732]: I0131 09:23:29.674156 4732 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd-erlang-cookie-secret\") on node \"crc\" DevicePath \"\""
Jan 31 09:23:29 crc kubenswrapper[4732]: I0131 09:23:29.674168 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xxmtn\" (UniqueName: \"kubernetes.io/projected/dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd-kube-api-access-xxmtn\") on node \"crc\" DevicePath \"\""
Jan 31 09:23:29 crc kubenswrapper[4732]: I0131 09:23:29.674177 4732 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd-rabbitmq-confd\") on node \"crc\" DevicePath \"\""
Jan 31 09:23:29 crc kubenswrapper[4732]: I0131 09:23:29.674186 4732 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/17f93e90-1e9a-439c-a130-487ebf54ad10-webhook-cert\") on node \"crc\" DevicePath \"\""
Jan 31 09:23:29 crc kubenswrapper[4732]: I0131 09:23:29.674195 4732 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\""
Jan 31 09:23:29 crc kubenswrapper[4732]: I0131 09:23:29.674204 4732 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/17f93e90-1e9a-439c-a130-487ebf54ad10-apiservice-cert\") on node \"crc\" DevicePath \"\""
Jan 31 09:23:29 crc kubenswrapper[4732]: I0131 09:23:29.674212 4732 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd-plugins-conf\") on node \"crc\" DevicePath \"\""
Jan 31 09:23:29 crc kubenswrapper[4732]: I0131 09:23:29.674219 4732 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd-rabbitmq-plugins\") on node \"crc\" DevicePath \"\""
Jan 31 09:23:29 crc kubenswrapper[4732]: I0131 09:23:29.676585 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24dbbc5e-c4aa-4f6a-b6e6-52b3013443cf-kube-api-access-59577" (OuterVolumeSpecName: "kube-api-access-59577") pod "24dbbc5e-c4aa-4f6a-b6e6-52b3013443cf" (UID: "24dbbc5e-c4aa-4f6a-b6e6-52b3013443cf"). InnerVolumeSpecName "kube-api-access-59577". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 09:23:29 crc kubenswrapper[4732]: I0131 09:23:29.777161 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59577\" (UniqueName: \"kubernetes.io/projected/24dbbc5e-c4aa-4f6a-b6e6-52b3013443cf-kube-api-access-59577\") on node \"crc\" DevicePath \"\""
Jan 31 09:23:29 crc kubenswrapper[4732]: I0131 09:23:29.807007 4732 generic.go:334] "Generic (PLEG): container finished" podID="17f93e90-1e9a-439c-a130-487ebf54ad10" containerID="dc0300becefa39e137a97b82074b1aa44c6492acb1b746336056c96e36e23697" exitCode=0
Jan 31 09:23:29 crc kubenswrapper[4732]: I0131 09:23:29.807120 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-6786c8c4f8-x5zqc" event={"ID":"17f93e90-1e9a-439c-a130-487ebf54ad10","Type":"ContainerDied","Data":"dc0300becefa39e137a97b82074b1aa44c6492acb1b746336056c96e36e23697"}
Jan 31 09:23:29 crc kubenswrapper[4732]: I0131 09:23:29.807882 4732 scope.go:117] "RemoveContainer" containerID="dc0300becefa39e137a97b82074b1aa44c6492acb1b746336056c96e36e23697"
Jan 31 09:23:29 crc kubenswrapper[4732]: I0131 09:23:29.807900 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-6786c8c4f8-x5zqc"
Jan 31 09:23:29 crc kubenswrapper[4732]: I0131 09:23:29.808719 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-6786c8c4f8-x5zqc" event={"ID":"17f93e90-1e9a-439c-a130-487ebf54ad10","Type":"ContainerDied","Data":"3190f0d353720c75142ebd0bfb06e439e4a2407802386eda349902b5c0a59659"}
Jan 31 09:23:29 crc kubenswrapper[4732]: I0131 09:23:29.810231 4732 generic.go:334] "Generic (PLEG): container finished" podID="24dbbc5e-c4aa-4f6a-b6e6-52b3013443cf" containerID="27e7fcfe2e3b7ddaff4e021897bbeb8cb28014e689190c68e97e8b5269f0181f" exitCode=0
Jan 31 09:23:29 crc kubenswrapper[4732]: I0131 09:23:29.810285 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-index-zhkcf" event={"ID":"24dbbc5e-c4aa-4f6a-b6e6-52b3013443cf","Type":"ContainerDied","Data":"27e7fcfe2e3b7ddaff4e021897bbeb8cb28014e689190c68e97e8b5269f0181f"}
Jan 31 09:23:29 crc kubenswrapper[4732]: I0131 09:23:29.810305 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-index-zhkcf" event={"ID":"24dbbc5e-c4aa-4f6a-b6e6-52b3013443cf","Type":"ContainerDied","Data":"8514ba99bedbc8f5d369f906a000fdaa79e2f95c8cdc60f9e5782dca2dcdc8ab"}
Jan 31 09:23:29 crc kubenswrapper[4732]: I0131 09:23:29.810368 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-index-zhkcf"
Jan 31 09:23:29 crc kubenswrapper[4732]: I0131 09:23:29.813481 4732 generic.go:334] "Generic (PLEG): container finished" podID="dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd" containerID="5671dbb66e09093ec1e8398957e4f56f878e7982d9afc91bf35fc2570b73092d" exitCode=0
Jan 31 09:23:29 crc kubenswrapper[4732]: I0131 09:23:29.813553 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/rabbitmq-server-0" event={"ID":"dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd","Type":"ContainerDied","Data":"5671dbb66e09093ec1e8398957e4f56f878e7982d9afc91bf35fc2570b73092d"}
Jan 31 09:23:29 crc kubenswrapper[4732]: I0131 09:23:29.813570 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/rabbitmq-server-0"
Jan 31 09:23:29 crc kubenswrapper[4732]: I0131 09:23:29.813582 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/rabbitmq-server-0" event={"ID":"dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd","Type":"ContainerDied","Data":"c0069b5700dea7f3177d96246d005bd742c680b2404c8813992998fda3388237"}
Jan 31 09:23:29 crc kubenswrapper[4732]: I0131 09:23:29.852888 4732 scope.go:117] "RemoveContainer" containerID="dc0300becefa39e137a97b82074b1aa44c6492acb1b746336056c96e36e23697"
Jan 31 09:23:29 crc kubenswrapper[4732]: E0131 09:23:29.854376 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc0300becefa39e137a97b82074b1aa44c6492acb1b746336056c96e36e23697\": container with ID starting with dc0300becefa39e137a97b82074b1aa44c6492acb1b746336056c96e36e23697 not found: ID does not exist" containerID="dc0300becefa39e137a97b82074b1aa44c6492acb1b746336056c96e36e23697"
Jan 31 09:23:29 crc kubenswrapper[4732]: I0131 09:23:29.854506 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc0300becefa39e137a97b82074b1aa44c6492acb1b746336056c96e36e23697"} err="failed to get container status \"dc0300becefa39e137a97b82074b1aa44c6492acb1b746336056c96e36e23697\": rpc error: code = NotFound desc = could not find container \"dc0300becefa39e137a97b82074b1aa44c6492acb1b746336056c96e36e23697\": container with ID starting with dc0300becefa39e137a97b82074b1aa44c6492acb1b746336056c96e36e23697 not found: ID does not exist"
Jan 31 09:23:29 crc kubenswrapper[4732]: I0131 09:23:29.854602 4732 scope.go:117] "RemoveContainer" containerID="27e7fcfe2e3b7ddaff4e021897bbeb8cb28014e689190c68e97e8b5269f0181f"
Jan 31 09:23:29 crc kubenswrapper[4732]: I0131 09:23:29.855813 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/swift-operator-controller-manager-6786c8c4f8-x5zqc"]
Jan 31 09:23:29 crc kubenswrapper[4732]: I0131 09:23:29.867010 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/swift-operator-controller-manager-6786c8c4f8-x5zqc"]
Jan 31 09:23:29 crc kubenswrapper[4732]: I0131 09:23:29.869249 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/swift-operator-index-zhkcf"]
Jan 31 09:23:29 crc kubenswrapper[4732]: I0131 09:23:29.873098 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/swift-operator-index-zhkcf"]
Jan 31 09:23:29 crc kubenswrapper[4732]: I0131 09:23:29.875734 4732 scope.go:117] "RemoveContainer" containerID="27e7fcfe2e3b7ddaff4e021897bbeb8cb28014e689190c68e97e8b5269f0181f"
Jan 31 09:23:29 crc kubenswrapper[4732]: E0131 09:23:29.878375 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27e7fcfe2e3b7ddaff4e021897bbeb8cb28014e689190c68e97e8b5269f0181f\": container with ID starting with 27e7fcfe2e3b7ddaff4e021897bbeb8cb28014e689190c68e97e8b5269f0181f not found: ID does not exist" containerID="27e7fcfe2e3b7ddaff4e021897bbeb8cb28014e689190c68e97e8b5269f0181f"
Jan 31 09:23:29 crc kubenswrapper[4732]: I0131 09:23:29.878444 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27e7fcfe2e3b7ddaff4e021897bbeb8cb28014e689190c68e97e8b5269f0181f"} err="failed to get container status \"27e7fcfe2e3b7ddaff4e021897bbeb8cb28014e689190c68e97e8b5269f0181f\": rpc error: code = NotFound desc = could not find container \"27e7fcfe2e3b7ddaff4e021897bbeb8cb28014e689190c68e97e8b5269f0181f\": container with ID starting with 27e7fcfe2e3b7ddaff4e021897bbeb8cb28014e689190c68e97e8b5269f0181f not found: ID does not exist"
Jan 31 09:23:29 crc kubenswrapper[4732]: I0131 09:23:29.878479 4732 scope.go:117] "RemoveContainer" containerID="5671dbb66e09093ec1e8398957e4f56f878e7982d9afc91bf35fc2570b73092d"
Jan 31 09:23:29 crc kubenswrapper[4732]: I0131 09:23:29.919400 4732 scope.go:117] "RemoveContainer" containerID="cb8d487c03d78ac3a07f4803e072d42f9685a260011084fe0b63e98b41a409fb"
Jan 31 09:23:29 crc kubenswrapper[4732]: I0131 09:23:29.945188 4732 scope.go:117] "RemoveContainer" containerID="5671dbb66e09093ec1e8398957e4f56f878e7982d9afc91bf35fc2570b73092d"
Jan 31 09:23:29 crc kubenswrapper[4732]: E0131 09:23:29.945623 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5671dbb66e09093ec1e8398957e4f56f878e7982d9afc91bf35fc2570b73092d\": container with ID starting with 5671dbb66e09093ec1e8398957e4f56f878e7982d9afc91bf35fc2570b73092d not found: ID does not exist" containerID="5671dbb66e09093ec1e8398957e4f56f878e7982d9afc91bf35fc2570b73092d"
Jan 31 09:23:29 crc kubenswrapper[4732]: I0131 09:23:29.945658 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5671dbb66e09093ec1e8398957e4f56f878e7982d9afc91bf35fc2570b73092d"} err="failed to get container status \"5671dbb66e09093ec1e8398957e4f56f878e7982d9afc91bf35fc2570b73092d\": rpc error: code = NotFound desc = could not find container \"5671dbb66e09093ec1e8398957e4f56f878e7982d9afc91bf35fc2570b73092d\": container with ID starting with 5671dbb66e09093ec1e8398957e4f56f878e7982d9afc91bf35fc2570b73092d not found: ID does not exist"
Jan 31 09:23:29 crc kubenswrapper[4732]: I0131 09:23:29.945736 4732 scope.go:117] "RemoveContainer" containerID="cb8d487c03d78ac3a07f4803e072d42f9685a260011084fe0b63e98b41a409fb"
Jan 31 09:23:29 crc kubenswrapper[4732]: E0131 09:23:29.947060 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb8d487c03d78ac3a07f4803e072d42f9685a260011084fe0b63e98b41a409fb\": container with ID starting with cb8d487c03d78ac3a07f4803e072d42f9685a260011084fe0b63e98b41a409fb not found: ID does not exist" containerID="cb8d487c03d78ac3a07f4803e072d42f9685a260011084fe0b63e98b41a409fb"
Jan 31 09:23:29 crc kubenswrapper[4732]: I0131 09:23:29.947086 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb8d487c03d78ac3a07f4803e072d42f9685a260011084fe0b63e98b41a409fb"} err="failed to get container status \"cb8d487c03d78ac3a07f4803e072d42f9685a260011084fe0b63e98b41a409fb\": rpc error: code = NotFound desc = could not find container \"cb8d487c03d78ac3a07f4803e072d42f9685a260011084fe0b63e98b41a409fb\": container with ID starting with cb8d487c03d78ac3a07f4803e072d42f9685a260011084fe0b63e98b41a409fb not found: ID does not exist"
Jan 31 09:23:30 crc kubenswrapper[4732]: I0131 09:23:30.084680 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-017be409-2d02-48b0-bb67-3403111bd6b9\") pod \"dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd\" (UID: \"dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd\") "
Jan 31 09:23:30 crc kubenswrapper[4732]: I0131 09:23:30.089676 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/keystone-db-create-qtddl"]
Jan 31 09:23:30 crc kubenswrapper[4732]: I0131 09:23:30.100633 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/keystone-db-create-qtddl"]
Jan 31 09:23:30 crc kubenswrapper[4732]: I0131 09:23:30.109960 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-017be409-2d02-48b0-bb67-3403111bd6b9" (OuterVolumeSpecName: "persistence") pod "dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd" (UID: "dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd"). InnerVolumeSpecName "pvc-017be409-2d02-48b0-bb67-3403111bd6b9". PluginName "kubernetes.io/csi", VolumeGidValue ""
Jan 31 09:23:30 crc kubenswrapper[4732]: I0131 09:23:30.116767 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/keystone5b86-account-delete-qkt4p"]
Jan 31 09:23:30 crc kubenswrapper[4732]: I0131 09:23:30.116960 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/keystone5b86-account-delete-qkt4p" podUID="1654407d-7276-4839-839d-1244759c4ad2" containerName="mariadb-account-delete" containerID="cri-o://79fac5d2960d6f9193efd13767f68129541f3654fafcc49472d1b08351983069" gracePeriod=30
Jan 31 09:23:30 crc kubenswrapper[4732]: I0131 09:23:30.130014 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/keystone-5b86-account-create-update-vh2kk"]
Jan 31 09:23:30 crc kubenswrapper[4732]: I0131 09:23:30.132245 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/openstack-galera-1"
Jan 31 09:23:30 crc kubenswrapper[4732]: I0131 09:23:30.139463 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/keystone-5b86-account-create-update-vh2kk"]
Jan 31 09:23:30 crc kubenswrapper[4732]: I0131 09:23:30.142424 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/openstack-galera-0" podUID="616eedfe-830a-4ca8-9c42-a2cfd9352312" containerName="galera" containerID="cri-o://8526f310ead3f5a33ed7787edf3905aaec728d5abbfb09eb99dc036ea2fa4511" gracePeriod=26
Jan 31 09:23:30 crc kubenswrapper[4732]: I0131 09:23:30.179844 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/rabbitmq-server-0"]
Jan 31 09:23:30 crc kubenswrapper[4732]: I0131 09:23:30.184898 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/rabbitmq-server-0"]
Jan 31 09:23:30 crc kubenswrapper[4732]: I0131 09:23:30.190562 4732 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-017be409-2d02-48b0-bb67-3403111bd6b9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-017be409-2d02-48b0-bb67-3403111bd6b9\") on node \"crc\" "
Jan 31 09:23:30 crc kubenswrapper[4732]: I0131 09:23:30.214754 4732 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Jan 31 09:23:30 crc kubenswrapper[4732]: I0131 09:23:30.214909 4732 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-017be409-2d02-48b0-bb67-3403111bd6b9" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-017be409-2d02-48b0-bb67-3403111bd6b9") on node "crc" Jan 31 09:23:30 crc kubenswrapper[4732]: I0131 09:23:30.291796 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0682a582-79d6-4286-9a43-e4a258dde73f-operator-scripts\") pod \"0682a582-79d6-4286-9a43-e4a258dde73f\" (UID: \"0682a582-79d6-4286-9a43-e4a258dde73f\") " Jan 31 09:23:30 crc kubenswrapper[4732]: I0131 09:23:30.292098 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0682a582-79d6-4286-9a43-e4a258dde73f-config-data-default\") pod \"0682a582-79d6-4286-9a43-e4a258dde73f\" (UID: \"0682a582-79d6-4286-9a43-e4a258dde73f\") " Jan 31 09:23:30 crc kubenswrapper[4732]: I0131 09:23:30.292231 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"0682a582-79d6-4286-9a43-e4a258dde73f\" (UID: \"0682a582-79d6-4286-9a43-e4a258dde73f\") " Jan 31 09:23:30 crc kubenswrapper[4732]: I0131 09:23:30.292387 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xg64\" (UniqueName: \"kubernetes.io/projected/0682a582-79d6-4286-9a43-e4a258dde73f-kube-api-access-9xg64\") pod \"0682a582-79d6-4286-9a43-e4a258dde73f\" (UID: \"0682a582-79d6-4286-9a43-e4a258dde73f\") " Jan 31 09:23:30 crc kubenswrapper[4732]: I0131 09:23:30.292435 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0682a582-79d6-4286-9a43-e4a258dde73f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0682a582-79d6-4286-9a43-e4a258dde73f" (UID: "0682a582-79d6-4286-9a43-e4a258dde73f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:23:30 crc kubenswrapper[4732]: I0131 09:23:30.292499 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0682a582-79d6-4286-9a43-e4a258dde73f-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "0682a582-79d6-4286-9a43-e4a258dde73f" (UID: "0682a582-79d6-4286-9a43-e4a258dde73f"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:23:30 crc kubenswrapper[4732]: I0131 09:23:30.292542 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0682a582-79d6-4286-9a43-e4a258dde73f-kolla-config\") pod \"0682a582-79d6-4286-9a43-e4a258dde73f\" (UID: \"0682a582-79d6-4286-9a43-e4a258dde73f\") " Jan 31 09:23:30 crc kubenswrapper[4732]: I0131 09:23:30.293113 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0682a582-79d6-4286-9a43-e4a258dde73f-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "0682a582-79d6-4286-9a43-e4a258dde73f" (UID: "0682a582-79d6-4286-9a43-e4a258dde73f"). InnerVolumeSpecName "kolla-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:23:30 crc kubenswrapper[4732]: I0131 09:23:30.293247 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0682a582-79d6-4286-9a43-e4a258dde73f-config-data-generated\") pod \"0682a582-79d6-4286-9a43-e4a258dde73f\" (UID: \"0682a582-79d6-4286-9a43-e4a258dde73f\") " Jan 31 09:23:30 crc kubenswrapper[4732]: I0131 09:23:30.293526 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0682a582-79d6-4286-9a43-e4a258dde73f-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "0682a582-79d6-4286-9a43-e4a258dde73f" (UID: "0682a582-79d6-4286-9a43-e4a258dde73f"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:23:30 crc kubenswrapper[4732]: I0131 09:23:30.293728 4732 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0682a582-79d6-4286-9a43-e4a258dde73f-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 31 09:23:30 crc kubenswrapper[4732]: I0131 09:23:30.293807 4732 reconciler_common.go:293] "Volume detached for volume \"pvc-017be409-2d02-48b0-bb67-3403111bd6b9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-017be409-2d02-48b0-bb67-3403111bd6b9\") on node \"crc\" DevicePath \"\"" Jan 31 09:23:30 crc kubenswrapper[4732]: I0131 09:23:30.293867 4732 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0682a582-79d6-4286-9a43-e4a258dde73f-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:23:30 crc kubenswrapper[4732]: I0131 09:23:30.293926 4732 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0682a582-79d6-4286-9a43-e4a258dde73f-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 31 09:23:30 crc kubenswrapper[4732]: I0131 09:23:30.293980 4732 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0682a582-79d6-4286-9a43-e4a258dde73f-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:23:30 crc kubenswrapper[4732]: I0131 09:23:30.295380 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0682a582-79d6-4286-9a43-e4a258dde73f-kube-api-access-9xg64" (OuterVolumeSpecName: "kube-api-access-9xg64") pod "0682a582-79d6-4286-9a43-e4a258dde73f" (UID: "0682a582-79d6-4286-9a43-e4a258dde73f"). InnerVolumeSpecName "kube-api-access-9xg64". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:23:30 crc kubenswrapper[4732]: I0131 09:23:30.299302 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "mysql-db") pod "0682a582-79d6-4286-9a43-e4a258dde73f" (UID: "0682a582-79d6-4286-9a43-e4a258dde73f"). InnerVolumeSpecName "local-storage04-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 09:23:30 crc kubenswrapper[4732]: I0131 09:23:30.395409 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xg64\" (UniqueName: \"kubernetes.io/projected/0682a582-79d6-4286-9a43-e4a258dde73f-kube-api-access-9xg64\") on node \"crc\" DevicePath \"\"" Jan 31 09:23:30 crc kubenswrapper[4732]: I0131 09:23:30.395753 4732 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Jan 31 09:23:30 crc kubenswrapper[4732]: E0131 09:23:30.395541 4732 configmap.go:193] Couldn't get configMap swift-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 31 09:23:30 crc kubenswrapper[4732]: E0131 09:23:30.395986 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1654407d-7276-4839-839d-1244759c4ad2-operator-scripts podName:1654407d-7276-4839-839d-1244759c4ad2 nodeName:}" failed. No retries permitted until 2026-01-31 09:23:34.395959504 +0000 UTC m=+1352.701835708 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/1654407d-7276-4839-839d-1244759c4ad2-operator-scripts") pod "keystone5b86-account-delete-qkt4p" (UID: "1654407d-7276-4839-839d-1244759c4ad2") : configmap "openstack-scripts" not found Jan 31 09:23:30 crc kubenswrapper[4732]: I0131 09:23:30.408017 4732 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Jan 31 09:23:30 crc kubenswrapper[4732]: I0131 09:23:30.497268 4732 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Jan 31 09:23:30 crc kubenswrapper[4732]: I0131 09:23:30.548959 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17f93e90-1e9a-439c-a130-487ebf54ad10" path="/var/lib/kubelet/pods/17f93e90-1e9a-439c-a130-487ebf54ad10/volumes" Jan 31 09:23:30 crc kubenswrapper[4732]: I0131 09:23:30.549556 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ee84530-efd7-4d83-9aa2-fb9b8b178496" path="/var/lib/kubelet/pods/1ee84530-efd7-4d83-9aa2-fb9b8b178496/volumes" Jan 31 09:23:30 crc kubenswrapper[4732]: I0131 09:23:30.550078 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24dbbc5e-c4aa-4f6a-b6e6-52b3013443cf" path="/var/lib/kubelet/pods/24dbbc5e-c4aa-4f6a-b6e6-52b3013443cf/volumes" Jan 31 09:23:30 crc kubenswrapper[4732]: I0131 09:23:30.550641 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d54d8c7-230a-4831-97a8-d17aef7fa6eb" path="/var/lib/kubelet/pods/6d54d8c7-230a-4831-97a8-d17aef7fa6eb/volumes" Jan 31 09:23:30 crc kubenswrapper[4732]: I0131 09:23:30.551839 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92cd61e3-285c-42b8-b382-b8dde5e934b8" path="/var/lib/kubelet/pods/92cd61e3-285c-42b8-b382-b8dde5e934b8/volumes" Jan 31 09:23:30 crc kubenswrapper[4732]: I0131 09:23:30.552390 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8db73f4-a3fd-4276-87eb-69db3df2adb6" path="/var/lib/kubelet/pods/a8db73f4-a3fd-4276-87eb-69db3df2adb6/volumes" Jan 31 09:23:30 crc kubenswrapper[4732]: I0131 09:23:30.553145 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="bf9e4683-b288-4b28-9b9b-504461c55a4e" path="/var/lib/kubelet/pods/bf9e4683-b288-4b28-9b9b-504461c55a4e/volumes" Jan 31 09:23:30 crc kubenswrapper[4732]: I0131 09:23:30.554216 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0d4fa62-a33c-4ab2-a446-697994c1541e" path="/var/lib/kubelet/pods/c0d4fa62-a33c-4ab2-a446-697994c1541e/volumes" Jan 31 09:23:30 crc kubenswrapper[4732]: I0131 09:23:30.554655 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c86b412a-c376-48cd-b724-77e5fb6c9347" path="/var/lib/kubelet/pods/c86b412a-c376-48cd-b724-77e5fb6c9347/volumes" Jan 31 09:23:30 crc kubenswrapper[4732]: I0131 09:23:30.555606 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd" path="/var/lib/kubelet/pods/dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd/volumes" Jan 31 09:23:30 crc kubenswrapper[4732]: I0131 09:23:30.556164 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5d550c8-968a-4962-9e23-c0c22911913d" path="/var/lib/kubelet/pods/e5d550c8-968a-4962-9e23-c0c22911913d/volumes" Jan 31 09:23:30 crc kubenswrapper[4732]: E0131 09:23:30.661072 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8526f310ead3f5a33ed7787edf3905aaec728d5abbfb09eb99dc036ea2fa4511 is running failed: container process not found" containerID="8526f310ead3f5a33ed7787edf3905aaec728d5abbfb09eb99dc036ea2fa4511" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Jan 31 09:23:30 crc kubenswrapper[4732]: E0131 09:23:30.661426 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8526f310ead3f5a33ed7787edf3905aaec728d5abbfb09eb99dc036ea2fa4511 is running failed: container process not found" containerID="8526f310ead3f5a33ed7787edf3905aaec728d5abbfb09eb99dc036ea2fa4511" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Jan 31 09:23:30 crc kubenswrapper[4732]: E0131 09:23:30.661730 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8526f310ead3f5a33ed7787edf3905aaec728d5abbfb09eb99dc036ea2fa4511 is running failed: container process not found" containerID="8526f310ead3f5a33ed7787edf3905aaec728d5abbfb09eb99dc036ea2fa4511" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Jan 31 09:23:30 crc kubenswrapper[4732]: E0131 09:23:30.661791 4732 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 8526f310ead3f5a33ed7787edf3905aaec728d5abbfb09eb99dc036ea2fa4511 is running failed: container process not found" probeType="Readiness" pod="swift-kuttl-tests/openstack-galera-0" podUID="616eedfe-830a-4ca8-9c42-a2cfd9352312" containerName="galera" Jan 31 09:23:30 crc kubenswrapper[4732]: I0131 09:23:30.804277 4732 util.go:48] "No ready sandbox for pod can be found. 
Jan 31 09:23:30 crc kubenswrapper[4732]: I0131 09:23:30.831862 4732 generic.go:334] "Generic (PLEG): container finished" podID="616eedfe-830a-4ca8-9c42-a2cfd9352312" containerID="8526f310ead3f5a33ed7787edf3905aaec728d5abbfb09eb99dc036ea2fa4511" exitCode=0
Jan 31 09:23:30 crc kubenswrapper[4732]: I0131 09:23:30.831896 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/openstack-galera-0"
Jan 31 09:23:30 crc kubenswrapper[4732]: I0131 09:23:30.831916 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/openstack-galera-0" event={"ID":"616eedfe-830a-4ca8-9c42-a2cfd9352312","Type":"ContainerDied","Data":"8526f310ead3f5a33ed7787edf3905aaec728d5abbfb09eb99dc036ea2fa4511"}
Jan 31 09:23:30 crc kubenswrapper[4732]: I0131 09:23:30.831963 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/openstack-galera-0" event={"ID":"616eedfe-830a-4ca8-9c42-a2cfd9352312","Type":"ContainerDied","Data":"2da8aa3ed8596e5beb1462be0a364f515a0e7e35f648a1fd6f1e41dd41f084dd"}
Jan 31 09:23:30 crc kubenswrapper[4732]: I0131 09:23:30.831983 4732 scope.go:117] "RemoveContainer" containerID="8526f310ead3f5a33ed7787edf3905aaec728d5abbfb09eb99dc036ea2fa4511"
Jan 31 09:23:30 crc kubenswrapper[4732]: I0131 09:23:30.834485 4732 generic.go:334] "Generic (PLEG): container finished" podID="0682a582-79d6-4286-9a43-e4a258dde73f" containerID="ee4de6ba1b0a29ba82881c9260150c9bf01cda625efde0b6a945316636e5517e" exitCode=0
Jan 31 09:23:30 crc kubenswrapper[4732]: I0131 09:23:30.834542 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/openstack-galera-1"
Jan 31 09:23:30 crc kubenswrapper[4732]: I0131 09:23:30.834571 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/openstack-galera-1" event={"ID":"0682a582-79d6-4286-9a43-e4a258dde73f","Type":"ContainerDied","Data":"ee4de6ba1b0a29ba82881c9260150c9bf01cda625efde0b6a945316636e5517e"}
Jan 31 09:23:30 crc kubenswrapper[4732]: I0131 09:23:30.834600 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/openstack-galera-1" event={"ID":"0682a582-79d6-4286-9a43-e4a258dde73f","Type":"ContainerDied","Data":"e0837550f45b62d919804bb06cc6ab1d6a363eb752c3707a21645f710833b1bc"}
Jan 31 09:23:30 crc kubenswrapper[4732]: I0131 09:23:30.860195 4732 scope.go:117] "RemoveContainer" containerID="492d45e71fac2738ff1d26d6c415b6dfd2b83adc568c28c60ba8fc993992d5c4"
Jan 31 09:23:30 crc kubenswrapper[4732]: I0131 09:23:30.862222 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/openstack-galera-1"]
Jan 31 09:23:30 crc kubenswrapper[4732]: I0131 09:23:30.868748 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/openstack-galera-1"]
Jan 31 09:23:30 crc kubenswrapper[4732]: I0131 09:23:30.903785 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/616eedfe-830a-4ca8-9c42-a2cfd9352312-config-data-default\") pod \"616eedfe-830a-4ca8-9c42-a2cfd9352312\" (UID: \"616eedfe-830a-4ca8-9c42-a2cfd9352312\") "
Jan 31 09:23:30 crc kubenswrapper[4732]: I0131 09:23:30.903852 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/616eedfe-830a-4ca8-9c42-a2cfd9352312-kolla-config\") pod \"616eedfe-830a-4ca8-9c42-a2cfd9352312\" (UID: \"616eedfe-830a-4ca8-9c42-a2cfd9352312\") "
Jan 31 09:23:30 crc kubenswrapper[4732]: I0131 09:23:30.903903 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/616eedfe-830a-4ca8-9c42-a2cfd9352312-config-data-generated\") pod \"616eedfe-830a-4ca8-9c42-a2cfd9352312\" (UID: \"616eedfe-830a-4ca8-9c42-a2cfd9352312\") "
Jan 31 09:23:30 crc kubenswrapper[4732]: I0131 09:23:30.903927 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/616eedfe-830a-4ca8-9c42-a2cfd9352312-operator-scripts\") pod \"616eedfe-830a-4ca8-9c42-a2cfd9352312\" (UID: \"616eedfe-830a-4ca8-9c42-a2cfd9352312\") "
Jan 31 09:23:30 crc kubenswrapper[4732]: I0131 09:23:30.903971 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"616eedfe-830a-4ca8-9c42-a2cfd9352312\" (UID: \"616eedfe-830a-4ca8-9c42-a2cfd9352312\") "
Jan 31 09:23:30 crc kubenswrapper[4732]: I0131 09:23:30.904014 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jnkrx\" (UniqueName: \"kubernetes.io/projected/616eedfe-830a-4ca8-9c42-a2cfd9352312-kube-api-access-jnkrx\") pod \"616eedfe-830a-4ca8-9c42-a2cfd9352312\" (UID: \"616eedfe-830a-4ca8-9c42-a2cfd9352312\") "
Jan 31 09:23:30 crc kubenswrapper[4732]: I0131 09:23:30.904517 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/616eedfe-830a-4ca8-9c42-a2cfd9352312-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "616eedfe-830a-4ca8-9c42-a2cfd9352312" (UID: "616eedfe-830a-4ca8-9c42-a2cfd9352312"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 31 09:23:30 crc kubenswrapper[4732]: I0131 09:23:30.904880 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/616eedfe-830a-4ca8-9c42-a2cfd9352312-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "616eedfe-830a-4ca8-9c42-a2cfd9352312" (UID: "616eedfe-830a-4ca8-9c42-a2cfd9352312"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 31 09:23:30 crc kubenswrapper[4732]: I0131 09:23:30.904963 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/616eedfe-830a-4ca8-9c42-a2cfd9352312-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "616eedfe-830a-4ca8-9c42-a2cfd9352312" (UID: "616eedfe-830a-4ca8-9c42-a2cfd9352312"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 31 09:23:30 crc kubenswrapper[4732]: I0131 09:23:30.905095 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/616eedfe-830a-4ca8-9c42-a2cfd9352312-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "616eedfe-830a-4ca8-9c42-a2cfd9352312" (UID: "616eedfe-830a-4ca8-9c42-a2cfd9352312"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 31 09:23:30 crc kubenswrapper[4732]: I0131 09:23:30.908713 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/616eedfe-830a-4ca8-9c42-a2cfd9352312-kube-api-access-jnkrx" (OuterVolumeSpecName: "kube-api-access-jnkrx") pod "616eedfe-830a-4ca8-9c42-a2cfd9352312" (UID: "616eedfe-830a-4ca8-9c42-a2cfd9352312"). InnerVolumeSpecName "kube-api-access-jnkrx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 09:23:30 crc kubenswrapper[4732]: I0131 09:23:30.914769 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "mysql-db") pod "616eedfe-830a-4ca8-9c42-a2cfd9352312" (UID: "616eedfe-830a-4ca8-9c42-a2cfd9352312"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Jan 31 09:23:30 crc kubenswrapper[4732]: I0131 09:23:30.915351 4732 scope.go:117] "RemoveContainer" containerID="8526f310ead3f5a33ed7787edf3905aaec728d5abbfb09eb99dc036ea2fa4511"
Jan 31 09:23:30 crc kubenswrapper[4732]: E0131 09:23:30.916202 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8526f310ead3f5a33ed7787edf3905aaec728d5abbfb09eb99dc036ea2fa4511\": container with ID starting with 8526f310ead3f5a33ed7787edf3905aaec728d5abbfb09eb99dc036ea2fa4511 not found: ID does not exist" containerID="8526f310ead3f5a33ed7787edf3905aaec728d5abbfb09eb99dc036ea2fa4511"
Jan 31 09:23:30 crc kubenswrapper[4732]: I0131 09:23:30.916243 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8526f310ead3f5a33ed7787edf3905aaec728d5abbfb09eb99dc036ea2fa4511"} err="failed to get container status \"8526f310ead3f5a33ed7787edf3905aaec728d5abbfb09eb99dc036ea2fa4511\": rpc error: code = NotFound desc = could not find container \"8526f310ead3f5a33ed7787edf3905aaec728d5abbfb09eb99dc036ea2fa4511\": container with ID starting with 8526f310ead3f5a33ed7787edf3905aaec728d5abbfb09eb99dc036ea2fa4511 not found: ID does not exist"
Jan 31 09:23:30 crc kubenswrapper[4732]: I0131 09:23:30.916271 4732 scope.go:117] "RemoveContainer" containerID="492d45e71fac2738ff1d26d6c415b6dfd2b83adc568c28c60ba8fc993992d5c4"
Jan 31 09:23:30 crc kubenswrapper[4732]: E0131 09:23:30.916778 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"492d45e71fac2738ff1d26d6c415b6dfd2b83adc568c28c60ba8fc993992d5c4\": container with ID starting with 492d45e71fac2738ff1d26d6c415b6dfd2b83adc568c28c60ba8fc993992d5c4 not found: ID does not exist" containerID="492d45e71fac2738ff1d26d6c415b6dfd2b83adc568c28c60ba8fc993992d5c4"
Jan 31 09:23:30 crc kubenswrapper[4732]: I0131 09:23:30.916827 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"492d45e71fac2738ff1d26d6c415b6dfd2b83adc568c28c60ba8fc993992d5c4"} err="failed to get container status \"492d45e71fac2738ff1d26d6c415b6dfd2b83adc568c28c60ba8fc993992d5c4\": rpc error: code = NotFound desc = could not find container \"492d45e71fac2738ff1d26d6c415b6dfd2b83adc568c28c60ba8fc993992d5c4\": container with ID starting with 492d45e71fac2738ff1d26d6c415b6dfd2b83adc568c28c60ba8fc993992d5c4 not found: ID does not exist"
Jan 31 09:23:30 crc kubenswrapper[4732]: I0131 09:23:30.917036 4732 scope.go:117] "RemoveContainer" containerID="ee4de6ba1b0a29ba82881c9260150c9bf01cda625efde0b6a945316636e5517e"
containerID="ee4de6ba1b0a29ba82881c9260150c9bf01cda625efde0b6a945316636e5517e" Jan 31 09:23:30 crc kubenswrapper[4732]: I0131 09:23:30.981675 4732 scope.go:117] "RemoveContainer" containerID="ce97edc32bdb9fb197af077948d670e3a55f9f7d022545ce9a9c32c0c8222279" Jan 31 09:23:31 crc kubenswrapper[4732]: I0131 09:23:31.004010 4732 scope.go:117] "RemoveContainer" containerID="ee4de6ba1b0a29ba82881c9260150c9bf01cda625efde0b6a945316636e5517e" Jan 31 09:23:31 crc kubenswrapper[4732]: E0131 09:23:31.004536 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee4de6ba1b0a29ba82881c9260150c9bf01cda625efde0b6a945316636e5517e\": container with ID starting with ee4de6ba1b0a29ba82881c9260150c9bf01cda625efde0b6a945316636e5517e not found: ID does not exist" containerID="ee4de6ba1b0a29ba82881c9260150c9bf01cda625efde0b6a945316636e5517e" Jan 31 09:23:31 crc kubenswrapper[4732]: I0131 09:23:31.004575 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee4de6ba1b0a29ba82881c9260150c9bf01cda625efde0b6a945316636e5517e"} err="failed to get container status \"ee4de6ba1b0a29ba82881c9260150c9bf01cda625efde0b6a945316636e5517e\": rpc error: code = NotFound desc = could not find container \"ee4de6ba1b0a29ba82881c9260150c9bf01cda625efde0b6a945316636e5517e\": container with ID starting with ee4de6ba1b0a29ba82881c9260150c9bf01cda625efde0b6a945316636e5517e not found: ID does not exist" Jan 31 09:23:31 crc kubenswrapper[4732]: I0131 09:23:31.004601 4732 scope.go:117] "RemoveContainer" containerID="ce97edc32bdb9fb197af077948d670e3a55f9f7d022545ce9a9c32c0c8222279" Jan 31 09:23:31 crc kubenswrapper[4732]: E0131 09:23:31.004958 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce97edc32bdb9fb197af077948d670e3a55f9f7d022545ce9a9c32c0c8222279\": container with ID starting with ce97edc32bdb9fb197af077948d670e3a55f9f7d022545ce9a9c32c0c8222279 not found: ID does not exist" containerID="ce97edc32bdb9fb197af077948d670e3a55f9f7d022545ce9a9c32c0c8222279" Jan 31 09:23:31 crc kubenswrapper[4732]: I0131 09:23:31.004991 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce97edc32bdb9fb197af077948d670e3a55f9f7d022545ce9a9c32c0c8222279"} err="failed to get container status \"ce97edc32bdb9fb197af077948d670e3a55f9f7d022545ce9a9c32c0c8222279\": rpc error: code = NotFound desc = could not find container \"ce97edc32bdb9fb197af077948d670e3a55f9f7d022545ce9a9c32c0c8222279\": container with ID starting with ce97edc32bdb9fb197af077948d670e3a55f9f7d022545ce9a9c32c0c8222279 not found: ID does not exist" Jan 31 09:23:31 crc kubenswrapper[4732]: I0131 09:23:31.005061 4732 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/616eedfe-830a-4ca8-9c42-a2cfd9352312-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 31 09:23:31 crc kubenswrapper[4732]: I0131 09:23:31.005090 4732 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/616eedfe-830a-4ca8-9c42-a2cfd9352312-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 31 09:23:31 crc kubenswrapper[4732]: I0131 09:23:31.005100 4732 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/616eedfe-830a-4ca8-9c42-a2cfd9352312-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 
31 09:23:31 crc kubenswrapper[4732]: I0131 09:23:31.005131 4732 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Jan 31 09:23:31 crc kubenswrapper[4732]: I0131 09:23:31.005141 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jnkrx\" (UniqueName: \"kubernetes.io/projected/616eedfe-830a-4ca8-9c42-a2cfd9352312-kube-api-access-jnkrx\") on node \"crc\" DevicePath \"\"" Jan 31 09:23:31 crc kubenswrapper[4732]: I0131 09:23:31.005149 4732 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/616eedfe-830a-4ca8-9c42-a2cfd9352312-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 31 09:23:31 crc kubenswrapper[4732]: I0131 09:23:31.017390 4732 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Jan 31 09:23:31 crc kubenswrapper[4732]: I0131 09:23:31.106918 4732 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Jan 31 09:23:31 crc kubenswrapper[4732]: I0131 09:23:31.161733 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/openstack-galera-0"] Jan 31 09:23:31 crc kubenswrapper[4732]: I0131 09:23:31.165425 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/openstack-galera-0"] Jan 31 09:23:32 crc kubenswrapper[4732]: I0131 09:23:32.096809 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-665b569d9f-rjs9c"] Jan 31 09:23:32 crc kubenswrapper[4732]: I0131 09:23:32.097020 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/barbican-operator-controller-manager-665b569d9f-rjs9c" podUID="4ef2e0aa-0782-4814-9d5f-9a6a32fb121b" containerName="manager" containerID="cri-o://613e26fde16172e6df7958455ae9310666a808f14ae1a1a48a42003388cc5dc1" gracePeriod=10 Jan 31 09:23:32 crc kubenswrapper[4732]: I0131 09:23:32.314238 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/barbican-operator-index-ps2mw"] Jan 31 09:23:32 crc kubenswrapper[4732]: I0131 09:23:32.314718 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/barbican-operator-index-ps2mw" podUID="a6b3a350-8b41-44a0-a6b4-b957947e1df6" containerName="registry-server" containerID="cri-o://6cf31f543f940bfb1886623400213cf6174f8dfa77f1c5bc43119a36a85bd192" gracePeriod=30 Jan 31 09:23:32 crc kubenswrapper[4732]: I0131 09:23:32.333618 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/55c7a49163ba348c10e2be21119f4ca8799dffa34873699cfe8f8b6d7bcwsln"] Jan 31 09:23:32 crc kubenswrapper[4732]: I0131 09:23:32.342373 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/55c7a49163ba348c10e2be21119f4ca8799dffa34873699cfe8f8b6d7bcwsln"] Jan 31 09:23:32 crc kubenswrapper[4732]: E0131 09:23:32.384400 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6cf31f543f940bfb1886623400213cf6174f8dfa77f1c5bc43119a36a85bd192 is running failed: container process not found" 
containerID="6cf31f543f940bfb1886623400213cf6174f8dfa77f1c5bc43119a36a85bd192" cmd=["grpc_health_probe","-addr=:50051"] Jan 31 09:23:32 crc kubenswrapper[4732]: E0131 09:23:32.384846 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6cf31f543f940bfb1886623400213cf6174f8dfa77f1c5bc43119a36a85bd192 is running failed: container process not found" containerID="6cf31f543f940bfb1886623400213cf6174f8dfa77f1c5bc43119a36a85bd192" cmd=["grpc_health_probe","-addr=:50051"] Jan 31 09:23:32 crc kubenswrapper[4732]: E0131 09:23:32.385252 4732 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6cf31f543f940bfb1886623400213cf6174f8dfa77f1c5bc43119a36a85bd192 is running failed: container process not found" containerID="6cf31f543f940bfb1886623400213cf6174f8dfa77f1c5bc43119a36a85bd192" cmd=["grpc_health_probe","-addr=:50051"] Jan 31 09:23:32 crc kubenswrapper[4732]: E0131 09:23:32.385307 4732 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6cf31f543f940bfb1886623400213cf6174f8dfa77f1c5bc43119a36a85bd192 is running failed: container process not found" probeType="Readiness" pod="openstack-operators/barbican-operator-index-ps2mw" podUID="a6b3a350-8b41-44a0-a6b4-b957947e1df6" containerName="registry-server" Jan 31 09:23:32 crc kubenswrapper[4732]: I0131 09:23:32.553978 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0682a582-79d6-4286-9a43-e4a258dde73f" path="/var/lib/kubelet/pods/0682a582-79d6-4286-9a43-e4a258dde73f/volumes" Jan 31 09:23:32 crc kubenswrapper[4732]: I0131 09:23:32.554760 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a9d87a5-c953-483d-8183-0a0b8d4abac9" path="/var/lib/kubelet/pods/5a9d87a5-c953-483d-8183-0a0b8d4abac9/volumes" Jan 31 09:23:32 crc kubenswrapper[4732]: I0131 09:23:32.555589 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="616eedfe-830a-4ca8-9c42-a2cfd9352312" path="/var/lib/kubelet/pods/616eedfe-830a-4ca8-9c42-a2cfd9352312/volumes" Jan 31 09:23:32 crc kubenswrapper[4732]: I0131 09:23:32.568176 4732 util.go:48] "No ready sandbox for pod can be found. 
Jan 31 09:23:32 crc kubenswrapper[4732]: I0131 09:23:32.730876 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4ef2e0aa-0782-4814-9d5f-9a6a32fb121b-webhook-cert\") pod \"4ef2e0aa-0782-4814-9d5f-9a6a32fb121b\" (UID: \"4ef2e0aa-0782-4814-9d5f-9a6a32fb121b\") "
Jan 31 09:23:32 crc kubenswrapper[4732]: I0131 09:23:32.731013 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwr5w\" (UniqueName: \"kubernetes.io/projected/4ef2e0aa-0782-4814-9d5f-9a6a32fb121b-kube-api-access-vwr5w\") pod \"4ef2e0aa-0782-4814-9d5f-9a6a32fb121b\" (UID: \"4ef2e0aa-0782-4814-9d5f-9a6a32fb121b\") "
Jan 31 09:23:32 crc kubenswrapper[4732]: I0131 09:23:32.731055 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4ef2e0aa-0782-4814-9d5f-9a6a32fb121b-apiservice-cert\") pod \"4ef2e0aa-0782-4814-9d5f-9a6a32fb121b\" (UID: \"4ef2e0aa-0782-4814-9d5f-9a6a32fb121b\") "
Jan 31 09:23:32 crc kubenswrapper[4732]: I0131 09:23:32.738197 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ef2e0aa-0782-4814-9d5f-9a6a32fb121b-kube-api-access-vwr5w" (OuterVolumeSpecName: "kube-api-access-vwr5w") pod "4ef2e0aa-0782-4814-9d5f-9a6a32fb121b" (UID: "4ef2e0aa-0782-4814-9d5f-9a6a32fb121b"). InnerVolumeSpecName "kube-api-access-vwr5w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 09:23:32 crc kubenswrapper[4732]: I0131 09:23:32.738594 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ef2e0aa-0782-4814-9d5f-9a6a32fb121b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "4ef2e0aa-0782-4814-9d5f-9a6a32fb121b" (UID: "4ef2e0aa-0782-4814-9d5f-9a6a32fb121b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 09:23:32 crc kubenswrapper[4732]: I0131 09:23:32.743971 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ef2e0aa-0782-4814-9d5f-9a6a32fb121b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "4ef2e0aa-0782-4814-9d5f-9a6a32fb121b" (UID: "4ef2e0aa-0782-4814-9d5f-9a6a32fb121b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 09:23:32 crc kubenswrapper[4732]: I0131 09:23:32.799782 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-index-ps2mw"
Jan 31 09:23:32 crc kubenswrapper[4732]: I0131 09:23:32.832449 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwr5w\" (UniqueName: \"kubernetes.io/projected/4ef2e0aa-0782-4814-9d5f-9a6a32fb121b-kube-api-access-vwr5w\") on node \"crc\" DevicePath \"\""
Jan 31 09:23:32 crc kubenswrapper[4732]: I0131 09:23:32.832486 4732 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4ef2e0aa-0782-4814-9d5f-9a6a32fb121b-apiservice-cert\") on node \"crc\" DevicePath \"\""
Jan 31 09:23:32 crc kubenswrapper[4732]: I0131 09:23:32.832498 4732 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4ef2e0aa-0782-4814-9d5f-9a6a32fb121b-webhook-cert\") on node \"crc\" DevicePath \"\""
Jan 31 09:23:32 crc kubenswrapper[4732]: I0131 09:23:32.856813 4732 generic.go:334] "Generic (PLEG): container finished" podID="4ef2e0aa-0782-4814-9d5f-9a6a32fb121b" containerID="613e26fde16172e6df7958455ae9310666a808f14ae1a1a48a42003388cc5dc1" exitCode=0
Jan 31 09:23:32 crc kubenswrapper[4732]: I0131 09:23:32.856861 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-665b569d9f-rjs9c" event={"ID":"4ef2e0aa-0782-4814-9d5f-9a6a32fb121b","Type":"ContainerDied","Data":"613e26fde16172e6df7958455ae9310666a808f14ae1a1a48a42003388cc5dc1"}
Jan 31 09:23:32 crc kubenswrapper[4732]: I0131 09:23:32.856912 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-665b569d9f-rjs9c" event={"ID":"4ef2e0aa-0782-4814-9d5f-9a6a32fb121b","Type":"ContainerDied","Data":"f54629922c72c3da7a2c23ec9c364e1132ebace0c630660cc9bfe34f06a31d4f"}
Jan 31 09:23:32 crc kubenswrapper[4732]: I0131 09:23:32.856932 4732 scope.go:117] "RemoveContainer" containerID="613e26fde16172e6df7958455ae9310666a808f14ae1a1a48a42003388cc5dc1"
Jan 31 09:23:32 crc kubenswrapper[4732]: I0131 09:23:32.857477 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-665b569d9f-rjs9c"
Jan 31 09:23:32 crc kubenswrapper[4732]: I0131 09:23:32.859490 4732 generic.go:334] "Generic (PLEG): container finished" podID="a6b3a350-8b41-44a0-a6b4-b957947e1df6" containerID="6cf31f543f940bfb1886623400213cf6174f8dfa77f1c5bc43119a36a85bd192" exitCode=0
Jan 31 09:23:32 crc kubenswrapper[4732]: I0131 09:23:32.859519 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-index-ps2mw" event={"ID":"a6b3a350-8b41-44a0-a6b4-b957947e1df6","Type":"ContainerDied","Data":"6cf31f543f940bfb1886623400213cf6174f8dfa77f1c5bc43119a36a85bd192"}
Jan 31 09:23:32 crc kubenswrapper[4732]: I0131 09:23:32.859546 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-index-ps2mw" event={"ID":"a6b3a350-8b41-44a0-a6b4-b957947e1df6","Type":"ContainerDied","Data":"a2b2ec2462bff27c657ef1956671a25bb4c7e593d5773eb486359c1043f19888"}
Jan 31 09:23:32 crc kubenswrapper[4732]: I0131 09:23:32.859601 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-index-ps2mw"
Jan 31 09:23:32 crc kubenswrapper[4732]: I0131 09:23:32.877220 4732 scope.go:117] "RemoveContainer" containerID="613e26fde16172e6df7958455ae9310666a808f14ae1a1a48a42003388cc5dc1"
Jan 31 09:23:32 crc kubenswrapper[4732]: E0131 09:23:32.878027 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"613e26fde16172e6df7958455ae9310666a808f14ae1a1a48a42003388cc5dc1\": container with ID starting with 613e26fde16172e6df7958455ae9310666a808f14ae1a1a48a42003388cc5dc1 not found: ID does not exist" containerID="613e26fde16172e6df7958455ae9310666a808f14ae1a1a48a42003388cc5dc1"
Jan 31 09:23:32 crc kubenswrapper[4732]: I0131 09:23:32.878137 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"613e26fde16172e6df7958455ae9310666a808f14ae1a1a48a42003388cc5dc1"} err="failed to get container status \"613e26fde16172e6df7958455ae9310666a808f14ae1a1a48a42003388cc5dc1\": rpc error: code = NotFound desc = could not find container \"613e26fde16172e6df7958455ae9310666a808f14ae1a1a48a42003388cc5dc1\": container with ID starting with 613e26fde16172e6df7958455ae9310666a808f14ae1a1a48a42003388cc5dc1 not found: ID does not exist"
Jan 31 09:23:32 crc kubenswrapper[4732]: I0131 09:23:32.878226 4732 scope.go:117] "RemoveContainer" containerID="6cf31f543f940bfb1886623400213cf6174f8dfa77f1c5bc43119a36a85bd192"
Jan 31 09:23:32 crc kubenswrapper[4732]: I0131 09:23:32.887278 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-665b569d9f-rjs9c"]
Jan 31 09:23:32 crc kubenswrapper[4732]: I0131 09:23:32.891654 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-665b569d9f-rjs9c"]
Jan 31 09:23:32 crc kubenswrapper[4732]: I0131 09:23:32.893994 4732 scope.go:117] "RemoveContainer" containerID="6cf31f543f940bfb1886623400213cf6174f8dfa77f1c5bc43119a36a85bd192"
Jan 31 09:23:32 crc kubenswrapper[4732]: E0131 09:23:32.894372 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6cf31f543f940bfb1886623400213cf6174f8dfa77f1c5bc43119a36a85bd192\": container with ID starting with 6cf31f543f940bfb1886623400213cf6174f8dfa77f1c5bc43119a36a85bd192 not found: ID does not exist" containerID="6cf31f543f940bfb1886623400213cf6174f8dfa77f1c5bc43119a36a85bd192"
Jan 31 09:23:32 crc kubenswrapper[4732]: I0131 09:23:32.894414 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6cf31f543f940bfb1886623400213cf6174f8dfa77f1c5bc43119a36a85bd192"} err="failed to get container status \"6cf31f543f940bfb1886623400213cf6174f8dfa77f1c5bc43119a36a85bd192\": rpc error: code = NotFound desc = could not find container \"6cf31f543f940bfb1886623400213cf6174f8dfa77f1c5bc43119a36a85bd192\": container with ID starting with 6cf31f543f940bfb1886623400213cf6174f8dfa77f1c5bc43119a36a85bd192 not found: ID does not exist"
Jan 31 09:23:32 crc kubenswrapper[4732]: I0131 09:23:32.933509 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cj79r\" (UniqueName: \"kubernetes.io/projected/a6b3a350-8b41-44a0-a6b4-b957947e1df6-kube-api-access-cj79r\") pod \"a6b3a350-8b41-44a0-a6b4-b957947e1df6\" (UID: \"a6b3a350-8b41-44a0-a6b4-b957947e1df6\") "
Jan 31 09:23:32 crc kubenswrapper[4732]: I0131 09:23:32.936366 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6b3a350-8b41-44a0-a6b4-b957947e1df6-kube-api-access-cj79r" (OuterVolumeSpecName: "kube-api-access-cj79r") pod "a6b3a350-8b41-44a0-a6b4-b957947e1df6" (UID: "a6b3a350-8b41-44a0-a6b4-b957947e1df6"). InnerVolumeSpecName "kube-api-access-cj79r". PluginName "kubernetes.io/projected", VolumeGidValue ""
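The "Generic (PLEG): container finished" entries above come from the pod lifecycle event generator comparing runtime snapshots between relists: a container that was running before and has since exited or vanished yields a ContainerDied event, which SyncLoop then dispatches. A simplified, illustrative diff; the types, shortened IDs, and event shape here are not kubelet's actual definitions:

    package main

    import "fmt"

    type state string

    const (
    	running state = "running"
    	exited  state = "exited"
    )

    type event struct {
    	PodID, ContainerID, Type string
    }

    // relist diffs two snapshots of a pod's containers and emits a
    // ContainerDied event for every container that was running before
    // and is exited or missing now.
    func relist(podID string, prev, curr map[string]state) []event {
    	var out []event
    	for id, was := range prev {
    		now, stillThere := curr[id]
    		if was == running && (!stillThere || now == exited) {
    			out = append(out, event{PodID: podID, ContainerID: id, Type: "ContainerDied"})
    		}
    	}
    	return out
    }

    func main() {
    	prev := map[string]state{"8526f310ead3": running}
    	curr := map[string]state{"8526f310ead3": exited}
    	for _, e := range relist("616eedfe-830a", prev, curr) {
    		fmt.Printf("SyncLoop (PLEG): %+v\n", e)
    	}
    }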
Jan 31 09:23:33 crc kubenswrapper[4732]: I0131 09:23:33.036134 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cj79r\" (UniqueName: \"kubernetes.io/projected/a6b3a350-8b41-44a0-a6b4-b957947e1df6-kube-api-access-cj79r\") on node \"crc\" DevicePath \"\""
Jan 31 09:23:33 crc kubenswrapper[4732]: I0131 09:23:33.233727 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/barbican-operator-index-ps2mw"]
Jan 31 09:23:33 crc kubenswrapper[4732]: I0131 09:23:33.238599 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/barbican-operator-index-ps2mw"]
Jan 31 09:23:34 crc kubenswrapper[4732]: E0131 09:23:34.453469 4732 configmap.go:193] Couldn't get configMap swift-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found
Jan 31 09:23:34 crc kubenswrapper[4732]: E0131 09:23:34.453552 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1654407d-7276-4839-839d-1244759c4ad2-operator-scripts podName:1654407d-7276-4839-839d-1244759c4ad2 nodeName:}" failed. No retries permitted until 2026-01-31 09:23:42.453537958 +0000 UTC m=+1360.759414162 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/1654407d-7276-4839-839d-1244759c4ad2-operator-scripts") pod "keystone5b86-account-delete-qkt4p" (UID: "1654407d-7276-4839-839d-1244759c4ad2") : configmap "openstack-scripts" not found
Jan 31 09:23:34 crc kubenswrapper[4732]: I0131 09:23:34.551629 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ef2e0aa-0782-4814-9d5f-9a6a32fb121b" path="/var/lib/kubelet/pods/4ef2e0aa-0782-4814-9d5f-9a6a32fb121b/volumes"
Jan 31 09:23:34 crc kubenswrapper[4732]: I0131 09:23:34.552376 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6b3a350-8b41-44a0-a6b4-b957947e1df6" path="/var/lib/kubelet/pods/a6b3a350-8b41-44a0-a6b4-b957947e1df6/volumes"
Jan 31 09:23:34 crc kubenswrapper[4732]: I0131 09:23:34.699742 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-57cdd4758c-lh9cq"]
Jan 31 09:23:34 crc kubenswrapper[4732]: I0131 09:23:34.699931 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/keystone-operator-controller-manager-57cdd4758c-lh9cq" podUID="241eae26-3908-40e0-af9c-59b54a6ab1a0" containerName="manager" containerID="cri-o://33284c815fdcf867d64c94279b754de6fc3557337b45dc375a7c56112e2acde1" gracePeriod=10
Jan 31 09:23:34 crc kubenswrapper[4732]: I0131 09:23:34.877803 4732 generic.go:334] "Generic (PLEG): container finished" podID="241eae26-3908-40e0-af9c-59b54a6ab1a0" containerID="33284c815fdcf867d64c94279b754de6fc3557337b45dc375a7c56112e2acde1" exitCode=0
Jan 31 09:23:34 crc kubenswrapper[4732]: I0131 09:23:34.877978 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-57cdd4758c-lh9cq" event={"ID":"241eae26-3908-40e0-af9c-59b54a6ab1a0","Type":"ContainerDied","Data":"33284c815fdcf867d64c94279b754de6fc3557337b45dc375a7c56112e2acde1"}
Jan 31 09:23:34 crc kubenswrapper[4732]: I0131 09:23:34.915182 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-index-hbpcb"]
Jan 31 09:23:34 crc kubenswrapper[4732]: I0131 09:23:34.915366 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/keystone-operator-index-hbpcb" podUID="e617e130-c338-40d1-9a5c-83e925a4e6ed" containerName="registry-server" containerID="cri-o://c4ac04f58cff4deda6adf78a5af0c30957b4f3ca033ed38c187c5a81f58e6b7d" gracePeriod=30
Jan 31 09:23:34 crc kubenswrapper[4732]: I0131 09:23:34.940654 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4efvpfwj"]
Jan 31 09:23:34 crc kubenswrapper[4732]: I0131 09:23:34.945618 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4efvpfwj"]
Jan 31 09:23:35 crc kubenswrapper[4732]: I0131 09:23:35.121177 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-57cdd4758c-lh9cq"
Jan 31 09:23:35 crc kubenswrapper[4732]: I0131 09:23:35.263419 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7m6t\" (UniqueName: \"kubernetes.io/projected/241eae26-3908-40e0-af9c-59b54a6ab1a0-kube-api-access-r7m6t\") pod \"241eae26-3908-40e0-af9c-59b54a6ab1a0\" (UID: \"241eae26-3908-40e0-af9c-59b54a6ab1a0\") "
Jan 31 09:23:35 crc kubenswrapper[4732]: I0131 09:23:35.263551 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/241eae26-3908-40e0-af9c-59b54a6ab1a0-webhook-cert\") pod \"241eae26-3908-40e0-af9c-59b54a6ab1a0\" (UID: \"241eae26-3908-40e0-af9c-59b54a6ab1a0\") "
Jan 31 09:23:35 crc kubenswrapper[4732]: I0131 09:23:35.263581 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/241eae26-3908-40e0-af9c-59b54a6ab1a0-apiservice-cert\") pod \"241eae26-3908-40e0-af9c-59b54a6ab1a0\" (UID: \"241eae26-3908-40e0-af9c-59b54a6ab1a0\") "
Jan 31 09:23:35 crc kubenswrapper[4732]: I0131 09:23:35.271489 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/241eae26-3908-40e0-af9c-59b54a6ab1a0-kube-api-access-r7m6t" (OuterVolumeSpecName: "kube-api-access-r7m6t") pod "241eae26-3908-40e0-af9c-59b54a6ab1a0" (UID: "241eae26-3908-40e0-af9c-59b54a6ab1a0"). InnerVolumeSpecName "kube-api-access-r7m6t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 09:23:35 crc kubenswrapper[4732]: I0131 09:23:35.272226 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/241eae26-3908-40e0-af9c-59b54a6ab1a0-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "241eae26-3908-40e0-af9c-59b54a6ab1a0" (UID: "241eae26-3908-40e0-af9c-59b54a6ab1a0"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 09:23:35 crc kubenswrapper[4732]: I0131 09:23:35.278226 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/241eae26-3908-40e0-af9c-59b54a6ab1a0-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "241eae26-3908-40e0-af9c-59b54a6ab1a0" (UID: "241eae26-3908-40e0-af9c-59b54a6ab1a0"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 09:23:35 crc kubenswrapper[4732]: I0131 09:23:35.300227 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-index-hbpcb"
Jan 31 09:23:35 crc kubenswrapper[4732]: I0131 09:23:35.368205 4732 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/241eae26-3908-40e0-af9c-59b54a6ab1a0-webhook-cert\") on node \"crc\" DevicePath \"\""
Jan 31 09:23:35 crc kubenswrapper[4732]: I0131 09:23:35.368233 4732 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/241eae26-3908-40e0-af9c-59b54a6ab1a0-apiservice-cert\") on node \"crc\" DevicePath \"\""
Jan 31 09:23:35 crc kubenswrapper[4732]: I0131 09:23:35.368245 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7m6t\" (UniqueName: \"kubernetes.io/projected/241eae26-3908-40e0-af9c-59b54a6ab1a0-kube-api-access-r7m6t\") on node \"crc\" DevicePath \"\""
Jan 31 09:23:35 crc kubenswrapper[4732]: I0131 09:23:35.469343 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cnrxr\" (UniqueName: \"kubernetes.io/projected/e617e130-c338-40d1-9a5c-83e925a4e6ed-kube-api-access-cnrxr\") pod \"e617e130-c338-40d1-9a5c-83e925a4e6ed\" (UID: \"e617e130-c338-40d1-9a5c-83e925a4e6ed\") "
Jan 31 09:23:35 crc kubenswrapper[4732]: I0131 09:23:35.472352 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e617e130-c338-40d1-9a5c-83e925a4e6ed-kube-api-access-cnrxr" (OuterVolumeSpecName: "kube-api-access-cnrxr") pod "e617e130-c338-40d1-9a5c-83e925a4e6ed" (UID: "e617e130-c338-40d1-9a5c-83e925a4e6ed"). InnerVolumeSpecName "kube-api-access-cnrxr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 09:23:35 crc kubenswrapper[4732]: I0131 09:23:35.571270 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cnrxr\" (UniqueName: \"kubernetes.io/projected/e617e130-c338-40d1-9a5c-83e925a4e6ed-kube-api-access-cnrxr\") on node \"crc\" DevicePath \"\""
Jan 31 09:23:35 crc kubenswrapper[4732]: I0131 09:23:35.888101 4732 generic.go:334] "Generic (PLEG): container finished" podID="e617e130-c338-40d1-9a5c-83e925a4e6ed" containerID="c4ac04f58cff4deda6adf78a5af0c30957b4f3ca033ed38c187c5a81f58e6b7d" exitCode=0
Jan 31 09:23:35 crc kubenswrapper[4732]: I0131 09:23:35.888151 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-hbpcb" event={"ID":"e617e130-c338-40d1-9a5c-83e925a4e6ed","Type":"ContainerDied","Data":"c4ac04f58cff4deda6adf78a5af0c30957b4f3ca033ed38c187c5a81f58e6b7d"}
Jan 31 09:23:35 crc kubenswrapper[4732]: I0131 09:23:35.888200 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-hbpcb" event={"ID":"e617e130-c338-40d1-9a5c-83e925a4e6ed","Type":"ContainerDied","Data":"6b9559657e742c2dca3e80a6f30ed1e3ae9bae7e31be4ed1e6ca772141139c64"}
Jan 31 09:23:35 crc kubenswrapper[4732]: I0131 09:23:35.888201 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-index-hbpcb"
Jan 31 09:23:35 crc kubenswrapper[4732]: I0131 09:23:35.888219 4732 scope.go:117] "RemoveContainer" containerID="c4ac04f58cff4deda6adf78a5af0c30957b4f3ca033ed38c187c5a81f58e6b7d"
Jan 31 09:23:35 crc kubenswrapper[4732]: I0131 09:23:35.890044 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-57cdd4758c-lh9cq" event={"ID":"241eae26-3908-40e0-af9c-59b54a6ab1a0","Type":"ContainerDied","Data":"0967cf246a03d050c7636d21dd8ba1334433e9343155d38efb7a34b23ed4beb6"}
Jan 31 09:23:35 crc kubenswrapper[4732]: I0131 09:23:35.890122 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-57cdd4758c-lh9cq"
Jan 31 09:23:35 crc kubenswrapper[4732]: I0131 09:23:35.936941 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-index-hbpcb"]
Jan 31 09:23:35 crc kubenswrapper[4732]: I0131 09:23:35.937360 4732 scope.go:117] "RemoveContainer" containerID="c4ac04f58cff4deda6adf78a5af0c30957b4f3ca033ed38c187c5a81f58e6b7d"
Jan 31 09:23:35 crc kubenswrapper[4732]: E0131 09:23:35.937862 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4ac04f58cff4deda6adf78a5af0c30957b4f3ca033ed38c187c5a81f58e6b7d\": container with ID starting with c4ac04f58cff4deda6adf78a5af0c30957b4f3ca033ed38c187c5a81f58e6b7d not found: ID does not exist" containerID="c4ac04f58cff4deda6adf78a5af0c30957b4f3ca033ed38c187c5a81f58e6b7d"
Jan 31 09:23:35 crc kubenswrapper[4732]: I0131 09:23:35.937909 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4ac04f58cff4deda6adf78a5af0c30957b4f3ca033ed38c187c5a81f58e6b7d"} err="failed to get container status \"c4ac04f58cff4deda6adf78a5af0c30957b4f3ca033ed38c187c5a81f58e6b7d\": rpc error: code = NotFound desc = could not find container \"c4ac04f58cff4deda6adf78a5af0c30957b4f3ca033ed38c187c5a81f58e6b7d\": container with ID starting with c4ac04f58cff4deda6adf78a5af0c30957b4f3ca033ed38c187c5a81f58e6b7d not found: ID does not exist"
Jan 31 09:23:35 crc kubenswrapper[4732]: I0131 09:23:35.937938 4732 scope.go:117] "RemoveContainer" containerID="33284c815fdcf867d64c94279b754de6fc3557337b45dc375a7c56112e2acde1"
Jan 31 09:23:35 crc kubenswrapper[4732]: I0131 09:23:35.955099 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/keystone-operator-index-hbpcb"]
Jan 31 09:23:35 crc kubenswrapper[4732]: I0131 09:23:35.966412 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-57cdd4758c-lh9cq"]
Jan 31 09:23:35 crc kubenswrapper[4732]: I0131 09:23:35.973751 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-57cdd4758c-lh9cq"]
Jan 31 09:23:36 crc kubenswrapper[4732]: I0131 09:23:36.550705 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00bc065b-6932-4b27-bd33-5d8618f8a4f1" path="/var/lib/kubelet/pods/00bc065b-6932-4b27-bd33-5d8618f8a4f1/volumes"
Jan 31 09:23:36 crc kubenswrapper[4732]: I0131 09:23:36.551801 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="241eae26-3908-40e0-af9c-59b54a6ab1a0" path="/var/lib/kubelet/pods/241eae26-3908-40e0-af9c-59b54a6ab1a0/volumes"
Jan 31 09:23:36 crc kubenswrapper[4732]: I0131 09:23:36.552300 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e617e130-c338-40d1-9a5c-83e925a4e6ed" path="/var/lib/kubelet/pods/e617e130-c338-40d1-9a5c-83e925a4e6ed/volumes"
Jan 31 09:23:36 crc kubenswrapper[4732]: I0131 09:23:36.918227 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-2t954"]
Jan 31 09:23:36 crc kubenswrapper[4732]: I0131 09:23:36.918451 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-2t954" podUID="79621d02-e834-4725-8b80-d0444f3b6487" containerName="operator" containerID="cri-o://288dedb19f0678112de5555bcdb82232ae9f8593518e9e520e8756a6802abe9b" gracePeriod=10
Jan 31 09:23:37 crc kubenswrapper[4732]: I0131 09:23:37.238217 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-hbtxq"]
Jan 31 09:23:37 crc kubenswrapper[4732]: I0131 09:23:37.238692 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/rabbitmq-cluster-operator-index-hbtxq" podUID="9ff1fd2b-a8cb-4050-9a54-3117be6964ce" containerName="registry-server" containerID="cri-o://fe9b798d0d4e674d4b68e134ab59cf425b1a6014efda517ffd63f1ccf40b11f0" gracePeriod=30
Jan 31 09:23:37 crc kubenswrapper[4732]: I0131 09:23:37.266123 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590sncrl"]
Jan 31 09:23:37 crc kubenswrapper[4732]: I0131 09:23:37.270599 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590sncrl"]
Jan 31 09:23:37 crc kubenswrapper[4732]: I0131 09:23:37.387311 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-2t954"
Jan 31 09:23:37 crc kubenswrapper[4732]: I0131 09:23:37.509172 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ztcb8\" (UniqueName: \"kubernetes.io/projected/79621d02-e834-4725-8b80-d0444f3b6487-kube-api-access-ztcb8\") pod \"79621d02-e834-4725-8b80-d0444f3b6487\" (UID: \"79621d02-e834-4725-8b80-d0444f3b6487\") "
Jan 31 09:23:37 crc kubenswrapper[4732]: I0131 09:23:37.528950 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79621d02-e834-4725-8b80-d0444f3b6487-kube-api-access-ztcb8" (OuterVolumeSpecName: "kube-api-access-ztcb8") pod "79621d02-e834-4725-8b80-d0444f3b6487" (UID: "79621d02-e834-4725-8b80-d0444f3b6487"). InnerVolumeSpecName "kube-api-access-ztcb8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 09:23:37 crc kubenswrapper[4732]: I0131 09:23:37.610511 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ztcb8\" (UniqueName: \"kubernetes.io/projected/79621d02-e834-4725-8b80-d0444f3b6487-kube-api-access-ztcb8\") on node \"crc\" DevicePath \"\""
Jan 31 09:23:37 crc kubenswrapper[4732]: I0131 09:23:37.630709 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-hbtxq"
Jan 31 09:23:37 crc kubenswrapper[4732]: I0131 09:23:37.812718 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tdnc7\" (UniqueName: \"kubernetes.io/projected/9ff1fd2b-a8cb-4050-9a54-3117be6964ce-kube-api-access-tdnc7\") pod \"9ff1fd2b-a8cb-4050-9a54-3117be6964ce\" (UID: \"9ff1fd2b-a8cb-4050-9a54-3117be6964ce\") "
Jan 31 09:23:37 crc kubenswrapper[4732]: I0131 09:23:37.815473 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ff1fd2b-a8cb-4050-9a54-3117be6964ce-kube-api-access-tdnc7" (OuterVolumeSpecName: "kube-api-access-tdnc7") pod "9ff1fd2b-a8cb-4050-9a54-3117be6964ce" (UID: "9ff1fd2b-a8cb-4050-9a54-3117be6964ce"). InnerVolumeSpecName "kube-api-access-tdnc7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 09:23:37 crc kubenswrapper[4732]: I0131 09:23:37.913856 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tdnc7\" (UniqueName: \"kubernetes.io/projected/9ff1fd2b-a8cb-4050-9a54-3117be6964ce-kube-api-access-tdnc7\") on node \"crc\" DevicePath \"\""
Jan 31 09:23:37 crc kubenswrapper[4732]: I0131 09:23:37.914022 4732 generic.go:334] "Generic (PLEG): container finished" podID="79621d02-e834-4725-8b80-d0444f3b6487" containerID="288dedb19f0678112de5555bcdb82232ae9f8593518e9e520e8756a6802abe9b" exitCode=0
Jan 31 09:23:37 crc kubenswrapper[4732]: I0131 09:23:37.914103 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-2t954"
Jan 31 09:23:37 crc kubenswrapper[4732]: I0131 09:23:37.914095 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-2t954" event={"ID":"79621d02-e834-4725-8b80-d0444f3b6487","Type":"ContainerDied","Data":"288dedb19f0678112de5555bcdb82232ae9f8593518e9e520e8756a6802abe9b"}
Jan 31 09:23:37 crc kubenswrapper[4732]: I0131 09:23:37.914142 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-2t954" event={"ID":"79621d02-e834-4725-8b80-d0444f3b6487","Type":"ContainerDied","Data":"2cb30d5ff86683ad564f64402e0e4a2144f56764496f3cb2ead909cfbd0f5de4"}
Jan 31 09:23:37 crc kubenswrapper[4732]: I0131 09:23:37.914159 4732 scope.go:117] "RemoveContainer" containerID="288dedb19f0678112de5555bcdb82232ae9f8593518e9e520e8756a6802abe9b"
Jan 31 09:23:37 crc kubenswrapper[4732]: I0131 09:23:37.924986 4732 generic.go:334] "Generic (PLEG): container finished" podID="9ff1fd2b-a8cb-4050-9a54-3117be6964ce" containerID="fe9b798d0d4e674d4b68e134ab59cf425b1a6014efda517ffd63f1ccf40b11f0" exitCode=0
Jan 31 09:23:37 crc kubenswrapper[4732]: I0131 09:23:37.925026 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-hbtxq" event={"ID":"9ff1fd2b-a8cb-4050-9a54-3117be6964ce","Type":"ContainerDied","Data":"fe9b798d0d4e674d4b68e134ab59cf425b1a6014efda517ffd63f1ccf40b11f0"}
Jan 31 09:23:37 crc kubenswrapper[4732]: I0131 09:23:37.925058 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-hbtxq" event={"ID":"9ff1fd2b-a8cb-4050-9a54-3117be6964ce","Type":"ContainerDied","Data":"5f27d35f0d0db33259c32fd8459b7826ea94c3c12989e06661b903a56351ab64"}
Jan 31 09:23:37 crc kubenswrapper[4732]: I0131 09:23:37.925054 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-hbtxq"
Jan 31 09:23:37 crc kubenswrapper[4732]: I0131 09:23:37.946575 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-2t954"]
Jan 31 09:23:37 crc kubenswrapper[4732]: I0131 09:23:37.958578 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-2t954"]
Jan 31 09:23:37 crc kubenswrapper[4732]: I0131 09:23:37.963771 4732 scope.go:117] "RemoveContainer" containerID="288dedb19f0678112de5555bcdb82232ae9f8593518e9e520e8756a6802abe9b"
Jan 31 09:23:37 crc kubenswrapper[4732]: E0131 09:23:37.964350 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"288dedb19f0678112de5555bcdb82232ae9f8593518e9e520e8756a6802abe9b\": container with ID starting with 288dedb19f0678112de5555bcdb82232ae9f8593518e9e520e8756a6802abe9b not found: ID does not exist" containerID="288dedb19f0678112de5555bcdb82232ae9f8593518e9e520e8756a6802abe9b"
Jan 31 09:23:37 crc kubenswrapper[4732]: I0131 09:23:37.964406 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"288dedb19f0678112de5555bcdb82232ae9f8593518e9e520e8756a6802abe9b"} err="failed to get container status \"288dedb19f0678112de5555bcdb82232ae9f8593518e9e520e8756a6802abe9b\": rpc error: code = NotFound desc = could not find container \"288dedb19f0678112de5555bcdb82232ae9f8593518e9e520e8756a6802abe9b\": container with ID starting with 288dedb19f0678112de5555bcdb82232ae9f8593518e9e520e8756a6802abe9b not found: ID does not exist"
Jan 31 09:23:37 crc kubenswrapper[4732]: I0131 09:23:37.964440 4732 scope.go:117] "RemoveContainer" containerID="fe9b798d0d4e674d4b68e134ab59cf425b1a6014efda517ffd63f1ccf40b11f0"
Jan 31 09:23:37 crc kubenswrapper[4732]: I0131 09:23:37.965810 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-hbtxq"]
Jan 31 09:23:37 crc kubenswrapper[4732]: I0131 09:23:37.973737 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-hbtxq"]
Jan 31 09:23:37 crc kubenswrapper[4732]: I0131 09:23:37.986568 4732 scope.go:117] "RemoveContainer" containerID="fe9b798d0d4e674d4b68e134ab59cf425b1a6014efda517ffd63f1ccf40b11f0"
Jan 31 09:23:37 crc kubenswrapper[4732]: E0131 09:23:37.987170 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe9b798d0d4e674d4b68e134ab59cf425b1a6014efda517ffd63f1ccf40b11f0\": container with ID starting with fe9b798d0d4e674d4b68e134ab59cf425b1a6014efda517ffd63f1ccf40b11f0 not found: ID does not exist" containerID="fe9b798d0d4e674d4b68e134ab59cf425b1a6014efda517ffd63f1ccf40b11f0"
Jan 31 09:23:37 crc kubenswrapper[4732]: I0131 09:23:37.987203 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe9b798d0d4e674d4b68e134ab59cf425b1a6014efda517ffd63f1ccf40b11f0"} err="failed to get container status \"fe9b798d0d4e674d4b68e134ab59cf425b1a6014efda517ffd63f1ccf40b11f0\": rpc error: code = NotFound desc = could not find container \"fe9b798d0d4e674d4b68e134ab59cf425b1a6014efda517ffd63f1ccf40b11f0\": container with ID starting with fe9b798d0d4e674d4b68e134ab59cf425b1a6014efda517ffd63f1ccf40b11f0 not found: ID does not exist"
Jan 31 09:23:38 crc kubenswrapper[4732]: I0131 09:23:38.551574 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79621d02-e834-4725-8b80-d0444f3b6487" path="/var/lib/kubelet/pods/79621d02-e834-4725-8b80-d0444f3b6487/volumes"
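The "Cleaned up orphaned pod volumes dir" entries throughout this log are kubelet housekeeping: directories under /var/lib/kubelet/pods whose pod UIDs are no longer active have their volumes subdirectory removed once it is safe. A simplified Go sketch; the emptiness check below stands in for the kubelet's more careful nothing-still-mounted verification:

    package main

    import (
    	"log"
    	"os"
    	"path/filepath"
    )

    // cleanupOrphanedPodDirs removes <root>/<podUID>/volumes for pod
    // UIDs that are no longer active. Only empty volumes directories
    // are removed here, which is a simplification of the real checks.
    func cleanupOrphanedPodDirs(root string, active map[string]bool) error {
    	entries, err := os.ReadDir(root)
    	if err != nil {
    		return err
    	}
    	for _, e := range entries {
    		if !e.IsDir() || active[e.Name()] {
    			continue
    		}
    		volDir := filepath.Join(root, e.Name(), "volumes")
    		if sub, err := os.ReadDir(volDir); err == nil && len(sub) == 0 {
    			if err := os.Remove(volDir); err == nil {
    				log.Printf("Cleaned up orphaned pod volumes dir podUID=%q path=%q", e.Name(), volDir)
    			}
    		}
    	}
    	return nil
    }

    func main() {
    	_ = cleanupOrphanedPodDirs("/var/lib/kubelet/pods", map[string]bool{})
    }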
Jan 31 09:23:38 crc kubenswrapper[4732]: I0131 09:23:38.552535 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86012593-15ec-4f3c-aaa4-c0522a918019" path="/var/lib/kubelet/pods/86012593-15ec-4f3c-aaa4-c0522a918019/volumes"
Jan 31 09:23:38 crc kubenswrapper[4732]: I0131 09:23:38.553303 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ff1fd2b-a8cb-4050-9a54-3117be6964ce" path="/var/lib/kubelet/pods/9ff1fd2b-a8cb-4050-9a54-3117be6964ce/volumes"
Jan 31 09:23:42 crc kubenswrapper[4732]: I0131 09:23:42.264686 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7f868546f6-qkfms"]
Jan 31 09:23:42 crc kubenswrapper[4732]: I0131 09:23:42.265228 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/infra-operator-controller-manager-7f868546f6-qkfms" podUID="415e013e-ff2f-47b6-a17d-c0ba8f80071a" containerName="manager" containerID="cri-o://b7c1c08d282d712f9e0a65331b190ff2ebed938312c6f27e081d2971e6619b22" gracePeriod=10
Jan 31 09:23:42 crc kubenswrapper[4732]: E0131 09:23:42.477158 4732 configmap.go:193] Couldn't get configMap swift-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found
Jan 31 09:23:42 crc kubenswrapper[4732]: E0131 09:23:42.477477 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1654407d-7276-4839-839d-1244759c4ad2-operator-scripts podName:1654407d-7276-4839-839d-1244759c4ad2 nodeName:}" failed. No retries permitted until 2026-01-31 09:23:58.477457971 +0000 UTC m=+1376.783334175 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/1654407d-7276-4839-839d-1244759c4ad2-operator-scripts") pod "keystone5b86-account-delete-qkt4p" (UID: "1654407d-7276-4839-839d-1244759c4ad2") : configmap "openstack-scripts" not found
Jan 31 09:23:42 crc kubenswrapper[4732]: I0131 09:23:42.551077 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-index-lbfxz"]
Jan 31 09:23:42 crc kubenswrapper[4732]: I0131 09:23:42.551305 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/infra-operator-index-lbfxz" podUID="0a4e6dfd-8108-4ccb-8f6b-0d81bb1adebf" containerName="registry-server" containerID="cri-o://910c1830295490a33aa5bee975620416b3ee2fad21058cc6bdfc3765079fd37f" gracePeriod=30
Jan 31 09:23:42 crc kubenswrapper[4732]: I0131 09:23:42.593209 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576bchhg"]
Jan 31 09:23:42 crc kubenswrapper[4732]: I0131 09:23:42.598965 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576bchhg"]
Jan 31 09:23:42 crc kubenswrapper[4732]: I0131 09:23:42.785068 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7f868546f6-qkfms"
Jan 31 09:23:42 crc kubenswrapper[4732]: I0131 09:23:42.883949 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/415e013e-ff2f-47b6-a17d-c0ba8f80071a-apiservice-cert\") pod \"415e013e-ff2f-47b6-a17d-c0ba8f80071a\" (UID: \"415e013e-ff2f-47b6-a17d-c0ba8f80071a\") "
Jan 31 09:23:42 crc kubenswrapper[4732]: I0131 09:23:42.884093 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/415e013e-ff2f-47b6-a17d-c0ba8f80071a-webhook-cert\") pod \"415e013e-ff2f-47b6-a17d-c0ba8f80071a\" (UID: \"415e013e-ff2f-47b6-a17d-c0ba8f80071a\") "
Jan 31 09:23:42 crc kubenswrapper[4732]: I0131 09:23:42.884317 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vp7fg\" (UniqueName: \"kubernetes.io/projected/415e013e-ff2f-47b6-a17d-c0ba8f80071a-kube-api-access-vp7fg\") pod \"415e013e-ff2f-47b6-a17d-c0ba8f80071a\" (UID: \"415e013e-ff2f-47b6-a17d-c0ba8f80071a\") "
Jan 31 09:23:42 crc kubenswrapper[4732]: I0131 09:23:42.895890 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/415e013e-ff2f-47b6-a17d-c0ba8f80071a-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "415e013e-ff2f-47b6-a17d-c0ba8f80071a" (UID: "415e013e-ff2f-47b6-a17d-c0ba8f80071a"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 09:23:42 crc kubenswrapper[4732]: I0131 09:23:42.895932 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/415e013e-ff2f-47b6-a17d-c0ba8f80071a-kube-api-access-vp7fg" (OuterVolumeSpecName: "kube-api-access-vp7fg") pod "415e013e-ff2f-47b6-a17d-c0ba8f80071a" (UID: "415e013e-ff2f-47b6-a17d-c0ba8f80071a"). InnerVolumeSpecName "kube-api-access-vp7fg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 09:23:42 crc kubenswrapper[4732]: I0131 09:23:42.896465 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/415e013e-ff2f-47b6-a17d-c0ba8f80071a-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "415e013e-ff2f-47b6-a17d-c0ba8f80071a" (UID: "415e013e-ff2f-47b6-a17d-c0ba8f80071a"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 09:23:42 crc kubenswrapper[4732]: I0131 09:23:42.924482 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-lbfxz"
Jan 31 09:23:42 crc kubenswrapper[4732]: I0131 09:23:42.984194 4732 generic.go:334] "Generic (PLEG): container finished" podID="415e013e-ff2f-47b6-a17d-c0ba8f80071a" containerID="b7c1c08d282d712f9e0a65331b190ff2ebed938312c6f27e081d2971e6619b22" exitCode=0
Jan 31 09:23:42 crc kubenswrapper[4732]: I0131 09:23:42.984250 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7f868546f6-qkfms" event={"ID":"415e013e-ff2f-47b6-a17d-c0ba8f80071a","Type":"ContainerDied","Data":"b7c1c08d282d712f9e0a65331b190ff2ebed938312c6f27e081d2971e6619b22"}
Jan 31 09:23:42 crc kubenswrapper[4732]: I0131 09:23:42.984288 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7f868546f6-qkfms" event={"ID":"415e013e-ff2f-47b6-a17d-c0ba8f80071a","Type":"ContainerDied","Data":"ba8a3b878f56e9627430ea250d8fbd2f4a60f48d4ce391e390485cb7a6931e6c"}
Jan 31 09:23:42 crc kubenswrapper[4732]: I0131 09:23:42.984305 4732 scope.go:117] "RemoveContainer" containerID="b7c1c08d282d712f9e0a65331b190ff2ebed938312c6f27e081d2971e6619b22"
Jan 31 09:23:42 crc kubenswrapper[4732]: I0131 09:23:42.984407 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7f868546f6-qkfms"
Jan 31 09:23:42 crc kubenswrapper[4732]: I0131 09:23:42.986137 4732 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/415e013e-ff2f-47b6-a17d-c0ba8f80071a-apiservice-cert\") on node \"crc\" DevicePath \"\""
Jan 31 09:23:42 crc kubenswrapper[4732]: I0131 09:23:42.986165 4732 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/415e013e-ff2f-47b6-a17d-c0ba8f80071a-webhook-cert\") on node \"crc\" DevicePath \"\""
Jan 31 09:23:42 crc kubenswrapper[4732]: I0131 09:23:42.986178 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vp7fg\" (UniqueName: \"kubernetes.io/projected/415e013e-ff2f-47b6-a17d-c0ba8f80071a-kube-api-access-vp7fg\") on node \"crc\" DevicePath \"\""
Jan 31 09:23:42 crc kubenswrapper[4732]: I0131 09:23:42.988552 4732 generic.go:334] "Generic (PLEG): container finished" podID="0a4e6dfd-8108-4ccb-8f6b-0d81bb1adebf" containerID="910c1830295490a33aa5bee975620416b3ee2fad21058cc6bdfc3765079fd37f" exitCode=0
Jan 31 09:23:42 crc kubenswrapper[4732]: I0131 09:23:42.988592 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-lbfxz" event={"ID":"0a4e6dfd-8108-4ccb-8f6b-0d81bb1adebf","Type":"ContainerDied","Data":"910c1830295490a33aa5bee975620416b3ee2fad21058cc6bdfc3765079fd37f"}
Jan 31 09:23:42 crc kubenswrapper[4732]: I0131 09:23:42.988622 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-lbfxz" event={"ID":"0a4e6dfd-8108-4ccb-8f6b-0d81bb1adebf","Type":"ContainerDied","Data":"28d499c0e0c3ca32837e0d3a623958320049a25bd1e23f61b0a33ef2f6ce6116"}
Jan 31 09:23:42 crc kubenswrapper[4732]: I0131 09:23:42.988624 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-lbfxz"
Jan 31 09:23:43 crc kubenswrapper[4732]: I0131 09:23:43.016180 4732 scope.go:117] "RemoveContainer" containerID="b7c1c08d282d712f9e0a65331b190ff2ebed938312c6f27e081d2971e6619b22"
Jan 31 09:23:43 crc kubenswrapper[4732]: I0131 09:23:43.017155 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7f868546f6-qkfms"]
Jan 31 09:23:43 crc kubenswrapper[4732]: E0131 09:23:43.017204 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7c1c08d282d712f9e0a65331b190ff2ebed938312c6f27e081d2971e6619b22\": container with ID starting with b7c1c08d282d712f9e0a65331b190ff2ebed938312c6f27e081d2971e6619b22 not found: ID does not exist" containerID="b7c1c08d282d712f9e0a65331b190ff2ebed938312c6f27e081d2971e6619b22"
Jan 31 09:23:43 crc kubenswrapper[4732]: I0131 09:23:43.017280 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7c1c08d282d712f9e0a65331b190ff2ebed938312c6f27e081d2971e6619b22"} err="failed to get container status \"b7c1c08d282d712f9e0a65331b190ff2ebed938312c6f27e081d2971e6619b22\": rpc error: code = NotFound desc = could not find container \"b7c1c08d282d712f9e0a65331b190ff2ebed938312c6f27e081d2971e6619b22\": container with ID starting with b7c1c08d282d712f9e0a65331b190ff2ebed938312c6f27e081d2971e6619b22 not found: ID does not exist"
Jan 31 09:23:43 crc kubenswrapper[4732]: I0131 09:23:43.017313 4732 scope.go:117] "RemoveContainer" containerID="910c1830295490a33aa5bee975620416b3ee2fad21058cc6bdfc3765079fd37f"
Jan 31 09:23:43 crc kubenswrapper[4732]: I0131 09:23:43.022390 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7f868546f6-qkfms"]
Jan 31 09:23:43 crc kubenswrapper[4732]: I0131 09:23:43.038100 4732 scope.go:117] "RemoveContainer" containerID="910c1830295490a33aa5bee975620416b3ee2fad21058cc6bdfc3765079fd37f"
Jan 31 09:23:43 crc kubenswrapper[4732]: E0131 09:23:43.038473 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"910c1830295490a33aa5bee975620416b3ee2fad21058cc6bdfc3765079fd37f\": container with ID starting with 910c1830295490a33aa5bee975620416b3ee2fad21058cc6bdfc3765079fd37f not found: ID does not exist" containerID="910c1830295490a33aa5bee975620416b3ee2fad21058cc6bdfc3765079fd37f"
Jan 31 09:23:43 crc kubenswrapper[4732]: I0131 09:23:43.038497 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"910c1830295490a33aa5bee975620416b3ee2fad21058cc6bdfc3765079fd37f"} err="failed to get container status \"910c1830295490a33aa5bee975620416b3ee2fad21058cc6bdfc3765079fd37f\": rpc error: code = NotFound desc = could not find container \"910c1830295490a33aa5bee975620416b3ee2fad21058cc6bdfc3765079fd37f\": container with ID starting with 910c1830295490a33aa5bee975620416b3ee2fad21058cc6bdfc3765079fd37f not found: ID does not exist"
Jan 31 09:23:43 crc kubenswrapper[4732]: I0131 09:23:43.089308 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jgj7k\" (UniqueName: \"kubernetes.io/projected/0a4e6dfd-8108-4ccb-8f6b-0d81bb1adebf-kube-api-access-jgj7k\") pod \"0a4e6dfd-8108-4ccb-8f6b-0d81bb1adebf\" (UID: \"0a4e6dfd-8108-4ccb-8f6b-0d81bb1adebf\") "
Jan 31 09:23:43 crc kubenswrapper[4732]: I0131 09:23:43.093529
4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a4e6dfd-8108-4ccb-8f6b-0d81bb1adebf-kube-api-access-jgj7k" (OuterVolumeSpecName: "kube-api-access-jgj7k") pod "0a4e6dfd-8108-4ccb-8f6b-0d81bb1adebf" (UID: "0a4e6dfd-8108-4ccb-8f6b-0d81bb1adebf"). InnerVolumeSpecName "kube-api-access-jgj7k". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:23:43 crc kubenswrapper[4732]: I0131 09:23:43.190729 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jgj7k\" (UniqueName: \"kubernetes.io/projected/0a4e6dfd-8108-4ccb-8f6b-0d81bb1adebf-kube-api-access-jgj7k\") on node \"crc\" DevicePath \"\"" Jan 31 09:23:43 crc kubenswrapper[4732]: I0131 09:23:43.319706 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-index-lbfxz"] Jan 31 09:23:43 crc kubenswrapper[4732]: I0131 09:23:43.332466 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/infra-operator-index-lbfxz"] Jan 31 09:23:43 crc kubenswrapper[4732]: I0131 09:23:43.907069 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-665d897fbd-rcqsq"] Jan 31 09:23:43 crc kubenswrapper[4732]: I0131 09:23:43.907617 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/mariadb-operator-controller-manager-665d897fbd-rcqsq" podUID="088a3743-a071-4b0e-9cd8-66271eaeafdb" containerName="manager" containerID="cri-o://8d293f0ee732a310f5bc8795228ce580921819546f87301342b14a14ba38ea62" gracePeriod=10 Jan 31 09:23:44 crc kubenswrapper[4732]: I0131 09:23:44.095713 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-b545g"] Jan 31 09:23:44 crc kubenswrapper[4732]: I0131 09:23:44.095943 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/mariadb-operator-index-b545g" podUID="e502d767-ed18-4540-8d2c-ffd993e4822d" containerName="registry-server" containerID="cri-o://e0a2d107fb0ab5da52d563b9ed97ddc9f7ce9c51dfb332362a03b85f4c2acfe7" gracePeriod=30 Jan 31 09:23:44 crc kubenswrapper[4732]: I0131 09:23:44.130943 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f40gl664"] Jan 31 09:23:44 crc kubenswrapper[4732]: I0131 09:23:44.135886 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f40gl664"] Jan 31 09:23:44 crc kubenswrapper[4732]: I0131 09:23:44.337375 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-665d897fbd-rcqsq" Jan 31 09:23:44 crc kubenswrapper[4732]: I0131 09:23:44.510015 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jb9w\" (UniqueName: \"kubernetes.io/projected/088a3743-a071-4b0e-9cd8-66271eaeafdb-kube-api-access-9jb9w\") pod \"088a3743-a071-4b0e-9cd8-66271eaeafdb\" (UID: \"088a3743-a071-4b0e-9cd8-66271eaeafdb\") " Jan 31 09:23:44 crc kubenswrapper[4732]: I0131 09:23:44.510078 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/088a3743-a071-4b0e-9cd8-66271eaeafdb-apiservice-cert\") pod \"088a3743-a071-4b0e-9cd8-66271eaeafdb\" (UID: \"088a3743-a071-4b0e-9cd8-66271eaeafdb\") " Jan 31 09:23:44 crc kubenswrapper[4732]: I0131 09:23:44.510156 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/088a3743-a071-4b0e-9cd8-66271eaeafdb-webhook-cert\") pod \"088a3743-a071-4b0e-9cd8-66271eaeafdb\" (UID: \"088a3743-a071-4b0e-9cd8-66271eaeafdb\") " Jan 31 09:23:44 crc kubenswrapper[4732]: I0131 09:23:44.515455 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/088a3743-a071-4b0e-9cd8-66271eaeafdb-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "088a3743-a071-4b0e-9cd8-66271eaeafdb" (UID: "088a3743-a071-4b0e-9cd8-66271eaeafdb"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:23:44 crc kubenswrapper[4732]: I0131 09:23:44.515592 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/088a3743-a071-4b0e-9cd8-66271eaeafdb-kube-api-access-9jb9w" (OuterVolumeSpecName: "kube-api-access-9jb9w") pod "088a3743-a071-4b0e-9cd8-66271eaeafdb" (UID: "088a3743-a071-4b0e-9cd8-66271eaeafdb"). InnerVolumeSpecName "kube-api-access-9jb9w". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:23:44 crc kubenswrapper[4732]: I0131 09:23:44.527432 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/088a3743-a071-4b0e-9cd8-66271eaeafdb-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "088a3743-a071-4b0e-9cd8-66271eaeafdb" (UID: "088a3743-a071-4b0e-9cd8-66271eaeafdb"). InnerVolumeSpecName "webhook-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 09:23:44 crc kubenswrapper[4732]: I0131 09:23:44.552964 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a4e6dfd-8108-4ccb-8f6b-0d81bb1adebf" path="/var/lib/kubelet/pods/0a4e6dfd-8108-4ccb-8f6b-0d81bb1adebf/volumes" Jan 31 09:23:44 crc kubenswrapper[4732]: I0131 09:23:44.553583 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2996ae7b-aedb-4a67-a98e-b1a466347be0" path="/var/lib/kubelet/pods/2996ae7b-aedb-4a67-a98e-b1a466347be0/volumes" Jan 31 09:23:44 crc kubenswrapper[4732]: I0131 09:23:44.554384 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="415e013e-ff2f-47b6-a17d-c0ba8f80071a" path="/var/lib/kubelet/pods/415e013e-ff2f-47b6-a17d-c0ba8f80071a/volumes" Jan 31 09:23:44 crc kubenswrapper[4732]: I0131 09:23:44.555722 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5ae42fb-86af-4a2d-9570-b3be9f3f8f4a" path="/var/lib/kubelet/pods/f5ae42fb-86af-4a2d-9570-b3be9f3f8f4a/volumes" Jan 31 09:23:44 crc kubenswrapper[4732]: I0131 09:23:44.564827 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-b545g" Jan 31 09:23:44 crc kubenswrapper[4732]: I0131 09:23:44.611552 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jb9w\" (UniqueName: \"kubernetes.io/projected/088a3743-a071-4b0e-9cd8-66271eaeafdb-kube-api-access-9jb9w\") on node \"crc\" DevicePath \"\"" Jan 31 09:23:44 crc kubenswrapper[4732]: I0131 09:23:44.611836 4732 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/088a3743-a071-4b0e-9cd8-66271eaeafdb-apiservice-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:23:44 crc kubenswrapper[4732]: I0131 09:23:44.611925 4732 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/088a3743-a071-4b0e-9cd8-66271eaeafdb-webhook-cert\") on node \"crc\" DevicePath \"\"" Jan 31 09:23:44 crc kubenswrapper[4732]: I0131 09:23:44.712369 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jl5c8\" (UniqueName: \"kubernetes.io/projected/e502d767-ed18-4540-8d2c-ffd993e4822d-kube-api-access-jl5c8\") pod \"e502d767-ed18-4540-8d2c-ffd993e4822d\" (UID: \"e502d767-ed18-4540-8d2c-ffd993e4822d\") " Jan 31 09:23:44 crc kubenswrapper[4732]: I0131 09:23:44.718877 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e502d767-ed18-4540-8d2c-ffd993e4822d-kube-api-access-jl5c8" (OuterVolumeSpecName: "kube-api-access-jl5c8") pod "e502d767-ed18-4540-8d2c-ffd993e4822d" (UID: "e502d767-ed18-4540-8d2c-ffd993e4822d"). InnerVolumeSpecName "kube-api-access-jl5c8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:23:44 crc kubenswrapper[4732]: I0131 09:23:44.814419 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jl5c8\" (UniqueName: \"kubernetes.io/projected/e502d767-ed18-4540-8d2c-ffd993e4822d-kube-api-access-jl5c8\") on node \"crc\" DevicePath \"\"" Jan 31 09:23:45 crc kubenswrapper[4732]: I0131 09:23:45.009411 4732 generic.go:334] "Generic (PLEG): container finished" podID="088a3743-a071-4b0e-9cd8-66271eaeafdb" containerID="8d293f0ee732a310f5bc8795228ce580921819546f87301342b14a14ba38ea62" exitCode=0 Jan 31 09:23:45 crc kubenswrapper[4732]: I0131 09:23:45.009493 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-665d897fbd-rcqsq" event={"ID":"088a3743-a071-4b0e-9cd8-66271eaeafdb","Type":"ContainerDied","Data":"8d293f0ee732a310f5bc8795228ce580921819546f87301342b14a14ba38ea62"} Jan 31 09:23:45 crc kubenswrapper[4732]: I0131 09:23:45.009526 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-665d897fbd-rcqsq" event={"ID":"088a3743-a071-4b0e-9cd8-66271eaeafdb","Type":"ContainerDied","Data":"afc983ab3b957756283c86e705571f21643e9da88c0bed95193b050d0c42bb03"} Jan 31 09:23:45 crc kubenswrapper[4732]: I0131 09:23:45.009576 4732 scope.go:117] "RemoveContainer" containerID="8d293f0ee732a310f5bc8795228ce580921819546f87301342b14a14ba38ea62" Jan 31 09:23:45 crc kubenswrapper[4732]: I0131 09:23:45.009771 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-665d897fbd-rcqsq" Jan 31 09:23:45 crc kubenswrapper[4732]: I0131 09:23:45.014182 4732 generic.go:334] "Generic (PLEG): container finished" podID="e502d767-ed18-4540-8d2c-ffd993e4822d" containerID="e0a2d107fb0ab5da52d563b9ed97ddc9f7ce9c51dfb332362a03b85f4c2acfe7" exitCode=0 Jan 31 09:23:45 crc kubenswrapper[4732]: I0131 09:23:45.014231 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-b545g" event={"ID":"e502d767-ed18-4540-8d2c-ffd993e4822d","Type":"ContainerDied","Data":"e0a2d107fb0ab5da52d563b9ed97ddc9f7ce9c51dfb332362a03b85f4c2acfe7"} Jan 31 09:23:45 crc kubenswrapper[4732]: I0131 09:23:45.014263 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-b545g" event={"ID":"e502d767-ed18-4540-8d2c-ffd993e4822d","Type":"ContainerDied","Data":"e336ec36e7b92b98ffa9cd023af81299e0d2a21836b58f3096392afd27981f20"} Jan 31 09:23:45 crc kubenswrapper[4732]: I0131 09:23:45.014275 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-index-b545g" Jan 31 09:23:45 crc kubenswrapper[4732]: I0131 09:23:45.032306 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-665d897fbd-rcqsq"] Jan 31 09:23:45 crc kubenswrapper[4732]: I0131 09:23:45.039067 4732 scope.go:117] "RemoveContainer" containerID="8d293f0ee732a310f5bc8795228ce580921819546f87301342b14a14ba38ea62" Jan 31 09:23:45 crc kubenswrapper[4732]: E0131 09:23:45.039679 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d293f0ee732a310f5bc8795228ce580921819546f87301342b14a14ba38ea62\": container with ID starting with 8d293f0ee732a310f5bc8795228ce580921819546f87301342b14a14ba38ea62 not found: ID does not exist" containerID="8d293f0ee732a310f5bc8795228ce580921819546f87301342b14a14ba38ea62" Jan 31 09:23:45 crc kubenswrapper[4732]: I0131 09:23:45.039711 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-665d897fbd-rcqsq"] Jan 31 09:23:45 crc kubenswrapper[4732]: I0131 09:23:45.039716 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d293f0ee732a310f5bc8795228ce580921819546f87301342b14a14ba38ea62"} err="failed to get container status \"8d293f0ee732a310f5bc8795228ce580921819546f87301342b14a14ba38ea62\": rpc error: code = NotFound desc = could not find container \"8d293f0ee732a310f5bc8795228ce580921819546f87301342b14a14ba38ea62\": container with ID starting with 8d293f0ee732a310f5bc8795228ce580921819546f87301342b14a14ba38ea62 not found: ID does not exist" Jan 31 09:23:45 crc kubenswrapper[4732]: I0131 09:23:45.039816 4732 scope.go:117] "RemoveContainer" containerID="e0a2d107fb0ab5da52d563b9ed97ddc9f7ce9c51dfb332362a03b85f4c2acfe7" Jan 31 09:23:45 crc kubenswrapper[4732]: I0131 09:23:45.061874 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-b545g"] Jan 31 09:23:45 crc kubenswrapper[4732]: I0131 09:23:45.065155 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/mariadb-operator-index-b545g"] Jan 31 09:23:45 crc kubenswrapper[4732]: I0131 09:23:45.068202 4732 scope.go:117] "RemoveContainer" containerID="e0a2d107fb0ab5da52d563b9ed97ddc9f7ce9c51dfb332362a03b85f4c2acfe7" Jan 31 09:23:45 crc kubenswrapper[4732]: E0131 09:23:45.068648 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0a2d107fb0ab5da52d563b9ed97ddc9f7ce9c51dfb332362a03b85f4c2acfe7\": container with ID starting with e0a2d107fb0ab5da52d563b9ed97ddc9f7ce9c51dfb332362a03b85f4c2acfe7 not found: ID does not exist" containerID="e0a2d107fb0ab5da52d563b9ed97ddc9f7ce9c51dfb332362a03b85f4c2acfe7" Jan 31 09:23:45 crc kubenswrapper[4732]: I0131 09:23:45.068704 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0a2d107fb0ab5da52d563b9ed97ddc9f7ce9c51dfb332362a03b85f4c2acfe7"} err="failed to get container status \"e0a2d107fb0ab5da52d563b9ed97ddc9f7ce9c51dfb332362a03b85f4c2acfe7\": rpc error: code = NotFound desc = could not find container \"e0a2d107fb0ab5da52d563b9ed97ddc9f7ce9c51dfb332362a03b85f4c2acfe7\": container with ID starting with e0a2d107fb0ab5da52d563b9ed97ddc9f7ce9c51dfb332362a03b85f4c2acfe7 not found: ID does not exist" Jan 31 09:23:46 crc kubenswrapper[4732]: I0131 09:23:46.551137 4732 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="088a3743-a071-4b0e-9cd8-66271eaeafdb" path="/var/lib/kubelet/pods/088a3743-a071-4b0e-9cd8-66271eaeafdb/volumes" Jan 31 09:23:46 crc kubenswrapper[4732]: I0131 09:23:46.552202 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e502d767-ed18-4540-8d2c-ffd993e4822d" path="/var/lib/kubelet/pods/e502d767-ed18-4540-8d2c-ffd993e4822d/volumes" Jan 31 09:23:58 crc kubenswrapper[4732]: E0131 09:23:58.511245 4732 configmap.go:193] Couldn't get configMap swift-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 31 09:23:58 crc kubenswrapper[4732]: E0131 09:23:58.512025 4732 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1654407d-7276-4839-839d-1244759c4ad2-operator-scripts podName:1654407d-7276-4839-839d-1244759c4ad2 nodeName:}" failed. No retries permitted until 2026-01-31 09:24:30.51199475 +0000 UTC m=+1408.817870984 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/1654407d-7276-4839-839d-1244759c4ad2-operator-scripts") pod "keystone5b86-account-delete-qkt4p" (UID: "1654407d-7276-4839-839d-1244759c4ad2") : configmap "openstack-scripts" not found Jan 31 09:23:58 crc kubenswrapper[4732]: I0131 09:23:58.576775 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-4v4x4/must-gather-8vtn6"] Jan 31 09:23:58 crc kubenswrapper[4732]: E0131 09:23:58.577193 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd" containerName="setup-container" Jan 31 09:23:58 crc kubenswrapper[4732]: I0131 09:23:58.577214 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd" containerName="setup-container" Jan 31 09:23:58 crc kubenswrapper[4732]: E0131 09:23:58.577240 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7eb0179-b292-4a09-a07d-3d9bfe7978f3" containerName="mysql-bootstrap" Jan 31 09:23:58 crc kubenswrapper[4732]: I0131 09:23:58.577250 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7eb0179-b292-4a09-a07d-3d9bfe7978f3" containerName="mysql-bootstrap" Jan 31 09:23:58 crc kubenswrapper[4732]: E0131 09:23:58.577270 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0682a582-79d6-4286-9a43-e4a258dde73f" containerName="galera" Jan 31 09:23:58 crc kubenswrapper[4732]: I0131 09:23:58.577279 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="0682a582-79d6-4286-9a43-e4a258dde73f" containerName="galera" Jan 31 09:23:58 crc kubenswrapper[4732]: E0131 09:23:58.577307 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ff1fd2b-a8cb-4050-9a54-3117be6964ce" containerName="registry-server" Jan 31 09:23:58 crc kubenswrapper[4732]: I0131 09:23:58.577316 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ff1fd2b-a8cb-4050-9a54-3117be6964ce" containerName="registry-server" Jan 31 09:23:58 crc kubenswrapper[4732]: E0131 09:23:58.577333 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e617e130-c338-40d1-9a5c-83e925a4e6ed" containerName="registry-server" Jan 31 09:23:58 crc kubenswrapper[4732]: I0131 09:23:58.577341 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="e617e130-c338-40d1-9a5c-83e925a4e6ed" containerName="registry-server" Jan 31 09:23:58 crc kubenswrapper[4732]: E0131 09:23:58.577354 4732 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e502d767-ed18-4540-8d2c-ffd993e4822d" containerName="registry-server" Jan 31 09:23:58 crc kubenswrapper[4732]: I0131 09:23:58.577362 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="e502d767-ed18-4540-8d2c-ffd993e4822d" containerName="registry-server" Jan 31 09:23:58 crc kubenswrapper[4732]: E0131 09:23:58.577381 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd" containerName="rabbitmq" Jan 31 09:23:58 crc kubenswrapper[4732]: I0131 09:23:58.577390 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd" containerName="rabbitmq" Jan 31 09:23:58 crc kubenswrapper[4732]: E0131 09:23:58.577415 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3b1cc40-9985-45d8-bb06-0676ff188c6c" containerName="barbican-worker-log" Jan 31 09:23:58 crc kubenswrapper[4732]: I0131 09:23:58.577424 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3b1cc40-9985-45d8-bb06-0676ff188c6c" containerName="barbican-worker-log" Jan 31 09:23:58 crc kubenswrapper[4732]: E0131 09:23:58.577442 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="241eae26-3908-40e0-af9c-59b54a6ab1a0" containerName="manager" Jan 31 09:23:58 crc kubenswrapper[4732]: I0131 09:23:58.577450 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="241eae26-3908-40e0-af9c-59b54a6ab1a0" containerName="manager" Jan 31 09:23:58 crc kubenswrapper[4732]: E0131 09:23:58.577471 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24dbbc5e-c4aa-4f6a-b6e6-52b3013443cf" containerName="registry-server" Jan 31 09:23:58 crc kubenswrapper[4732]: I0131 09:23:58.577479 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="24dbbc5e-c4aa-4f6a-b6e6-52b3013443cf" containerName="registry-server" Jan 31 09:23:58 crc kubenswrapper[4732]: E0131 09:23:58.577493 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a4e6dfd-8108-4ccb-8f6b-0d81bb1adebf" containerName="registry-server" Jan 31 09:23:58 crc kubenswrapper[4732]: I0131 09:23:58.577502 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a4e6dfd-8108-4ccb-8f6b-0d81bb1adebf" containerName="registry-server" Jan 31 09:23:58 crc kubenswrapper[4732]: E0131 09:23:58.577521 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0d4fa62-a33c-4ab2-a446-697994c1541e" containerName="memcached" Jan 31 09:23:58 crc kubenswrapper[4732]: I0131 09:23:58.577550 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0d4fa62-a33c-4ab2-a446-697994c1541e" containerName="memcached" Jan 31 09:23:58 crc kubenswrapper[4732]: E0131 09:23:58.577578 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7eb0179-b292-4a09-a07d-3d9bfe7978f3" containerName="galera" Jan 31 09:23:58 crc kubenswrapper[4732]: I0131 09:23:58.577587 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7eb0179-b292-4a09-a07d-3d9bfe7978f3" containerName="galera" Jan 31 09:23:58 crc kubenswrapper[4732]: E0131 09:23:58.577603 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6b3a350-8b41-44a0-a6b4-b957947e1df6" containerName="registry-server" Jan 31 09:23:58 crc kubenswrapper[4732]: I0131 09:23:58.577611 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6b3a350-8b41-44a0-a6b4-b957947e1df6" containerName="registry-server" Jan 31 09:23:58 crc kubenswrapper[4732]: E0131 09:23:58.577621 4732 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e5d550c8-968a-4962-9e23-c0c22911913d" containerName="mariadb-account-delete" Jan 31 09:23:58 crc kubenswrapper[4732]: I0131 09:23:58.577629 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5d550c8-968a-4962-9e23-c0c22911913d" containerName="mariadb-account-delete" Jan 31 09:23:58 crc kubenswrapper[4732]: E0131 09:23:58.577649 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ef2e0aa-0782-4814-9d5f-9a6a32fb121b" containerName="manager" Jan 31 09:23:58 crc kubenswrapper[4732]: I0131 09:23:58.577657 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ef2e0aa-0782-4814-9d5f-9a6a32fb121b" containerName="manager" Jan 31 09:23:58 crc kubenswrapper[4732]: E0131 09:23:58.577695 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="415e013e-ff2f-47b6-a17d-c0ba8f80071a" containerName="manager" Jan 31 09:23:58 crc kubenswrapper[4732]: I0131 09:23:58.577708 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="415e013e-ff2f-47b6-a17d-c0ba8f80071a" containerName="manager" Jan 31 09:23:58 crc kubenswrapper[4732]: E0131 09:23:58.577726 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4981d9a9-898f-49ff-809d-58c7ca3bd2a3" containerName="barbican-api-log" Jan 31 09:23:58 crc kubenswrapper[4732]: I0131 09:23:58.577736 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="4981d9a9-898f-49ff-809d-58c7ca3bd2a3" containerName="barbican-api-log" Jan 31 09:23:58 crc kubenswrapper[4732]: E0131 09:23:58.577757 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="616eedfe-830a-4ca8-9c42-a2cfd9352312" containerName="galera" Jan 31 09:23:58 crc kubenswrapper[4732]: I0131 09:23:58.577766 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="616eedfe-830a-4ca8-9c42-a2cfd9352312" containerName="galera" Jan 31 09:23:58 crc kubenswrapper[4732]: E0131 09:23:58.577785 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3b1cc40-9985-45d8-bb06-0676ff188c6c" containerName="barbican-worker" Jan 31 09:23:58 crc kubenswrapper[4732]: I0131 09:23:58.577794 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3b1cc40-9985-45d8-bb06-0676ff188c6c" containerName="barbican-worker" Jan 31 09:23:58 crc kubenswrapper[4732]: E0131 09:23:58.577817 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ee84530-efd7-4d83-9aa2-fb9b8b178496" containerName="keystone-api" Jan 31 09:23:58 crc kubenswrapper[4732]: I0131 09:23:58.577826 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ee84530-efd7-4d83-9aa2-fb9b8b178496" containerName="keystone-api" Jan 31 09:23:58 crc kubenswrapper[4732]: E0131 09:23:58.577844 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="088a3743-a071-4b0e-9cd8-66271eaeafdb" containerName="manager" Jan 31 09:23:58 crc kubenswrapper[4732]: I0131 09:23:58.577853 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="088a3743-a071-4b0e-9cd8-66271eaeafdb" containerName="manager" Jan 31 09:23:58 crc kubenswrapper[4732]: E0131 09:23:58.577863 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17f93e90-1e9a-439c-a130-487ebf54ad10" containerName="manager" Jan 31 09:23:58 crc kubenswrapper[4732]: I0131 09:23:58.577871 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="17f93e90-1e9a-439c-a130-487ebf54ad10" containerName="manager" Jan 31 09:23:58 crc kubenswrapper[4732]: E0131 09:23:58.577894 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="616eedfe-830a-4ca8-9c42-a2cfd9352312" 
containerName="mysql-bootstrap" Jan 31 09:23:58 crc kubenswrapper[4732]: I0131 09:23:58.577902 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="616eedfe-830a-4ca8-9c42-a2cfd9352312" containerName="mysql-bootstrap" Jan 31 09:23:58 crc kubenswrapper[4732]: E0131 09:23:58.577920 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4981d9a9-898f-49ff-809d-58c7ca3bd2a3" containerName="barbican-api" Jan 31 09:23:58 crc kubenswrapper[4732]: I0131 09:23:58.577930 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="4981d9a9-898f-49ff-809d-58c7ca3bd2a3" containerName="barbican-api" Jan 31 09:23:58 crc kubenswrapper[4732]: E0131 09:23:58.577947 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0682a582-79d6-4286-9a43-e4a258dde73f" containerName="mysql-bootstrap" Jan 31 09:23:58 crc kubenswrapper[4732]: I0131 09:23:58.577955 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="0682a582-79d6-4286-9a43-e4a258dde73f" containerName="mysql-bootstrap" Jan 31 09:23:58 crc kubenswrapper[4732]: E0131 09:23:58.577975 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79621d02-e834-4725-8b80-d0444f3b6487" containerName="operator" Jan 31 09:23:58 crc kubenswrapper[4732]: I0131 09:23:58.577988 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="79621d02-e834-4725-8b80-d0444f3b6487" containerName="operator" Jan 31 09:23:58 crc kubenswrapper[4732]: I0131 09:23:58.578217 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3b1cc40-9985-45d8-bb06-0676ff188c6c" containerName="barbican-worker-log" Jan 31 09:23:58 crc kubenswrapper[4732]: I0131 09:23:58.578229 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6b3a350-8b41-44a0-a6b4-b957947e1df6" containerName="registry-server" Jan 31 09:23:58 crc kubenswrapper[4732]: I0131 09:23:58.578238 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a4e6dfd-8108-4ccb-8f6b-0d81bb1adebf" containerName="registry-server" Jan 31 09:23:58 crc kubenswrapper[4732]: I0131 09:23:58.578249 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7eb0179-b292-4a09-a07d-3d9bfe7978f3" containerName="galera" Jan 31 09:23:58 crc kubenswrapper[4732]: I0131 09:23:58.578263 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0d4fa62-a33c-4ab2-a446-697994c1541e" containerName="memcached" Jan 31 09:23:58 crc kubenswrapper[4732]: I0131 09:23:58.578282 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5d550c8-968a-4962-9e23-c0c22911913d" containerName="mariadb-account-delete" Jan 31 09:23:58 crc kubenswrapper[4732]: I0131 09:23:58.578293 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="241eae26-3908-40e0-af9c-59b54a6ab1a0" containerName="manager" Jan 31 09:23:58 crc kubenswrapper[4732]: I0131 09:23:58.578308 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="24dbbc5e-c4aa-4f6a-b6e6-52b3013443cf" containerName="registry-server" Jan 31 09:23:58 crc kubenswrapper[4732]: I0131 09:23:58.578319 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="79621d02-e834-4725-8b80-d0444f3b6487" containerName="operator" Jan 31 09:23:58 crc kubenswrapper[4732]: I0131 09:23:58.578337 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="17f93e90-1e9a-439c-a130-487ebf54ad10" containerName="manager" Jan 31 09:23:58 crc kubenswrapper[4732]: I0131 09:23:58.578351 4732 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="4ef2e0aa-0782-4814-9d5f-9a6a32fb121b" containerName="manager" Jan 31 09:23:58 crc kubenswrapper[4732]: I0131 09:23:58.578363 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="e617e130-c338-40d1-9a5c-83e925a4e6ed" containerName="registry-server" Jan 31 09:23:58 crc kubenswrapper[4732]: I0131 09:23:58.578379 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc71a61f-ccf2-43bb-aedb-71f2ec9f03bd" containerName="rabbitmq" Jan 31 09:23:58 crc kubenswrapper[4732]: I0131 09:23:58.578396 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="e502d767-ed18-4540-8d2c-ffd993e4822d" containerName="registry-server" Jan 31 09:23:58 crc kubenswrapper[4732]: I0131 09:23:58.578413 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="088a3743-a071-4b0e-9cd8-66271eaeafdb" containerName="manager" Jan 31 09:23:58 crc kubenswrapper[4732]: I0131 09:23:58.578427 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3b1cc40-9985-45d8-bb06-0676ff188c6c" containerName="barbican-worker" Jan 31 09:23:58 crc kubenswrapper[4732]: I0131 09:23:58.578443 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="616eedfe-830a-4ca8-9c42-a2cfd9352312" containerName="galera" Jan 31 09:23:58 crc kubenswrapper[4732]: I0131 09:23:58.578454 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="0682a582-79d6-4286-9a43-e4a258dde73f" containerName="galera" Jan 31 09:23:58 crc kubenswrapper[4732]: I0131 09:23:58.578469 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ee84530-efd7-4d83-9aa2-fb9b8b178496" containerName="keystone-api" Jan 31 09:23:58 crc kubenswrapper[4732]: I0131 09:23:58.578485 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ff1fd2b-a8cb-4050-9a54-3117be6964ce" containerName="registry-server" Jan 31 09:23:58 crc kubenswrapper[4732]: I0131 09:23:58.578496 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="415e013e-ff2f-47b6-a17d-c0ba8f80071a" containerName="manager" Jan 31 09:23:58 crc kubenswrapper[4732]: I0131 09:23:58.578514 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="4981d9a9-898f-49ff-809d-58c7ca3bd2a3" containerName="barbican-api-log" Jan 31 09:23:58 crc kubenswrapper[4732]: I0131 09:23:58.578531 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="4981d9a9-898f-49ff-809d-58c7ca3bd2a3" containerName="barbican-api" Jan 31 09:23:58 crc kubenswrapper[4732]: I0131 09:23:58.585872 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4v4x4/must-gather-8vtn6" Jan 31 09:23:58 crc kubenswrapper[4732]: I0131 09:23:58.593430 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-4v4x4"/"kube-root-ca.crt" Jan 31 09:23:58 crc kubenswrapper[4732]: I0131 09:23:58.594054 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-4v4x4"/"default-dockercfg-hnm9l" Jan 31 09:23:58 crc kubenswrapper[4732]: I0131 09:23:58.594242 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-4v4x4"/"openshift-service-ca.crt" Jan 31 09:23:58 crc kubenswrapper[4732]: I0131 09:23:58.600051 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-4v4x4/must-gather-8vtn6"] Jan 31 09:23:58 crc kubenswrapper[4732]: I0131 09:23:58.616811 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkkgd\" (UniqueName: \"kubernetes.io/projected/efcd56a7-a326-43ac-8d3e-c1a2fc2a464f-kube-api-access-bkkgd\") pod \"must-gather-8vtn6\" (UID: \"efcd56a7-a326-43ac-8d3e-c1a2fc2a464f\") " pod="openshift-must-gather-4v4x4/must-gather-8vtn6" Jan 31 09:23:58 crc kubenswrapper[4732]: I0131 09:23:58.616888 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/efcd56a7-a326-43ac-8d3e-c1a2fc2a464f-must-gather-output\") pod \"must-gather-8vtn6\" (UID: \"efcd56a7-a326-43ac-8d3e-c1a2fc2a464f\") " pod="openshift-must-gather-4v4x4/must-gather-8vtn6" Jan 31 09:23:58 crc kubenswrapper[4732]: I0131 09:23:58.719006 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkkgd\" (UniqueName: \"kubernetes.io/projected/efcd56a7-a326-43ac-8d3e-c1a2fc2a464f-kube-api-access-bkkgd\") pod \"must-gather-8vtn6\" (UID: \"efcd56a7-a326-43ac-8d3e-c1a2fc2a464f\") " pod="openshift-must-gather-4v4x4/must-gather-8vtn6" Jan 31 09:23:58 crc kubenswrapper[4732]: I0131 09:23:58.719075 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/efcd56a7-a326-43ac-8d3e-c1a2fc2a464f-must-gather-output\") pod \"must-gather-8vtn6\" (UID: \"efcd56a7-a326-43ac-8d3e-c1a2fc2a464f\") " pod="openshift-must-gather-4v4x4/must-gather-8vtn6" Jan 31 09:23:58 crc kubenswrapper[4732]: I0131 09:23:58.719741 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/efcd56a7-a326-43ac-8d3e-c1a2fc2a464f-must-gather-output\") pod \"must-gather-8vtn6\" (UID: \"efcd56a7-a326-43ac-8d3e-c1a2fc2a464f\") " pod="openshift-must-gather-4v4x4/must-gather-8vtn6" Jan 31 09:23:58 crc kubenswrapper[4732]: I0131 09:23:58.741204 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkkgd\" (UniqueName: \"kubernetes.io/projected/efcd56a7-a326-43ac-8d3e-c1a2fc2a464f-kube-api-access-bkkgd\") pod \"must-gather-8vtn6\" (UID: \"efcd56a7-a326-43ac-8d3e-c1a2fc2a464f\") " pod="openshift-must-gather-4v4x4/must-gather-8vtn6" Jan 31 09:23:59 crc kubenswrapper[4732]: I0131 09:23:59.084067 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4v4x4/must-gather-8vtn6" Jan 31 09:23:59 crc kubenswrapper[4732]: I0131 09:23:59.497434 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-4v4x4/must-gather-8vtn6"] Jan 31 09:23:59 crc kubenswrapper[4732]: W0131 09:23:59.510384 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podefcd56a7_a326_43ac_8d3e_c1a2fc2a464f.slice/crio-361c00622edab92e61d2b8d67ab19b41602e8592ca012dc381342782152982d7 WatchSource:0}: Error finding container 361c00622edab92e61d2b8d67ab19b41602e8592ca012dc381342782152982d7: Status 404 returned error can't find the container with id 361c00622edab92e61d2b8d67ab19b41602e8592ca012dc381342782152982d7 Jan 31 09:23:59 crc kubenswrapper[4732]: I0131 09:23:59.513275 4732 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 31 09:24:00 crc kubenswrapper[4732]: I0131 09:24:00.139712 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4v4x4/must-gather-8vtn6" event={"ID":"efcd56a7-a326-43ac-8d3e-c1a2fc2a464f","Type":"ContainerStarted","Data":"361c00622edab92e61d2b8d67ab19b41602e8592ca012dc381342782152982d7"} Jan 31 09:24:00 crc kubenswrapper[4732]: I0131 09:24:00.462149 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/keystone5b86-account-delete-qkt4p" Jan 31 09:24:00 crc kubenswrapper[4732]: I0131 09:24:00.604007 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ssz6z\" (UniqueName: \"kubernetes.io/projected/1654407d-7276-4839-839d-1244759c4ad2-kube-api-access-ssz6z\") pod \"1654407d-7276-4839-839d-1244759c4ad2\" (UID: \"1654407d-7276-4839-839d-1244759c4ad2\") " Jan 31 09:24:00 crc kubenswrapper[4732]: I0131 09:24:00.604086 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1654407d-7276-4839-839d-1244759c4ad2-operator-scripts\") pod \"1654407d-7276-4839-839d-1244759c4ad2\" (UID: \"1654407d-7276-4839-839d-1244759c4ad2\") " Jan 31 09:24:00 crc kubenswrapper[4732]: I0131 09:24:00.604950 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1654407d-7276-4839-839d-1244759c4ad2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1654407d-7276-4839-839d-1244759c4ad2" (UID: "1654407d-7276-4839-839d-1244759c4ad2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 09:24:00 crc kubenswrapper[4732]: I0131 09:24:00.619019 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1654407d-7276-4839-839d-1244759c4ad2-kube-api-access-ssz6z" (OuterVolumeSpecName: "kube-api-access-ssz6z") pod "1654407d-7276-4839-839d-1244759c4ad2" (UID: "1654407d-7276-4839-839d-1244759c4ad2"). InnerVolumeSpecName "kube-api-access-ssz6z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:24:00 crc kubenswrapper[4732]: I0131 09:24:00.705600 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ssz6z\" (UniqueName: \"kubernetes.io/projected/1654407d-7276-4839-839d-1244759c4ad2-kube-api-access-ssz6z\") on node \"crc\" DevicePath \"\"" Jan 31 09:24:00 crc kubenswrapper[4732]: I0131 09:24:00.705649 4732 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1654407d-7276-4839-839d-1244759c4ad2-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 09:24:01 crc kubenswrapper[4732]: I0131 09:24:01.152267 4732 generic.go:334] "Generic (PLEG): container finished" podID="1654407d-7276-4839-839d-1244759c4ad2" containerID="79fac5d2960d6f9193efd13767f68129541f3654fafcc49472d1b08351983069" exitCode=137 Jan 31 09:24:01 crc kubenswrapper[4732]: I0131 09:24:01.152306 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/keystone5b86-account-delete-qkt4p" event={"ID":"1654407d-7276-4839-839d-1244759c4ad2","Type":"ContainerDied","Data":"79fac5d2960d6f9193efd13767f68129541f3654fafcc49472d1b08351983069"} Jan 31 09:24:01 crc kubenswrapper[4732]: I0131 09:24:01.152329 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/keystone5b86-account-delete-qkt4p" event={"ID":"1654407d-7276-4839-839d-1244759c4ad2","Type":"ContainerDied","Data":"7d6e2907463735b69f40b7370fe0069cdb743cf23d1f9aeb0278bee0ffa6f8b0"} Jan 31 09:24:01 crc kubenswrapper[4732]: I0131 09:24:01.152338 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/keystone5b86-account-delete-qkt4p" Jan 31 09:24:01 crc kubenswrapper[4732]: I0131 09:24:01.152345 4732 scope.go:117] "RemoveContainer" containerID="79fac5d2960d6f9193efd13767f68129541f3654fafcc49472d1b08351983069" Jan 31 09:24:01 crc kubenswrapper[4732]: I0131 09:24:01.168499 4732 scope.go:117] "RemoveContainer" containerID="79fac5d2960d6f9193efd13767f68129541f3654fafcc49472d1b08351983069" Jan 31 09:24:01 crc kubenswrapper[4732]: E0131 09:24:01.168859 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79fac5d2960d6f9193efd13767f68129541f3654fafcc49472d1b08351983069\": container with ID starting with 79fac5d2960d6f9193efd13767f68129541f3654fafcc49472d1b08351983069 not found: ID does not exist" containerID="79fac5d2960d6f9193efd13767f68129541f3654fafcc49472d1b08351983069" Jan 31 09:24:01 crc kubenswrapper[4732]: I0131 09:24:01.168888 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79fac5d2960d6f9193efd13767f68129541f3654fafcc49472d1b08351983069"} err="failed to get container status \"79fac5d2960d6f9193efd13767f68129541f3654fafcc49472d1b08351983069\": rpc error: code = NotFound desc = could not find container \"79fac5d2960d6f9193efd13767f68129541f3654fafcc49472d1b08351983069\": container with ID starting with 79fac5d2960d6f9193efd13767f68129541f3654fafcc49472d1b08351983069 not found: ID does not exist" Jan 31 09:24:01 crc kubenswrapper[4732]: I0131 09:24:01.187264 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/keystone5b86-account-delete-qkt4p"] Jan 31 09:24:01 crc kubenswrapper[4732]: I0131 09:24:01.193217 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/keystone5b86-account-delete-qkt4p"] Jan 31 09:24:02 crc kubenswrapper[4732]: I0131 09:24:02.549295 
4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1654407d-7276-4839-839d-1244759c4ad2" path="/var/lib/kubelet/pods/1654407d-7276-4839-839d-1244759c4ad2/volumes" Jan 31 09:24:04 crc kubenswrapper[4732]: I0131 09:24:04.181351 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4v4x4/must-gather-8vtn6" event={"ID":"efcd56a7-a326-43ac-8d3e-c1a2fc2a464f","Type":"ContainerStarted","Data":"f88858a19d913954a48096ea1df3d5c79e72769bf4ad9e1829233f60113d0854"} Jan 31 09:24:04 crc kubenswrapper[4732]: I0131 09:24:04.181734 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4v4x4/must-gather-8vtn6" event={"ID":"efcd56a7-a326-43ac-8d3e-c1a2fc2a464f","Type":"ContainerStarted","Data":"3c1329e6bf2dfb982396045a78d5b23bda418913a472d6b57856329382998b59"} Jan 31 09:24:04 crc kubenswrapper[4732]: I0131 09:24:04.200138 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-4v4x4/must-gather-8vtn6" podStartSLOduration=2.671481815 podStartE2EDuration="6.200119267s" podCreationTimestamp="2026-01-31 09:23:58 +0000 UTC" firstStartedPulling="2026-01-31 09:23:59.512449275 +0000 UTC m=+1377.818325479" lastFinishedPulling="2026-01-31 09:24:03.041086707 +0000 UTC m=+1381.346962931" observedRunningTime="2026-01-31 09:24:04.19505293 +0000 UTC m=+1382.500929164" watchObservedRunningTime="2026-01-31 09:24:04.200119267 +0000 UTC m=+1382.505995471" Jan 31 09:24:04 crc kubenswrapper[4732]: I0131 09:24:04.310958 4732 scope.go:117] "RemoveContainer" containerID="fecab232c8eea055dcb26c6531c6bec4d55c835988b0fcd4b2a823ee72e633d6" Jan 31 09:24:04 crc kubenswrapper[4732]: I0131 09:24:04.345248 4732 scope.go:117] "RemoveContainer" containerID="8c02dfae66fd3a4f5b4725177171b29a50b8756093929474b400347824b72530" Jan 31 09:24:04 crc kubenswrapper[4732]: I0131 09:24:04.368161 4732 scope.go:117] "RemoveContainer" containerID="f344ba5284e12746a74007d5adc8aa42e8ee48a5c3588cb47db5117f3c83e9f0" Jan 31 09:24:04 crc kubenswrapper[4732]: I0131 09:24:04.389272 4732 scope.go:117] "RemoveContainer" containerID="05dc77d92e7403bddf9c4216d3f9e9927975c83db475a4592b0fe670adbee115" Jan 31 09:24:04 crc kubenswrapper[4732]: I0131 09:24:04.417139 4732 scope.go:117] "RemoveContainer" containerID="90c15d5ab54a0813e658db668ab02d84908650e855cd0db9d18d9e71000d2b80" Jan 31 09:24:04 crc kubenswrapper[4732]: I0131 09:24:04.441606 4732 scope.go:117] "RemoveContainer" containerID="c9325e278cfd071d34b0338c159d59fb628048d898065c645b245c3766ab28e7" Jan 31 09:24:04 crc kubenswrapper[4732]: I0131 09:24:04.474894 4732 scope.go:117] "RemoveContainer" containerID="9e931ecd0e73f528e7ed41d0a730c73d763f5b49adf745bec9e51e72d6012d62" Jan 31 09:24:04 crc kubenswrapper[4732]: I0131 09:24:04.496281 4732 scope.go:117] "RemoveContainer" containerID="61b0a7c08542a13be8ba2b2cae47287416caa788b605c317245cad2aa213e84a" Jan 31 09:24:04 crc kubenswrapper[4732]: I0131 09:24:04.513485 4732 scope.go:117] "RemoveContainer" containerID="fe93f24b1b1ca8fe4c671105ac0fbeb376225843e31d3614ced260876fce3216" Jan 31 09:24:04 crc kubenswrapper[4732]: I0131 09:24:04.531528 4732 scope.go:117] "RemoveContainer" containerID="c0a354876c3578c412d4a67e39a312dc83add567ea01562dcdd52716d48fc242" Jan 31 09:24:04 crc kubenswrapper[4732]: I0131 09:24:04.566812 4732 scope.go:117] "RemoveContainer" containerID="a86707b4de5b35e9337d2beb5deb8d861dead56d2dc194d07926deea0b9a63f5" Jan 31 09:24:04 crc kubenswrapper[4732]: I0131 09:24:04.586482 4732 scope.go:117] "RemoveContainer" 
containerID="a890dd33402e340e3c5622818af3229b97a815440f243204f47c951187c50dbe" Jan 31 09:24:04 crc kubenswrapper[4732]: I0131 09:24:04.631374 4732 scope.go:117] "RemoveContainer" containerID="09dedf44a2244d0aa2f52cb5add6cbbb78a1014d9997dba9dbd36b2416c85acf" Jan 31 09:24:04 crc kubenswrapper[4732]: I0131 09:24:04.671321 4732 scope.go:117] "RemoveContainer" containerID="34e2cf0f4161fbb85eb0df7049a2e4f5c156fd2607d987c6bef858acc9fffa46" Jan 31 09:24:04 crc kubenswrapper[4732]: I0131 09:24:04.690637 4732 scope.go:117] "RemoveContainer" containerID="443c8d984e9ef1e7b308f390e27b580b627477703c6492f50cd0b13f5a986a0a" Jan 31 09:24:04 crc kubenswrapper[4732]: I0131 09:24:04.707651 4732 scope.go:117] "RemoveContainer" containerID="5797783d92fce2d40396a5e635eb5cba9fcbde2d8555d22fc23d15fbbb6337bb" Jan 31 09:24:04 crc kubenswrapper[4732]: I0131 09:24:04.721071 4732 scope.go:117] "RemoveContainer" containerID="47dfd350e39daf4b8f033d97a0ce4f52541999043f126251274978479fb72c51" Jan 31 09:24:04 crc kubenswrapper[4732]: I0131 09:24:04.735561 4732 scope.go:117] "RemoveContainer" containerID="e8c730305c3c39eb9242a11659fb49c02690f7b2e9c98fd70ae0da288de53e06" Jan 31 09:24:04 crc kubenswrapper[4732]: I0131 09:24:04.750844 4732 scope.go:117] "RemoveContainer" containerID="87f7881198e2bf8721939abbe73ccd62c3cbc8681018136ac024bf795609a719" Jan 31 09:24:04 crc kubenswrapper[4732]: I0131 09:24:04.764611 4732 scope.go:117] "RemoveContainer" containerID="381b993ee77c54308c8a7301a323a8aafc300c3e038f7b063863ce23feb4f43f" Jan 31 09:24:04 crc kubenswrapper[4732]: I0131 09:24:04.782431 4732 scope.go:117] "RemoveContainer" containerID="a5bd2e1a25f92dbcf8d8999d3428639a204c1c32490db7803f3573205b07d825" Jan 31 09:24:04 crc kubenswrapper[4732]: I0131 09:24:04.796889 4732 scope.go:117] "RemoveContainer" containerID="6e1beb997853403436be77394dbddcd82093917a762178f04e455efdc50a1f83" Jan 31 09:24:04 crc kubenswrapper[4732]: I0131 09:24:04.813017 4732 scope.go:117] "RemoveContainer" containerID="7a57753b9625c7248d75214d3dccdb94e632617b9f7c21be3bc87c9177d4ca52" Jan 31 09:24:04 crc kubenswrapper[4732]: I0131 09:24:04.830971 4732 scope.go:117] "RemoveContainer" containerID="75bdd5fe1a4f1b74d5a4dbcdde8f474e6bc05519374a21ed6a1e8f88735183f3" Jan 31 09:24:04 crc kubenswrapper[4732]: I0131 09:24:04.856293 4732 scope.go:117] "RemoveContainer" containerID="4b3511c94c5ac6d2254541268344c305851e16ace80321bcfccadaad9af571e5" Jan 31 09:24:04 crc kubenswrapper[4732]: I0131 09:24:04.883923 4732 scope.go:117] "RemoveContainer" containerID="a37225baa762efba0da855108c4e3b4942f286a7d0c711d1300d72e14f4f3a67" Jan 31 09:24:04 crc kubenswrapper[4732]: I0131 09:24:04.901823 4732 scope.go:117] "RemoveContainer" containerID="d5192da51c40249f5f8c71160cc71f8db81535a29bd24997e64ba0998bfe2e66" Jan 31 09:24:04 crc kubenswrapper[4732]: I0131 09:24:04.918542 4732 scope.go:117] "RemoveContainer" containerID="dddea144384fd97086387399ccac7f5e8ab364f8aa2e9d3b1d93f34363e9cf4d" Jan 31 09:24:53 crc kubenswrapper[4732]: I0131 09:24:53.214047 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-v69mc_498d64fc-0d0f-43c6-aaae-bd3c5f0d7873/control-plane-machine-set-operator/0.log" Jan 31 09:24:53 crc kubenswrapper[4732]: I0131 09:24:53.379428 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-54nxd_e9ce5ab1-4da9-47a4-83b9-3c7d8af6d55a/kube-rbac-proxy/0.log" Jan 31 09:24:53 crc kubenswrapper[4732]: I0131 09:24:53.553015 4732 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-54nxd_e9ce5ab1-4da9-47a4-83b9-3c7d8af6d55a/machine-api-operator/0.log" Jan 31 09:25:05 crc kubenswrapper[4732]: I0131 09:25:05.348486 4732 scope.go:117] "RemoveContainer" containerID="4202ea6b721143639a1a8ee464d8d295428596e358fa14506e850632ab62de19" Jan 31 09:25:17 crc kubenswrapper[4732]: I0131 09:25:17.497428 4732 patch_prober.go:28] interesting pod/machine-config-daemon-jnbt8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 09:25:17 crc kubenswrapper[4732]: I0131 09:25:17.498132 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" podUID="7d790207-d357-4b47-87bf-5b505e061820" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 09:25:20 crc kubenswrapper[4732]: I0131 09:25:20.443798 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-jq8g8_53b5272f-ac5c-4616-a427-28fc830d7392/controller/0.log" Jan 31 09:25:20 crc kubenswrapper[4732]: I0131 09:25:20.444039 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-jq8g8_53b5272f-ac5c-4616-a427-28fc830d7392/kube-rbac-proxy/0.log" Jan 31 09:25:20 crc kubenswrapper[4732]: I0131 09:25:20.570084 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6xvqw_66e27417-1fb4-4ca9-b104-d3d335370f0d/cp-frr-files/0.log" Jan 31 09:25:20 crc kubenswrapper[4732]: I0131 09:25:20.742242 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6xvqw_66e27417-1fb4-4ca9-b104-d3d335370f0d/cp-reloader/0.log" Jan 31 09:25:20 crc kubenswrapper[4732]: I0131 09:25:20.746186 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6xvqw_66e27417-1fb4-4ca9-b104-d3d335370f0d/cp-reloader/0.log" Jan 31 09:25:20 crc kubenswrapper[4732]: I0131 09:25:20.776060 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6xvqw_66e27417-1fb4-4ca9-b104-d3d335370f0d/cp-frr-files/0.log" Jan 31 09:25:20 crc kubenswrapper[4732]: I0131 09:25:20.777543 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6xvqw_66e27417-1fb4-4ca9-b104-d3d335370f0d/cp-metrics/0.log" Jan 31 09:25:20 crc kubenswrapper[4732]: I0131 09:25:20.899805 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6xvqw_66e27417-1fb4-4ca9-b104-d3d335370f0d/cp-reloader/0.log" Jan 31 09:25:20 crc kubenswrapper[4732]: I0131 09:25:20.919324 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6xvqw_66e27417-1fb4-4ca9-b104-d3d335370f0d/cp-frr-files/0.log" Jan 31 09:25:20 crc kubenswrapper[4732]: I0131 09:25:20.992487 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6xvqw_66e27417-1fb4-4ca9-b104-d3d335370f0d/cp-metrics/0.log" Jan 31 09:25:20 crc kubenswrapper[4732]: I0131 09:25:20.999490 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6xvqw_66e27417-1fb4-4ca9-b104-d3d335370f0d/cp-metrics/0.log" Jan 31 09:25:21 crc kubenswrapper[4732]: I0131 09:25:21.094026 4732 log.go:25] "Finished 
parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6xvqw_66e27417-1fb4-4ca9-b104-d3d335370f0d/cp-frr-files/0.log" Jan 31 09:25:21 crc kubenswrapper[4732]: I0131 09:25:21.121292 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6xvqw_66e27417-1fb4-4ca9-b104-d3d335370f0d/cp-reloader/0.log" Jan 31 09:25:21 crc kubenswrapper[4732]: I0131 09:25:21.159779 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6xvqw_66e27417-1fb4-4ca9-b104-d3d335370f0d/controller/0.log" Jan 31 09:25:21 crc kubenswrapper[4732]: I0131 09:25:21.160446 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6xvqw_66e27417-1fb4-4ca9-b104-d3d335370f0d/cp-metrics/0.log" Jan 31 09:25:21 crc kubenswrapper[4732]: I0131 09:25:21.291009 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6xvqw_66e27417-1fb4-4ca9-b104-d3d335370f0d/frr-metrics/0.log" Jan 31 09:25:21 crc kubenswrapper[4732]: I0131 09:25:21.351178 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6xvqw_66e27417-1fb4-4ca9-b104-d3d335370f0d/kube-rbac-proxy/0.log" Jan 31 09:25:21 crc kubenswrapper[4732]: I0131 09:25:21.364618 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6xvqw_66e27417-1fb4-4ca9-b104-d3d335370f0d/kube-rbac-proxy-frr/0.log" Jan 31 09:25:21 crc kubenswrapper[4732]: I0131 09:25:21.466979 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6xvqw_66e27417-1fb4-4ca9-b104-d3d335370f0d/reloader/0.log" Jan 31 09:25:21 crc kubenswrapper[4732]: I0131 09:25:21.580145 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7df86c4f6c-5l2kt_4b09b4ac-95c1-4c31-99a0-12b38c3412ae/frr-k8s-webhook-server/0.log" Jan 31 09:25:21 crc kubenswrapper[4732]: I0131 09:25:21.742054 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-6d8dc66c8b-8p2mn_8ca218dd-0d42-45c8-b4e4-ca638781c915/manager/0.log" Jan 31 09:25:21 crc kubenswrapper[4732]: I0131 09:25:21.801462 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-c98699f55-6bzdf_62f950f6-2a18-4ca6-8cdb-75f47437053a/webhook-server/0.log" Jan 31 09:25:21 crc kubenswrapper[4732]: I0131 09:25:21.805991 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6xvqw_66e27417-1fb4-4ca9-b104-d3d335370f0d/frr/0.log" Jan 31 09:25:21 crc kubenswrapper[4732]: I0131 09:25:21.939131 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-gcmq2_3fbb0c82-6b72-4313-94e2-3e71d27cf75f/kube-rbac-proxy/0.log" Jan 31 09:25:22 crc kubenswrapper[4732]: I0131 09:25:22.117491 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-gcmq2_3fbb0c82-6b72-4313-94e2-3e71d27cf75f/speaker/0.log" Jan 31 09:25:37 crc kubenswrapper[4732]: I0131 09:25:37.600720 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-nntc6"] Jan 31 09:25:37 crc kubenswrapper[4732]: E0131 09:25:37.601646 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1654407d-7276-4839-839d-1244759c4ad2" containerName="mariadb-account-delete" Jan 31 09:25:37 crc kubenswrapper[4732]: I0131 09:25:37.601694 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="1654407d-7276-4839-839d-1244759c4ad2" 
containerName="mariadb-account-delete" Jan 31 09:25:37 crc kubenswrapper[4732]: I0131 09:25:37.601901 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="1654407d-7276-4839-839d-1244759c4ad2" containerName="mariadb-account-delete" Jan 31 09:25:37 crc kubenswrapper[4732]: I0131 09:25:37.603204 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nntc6" Jan 31 09:25:37 crc kubenswrapper[4732]: I0131 09:25:37.608136 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nntc6"] Jan 31 09:25:37 crc kubenswrapper[4732]: I0131 09:25:37.803400 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e6555ab-3f72-4b8f-a2d6-56f918da2b5f-catalog-content\") pod \"redhat-marketplace-nntc6\" (UID: \"7e6555ab-3f72-4b8f-a2d6-56f918da2b5f\") " pod="openshift-marketplace/redhat-marketplace-nntc6" Jan 31 09:25:37 crc kubenswrapper[4732]: I0131 09:25:37.803928 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e6555ab-3f72-4b8f-a2d6-56f918da2b5f-utilities\") pod \"redhat-marketplace-nntc6\" (UID: \"7e6555ab-3f72-4b8f-a2d6-56f918da2b5f\") " pod="openshift-marketplace/redhat-marketplace-nntc6" Jan 31 09:25:37 crc kubenswrapper[4732]: I0131 09:25:37.804037 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xppp\" (UniqueName: \"kubernetes.io/projected/7e6555ab-3f72-4b8f-a2d6-56f918da2b5f-kube-api-access-7xppp\") pod \"redhat-marketplace-nntc6\" (UID: \"7e6555ab-3f72-4b8f-a2d6-56f918da2b5f\") " pod="openshift-marketplace/redhat-marketplace-nntc6" Jan 31 09:25:37 crc kubenswrapper[4732]: I0131 09:25:37.905180 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xppp\" (UniqueName: \"kubernetes.io/projected/7e6555ab-3f72-4b8f-a2d6-56f918da2b5f-kube-api-access-7xppp\") pod \"redhat-marketplace-nntc6\" (UID: \"7e6555ab-3f72-4b8f-a2d6-56f918da2b5f\") " pod="openshift-marketplace/redhat-marketplace-nntc6" Jan 31 09:25:37 crc kubenswrapper[4732]: I0131 09:25:37.905261 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e6555ab-3f72-4b8f-a2d6-56f918da2b5f-catalog-content\") pod \"redhat-marketplace-nntc6\" (UID: \"7e6555ab-3f72-4b8f-a2d6-56f918da2b5f\") " pod="openshift-marketplace/redhat-marketplace-nntc6" Jan 31 09:25:37 crc kubenswrapper[4732]: I0131 09:25:37.905302 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e6555ab-3f72-4b8f-a2d6-56f918da2b5f-utilities\") pod \"redhat-marketplace-nntc6\" (UID: \"7e6555ab-3f72-4b8f-a2d6-56f918da2b5f\") " pod="openshift-marketplace/redhat-marketplace-nntc6" Jan 31 09:25:37 crc kubenswrapper[4732]: I0131 09:25:37.905786 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e6555ab-3f72-4b8f-a2d6-56f918da2b5f-utilities\") pod \"redhat-marketplace-nntc6\" (UID: \"7e6555ab-3f72-4b8f-a2d6-56f918da2b5f\") " pod="openshift-marketplace/redhat-marketplace-nntc6" Jan 31 09:25:37 crc kubenswrapper[4732]: I0131 09:25:37.905888 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e6555ab-3f72-4b8f-a2d6-56f918da2b5f-catalog-content\") pod \"redhat-marketplace-nntc6\" (UID: \"7e6555ab-3f72-4b8f-a2d6-56f918da2b5f\") " pod="openshift-marketplace/redhat-marketplace-nntc6" Jan 31 09:25:37 crc kubenswrapper[4732]: I0131 09:25:37.929486 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xppp\" (UniqueName: \"kubernetes.io/projected/7e6555ab-3f72-4b8f-a2d6-56f918da2b5f-kube-api-access-7xppp\") pod \"redhat-marketplace-nntc6\" (UID: \"7e6555ab-3f72-4b8f-a2d6-56f918da2b5f\") " pod="openshift-marketplace/redhat-marketplace-nntc6" Jan 31 09:25:38 crc kubenswrapper[4732]: I0131 09:25:38.218280 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nntc6" Jan 31 09:25:38 crc kubenswrapper[4732]: I0131 09:25:38.660021 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nntc6"] Jan 31 09:25:38 crc kubenswrapper[4732]: I0131 09:25:38.744463 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nntc6" event={"ID":"7e6555ab-3f72-4b8f-a2d6-56f918da2b5f","Type":"ContainerStarted","Data":"59a992697fbb9e4ba41759d7f9425dc054a231d68556433c6fe3cf106dbc6b83"} Jan 31 09:25:38 crc kubenswrapper[4732]: E0131 09:25:38.866993 4732 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e6555ab_3f72_4b8f_a2d6_56f918da2b5f.slice/crio-conmon-18db5903c8c2eb2a0f15e145c4028d64509fbc52c15e7320b0f4849da6f6b657.scope\": RecentStats: unable to find data in memory cache]" Jan 31 09:25:39 crc kubenswrapper[4732]: I0131 09:25:39.753326 4732 generic.go:334] "Generic (PLEG): container finished" podID="7e6555ab-3f72-4b8f-a2d6-56f918da2b5f" containerID="18db5903c8c2eb2a0f15e145c4028d64509fbc52c15e7320b0f4849da6f6b657" exitCode=0 Jan 31 09:25:39 crc kubenswrapper[4732]: I0131 09:25:39.753372 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nntc6" event={"ID":"7e6555ab-3f72-4b8f-a2d6-56f918da2b5f","Type":"ContainerDied","Data":"18db5903c8c2eb2a0f15e145c4028d64509fbc52c15e7320b0f4849da6f6b657"} Jan 31 09:25:41 crc kubenswrapper[4732]: I0131 09:25:41.763396 4732 generic.go:334] "Generic (PLEG): container finished" podID="7e6555ab-3f72-4b8f-a2d6-56f918da2b5f" containerID="1b61c8f45b6d0b67474b1d360231736bc021a78f046b98c44defe0796d04c620" exitCode=0 Jan 31 09:25:41 crc kubenswrapper[4732]: I0131 09:25:41.763452 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nntc6" event={"ID":"7e6555ab-3f72-4b8f-a2d6-56f918da2b5f","Type":"ContainerDied","Data":"1b61c8f45b6d0b67474b1d360231736bc021a78f046b98c44defe0796d04c620"} Jan 31 09:25:42 crc kubenswrapper[4732]: I0131 09:25:42.771629 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nntc6" event={"ID":"7e6555ab-3f72-4b8f-a2d6-56f918da2b5f","Type":"ContainerStarted","Data":"6cd3e6a33b07ecf405db08db769e081eb479c7bf7936a8343db50d45e05dd007"} Jan 31 09:25:42 crc kubenswrapper[4732]: I0131 09:25:42.795090 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-nntc6" podStartSLOduration=3.117546243 podStartE2EDuration="5.795065267s" podCreationTimestamp="2026-01-31 09:25:37 +0000 UTC" 
firstStartedPulling="2026-01-31 09:25:39.75495787 +0000 UTC m=+1478.060834074" lastFinishedPulling="2026-01-31 09:25:42.432476894 +0000 UTC m=+1480.738353098" observedRunningTime="2026-01-31 09:25:42.790226677 +0000 UTC m=+1481.096102881" watchObservedRunningTime="2026-01-31 09:25:42.795065267 +0000 UTC m=+1481.100941501" Jan 31 09:25:45 crc kubenswrapper[4732]: I0131 09:25:45.830514 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcrl29v_76f99e73-f72c-4026-b43f-dcb9f20b554f/util/0.log" Jan 31 09:25:46 crc kubenswrapper[4732]: I0131 09:25:46.024284 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcrl29v_76f99e73-f72c-4026-b43f-dcb9f20b554f/pull/0.log" Jan 31 09:25:46 crc kubenswrapper[4732]: I0131 09:25:46.037304 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcrl29v_76f99e73-f72c-4026-b43f-dcb9f20b554f/util/0.log" Jan 31 09:25:46 crc kubenswrapper[4732]: I0131 09:25:46.043481 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcrl29v_76f99e73-f72c-4026-b43f-dcb9f20b554f/pull/0.log" Jan 31 09:25:46 crc kubenswrapper[4732]: I0131 09:25:46.229859 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcrl29v_76f99e73-f72c-4026-b43f-dcb9f20b554f/extract/0.log" Jan 31 09:25:46 crc kubenswrapper[4732]: I0131 09:25:46.232581 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcrl29v_76f99e73-f72c-4026-b43f-dcb9f20b554f/pull/0.log" Jan 31 09:25:46 crc kubenswrapper[4732]: I0131 09:25:46.260399 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcrl29v_76f99e73-f72c-4026-b43f-dcb9f20b554f/util/0.log" Jan 31 09:25:46 crc kubenswrapper[4732]: I0131 09:25:46.367932 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2h57x_9039963e-96e4-4b4d-abdd-79f0429da944/extract-utilities/0.log" Jan 31 09:25:46 crc kubenswrapper[4732]: I0131 09:25:46.527435 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2h57x_9039963e-96e4-4b4d-abdd-79f0429da944/extract-utilities/0.log" Jan 31 09:25:46 crc kubenswrapper[4732]: I0131 09:25:46.533025 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2h57x_9039963e-96e4-4b4d-abdd-79f0429da944/extract-content/0.log" Jan 31 09:25:46 crc kubenswrapper[4732]: I0131 09:25:46.539734 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2h57x_9039963e-96e4-4b4d-abdd-79f0429da944/extract-content/0.log" Jan 31 09:25:46 crc kubenswrapper[4732]: I0131 09:25:46.681900 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2h57x_9039963e-96e4-4b4d-abdd-79f0429da944/extract-utilities/0.log" Jan 31 09:25:46 crc kubenswrapper[4732]: I0131 09:25:46.685426 4732 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-2h57x_9039963e-96e4-4b4d-abdd-79f0429da944/extract-content/0.log" Jan 31 09:25:46 crc kubenswrapper[4732]: I0131 09:25:46.912064 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-d7jxg_a7533049-a0d8-4488-bed6-2a9b28212061/extract-utilities/0.log" Jan 31 09:25:46 crc kubenswrapper[4732]: I0131 09:25:46.942787 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2h57x_9039963e-96e4-4b4d-abdd-79f0429da944/registry-server/0.log" Jan 31 09:25:47 crc kubenswrapper[4732]: I0131 09:25:47.073255 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-d7jxg_a7533049-a0d8-4488-bed6-2a9b28212061/extract-content/0.log" Jan 31 09:25:47 crc kubenswrapper[4732]: I0131 09:25:47.109832 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-d7jxg_a7533049-a0d8-4488-bed6-2a9b28212061/extract-utilities/0.log" Jan 31 09:25:47 crc kubenswrapper[4732]: I0131 09:25:47.122812 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-d7jxg_a7533049-a0d8-4488-bed6-2a9b28212061/extract-content/0.log" Jan 31 09:25:47 crc kubenswrapper[4732]: I0131 09:25:47.286722 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-d7jxg_a7533049-a0d8-4488-bed6-2a9b28212061/extract-utilities/0.log" Jan 31 09:25:47 crc kubenswrapper[4732]: I0131 09:25:47.333311 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-d7jxg_a7533049-a0d8-4488-bed6-2a9b28212061/extract-content/0.log" Jan 31 09:25:47 crc kubenswrapper[4732]: I0131 09:25:47.487399 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-8bchs_d0dbfc52-f4e9-462a-a253-2bb950c04e7b/marketplace-operator/0.log" Jan 31 09:25:47 crc kubenswrapper[4732]: I0131 09:25:47.497563 4732 patch_prober.go:28] interesting pod/machine-config-daemon-jnbt8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 09:25:47 crc kubenswrapper[4732]: I0131 09:25:47.497685 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" podUID="7d790207-d357-4b47-87bf-5b505e061820" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 09:25:47 crc kubenswrapper[4732]: I0131 09:25:47.581540 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-krjtb_17e07aee-c4b1-4011-8442-c6dcfc4f415c/extract-utilities/0.log" Jan 31 09:25:47 crc kubenswrapper[4732]: I0131 09:25:47.698373 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-d7jxg_a7533049-a0d8-4488-bed6-2a9b28212061/registry-server/0.log" Jan 31 09:25:47 crc kubenswrapper[4732]: I0131 09:25:47.758839 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-krjtb_17e07aee-c4b1-4011-8442-c6dcfc4f415c/extract-content/0.log" Jan 31 09:25:47 crc kubenswrapper[4732]: I0131 09:25:47.766615 4732 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-krjtb_17e07aee-c4b1-4011-8442-c6dcfc4f415c/extract-content/0.log" Jan 31 09:25:47 crc kubenswrapper[4732]: I0131 09:25:47.781853 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-krjtb_17e07aee-c4b1-4011-8442-c6dcfc4f415c/extract-utilities/0.log" Jan 31 09:25:47 crc kubenswrapper[4732]: I0131 09:25:47.918748 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-krjtb_17e07aee-c4b1-4011-8442-c6dcfc4f415c/extract-utilities/0.log" Jan 31 09:25:47 crc kubenswrapper[4732]: I0131 09:25:47.921002 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-krjtb_17e07aee-c4b1-4011-8442-c6dcfc4f415c/extract-content/0.log" Jan 31 09:25:48 crc kubenswrapper[4732]: I0131 09:25:48.032728 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-krjtb_17e07aee-c4b1-4011-8442-c6dcfc4f415c/registry-server/0.log" Jan 31 09:25:48 crc kubenswrapper[4732]: I0131 09:25:48.111828 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-nntc6_7e6555ab-3f72-4b8f-a2d6-56f918da2b5f/extract-utilities/0.log" Jan 31 09:25:48 crc kubenswrapper[4732]: I0131 09:25:48.218674 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-nntc6" Jan 31 09:25:48 crc kubenswrapper[4732]: I0131 09:25:48.218719 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-nntc6" Jan 31 09:25:48 crc kubenswrapper[4732]: I0131 09:25:48.225933 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-nntc6_7e6555ab-3f72-4b8f-a2d6-56f918da2b5f/extract-utilities/0.log" Jan 31 09:25:48 crc kubenswrapper[4732]: I0131 09:25:48.237748 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-nntc6_7e6555ab-3f72-4b8f-a2d6-56f918da2b5f/extract-content/0.log" Jan 31 09:25:48 crc kubenswrapper[4732]: I0131 09:25:48.261142 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-nntc6_7e6555ab-3f72-4b8f-a2d6-56f918da2b5f/extract-content/0.log" Jan 31 09:25:48 crc kubenswrapper[4732]: I0131 09:25:48.263411 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-nntc6" Jan 31 09:25:48 crc kubenswrapper[4732]: I0131 09:25:48.443951 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-nntc6_7e6555ab-3f72-4b8f-a2d6-56f918da2b5f/extract-utilities/0.log" Jan 31 09:25:48 crc kubenswrapper[4732]: I0131 09:25:48.445273 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-nntc6_7e6555ab-3f72-4b8f-a2d6-56f918da2b5f/extract-content/0.log" Jan 31 09:25:48 crc kubenswrapper[4732]: I0131 09:25:48.464415 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-nntc6_7e6555ab-3f72-4b8f-a2d6-56f918da2b5f/registry-server/0.log" Jan 31 09:25:48 crc kubenswrapper[4732]: I0131 09:25:48.606389 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4pkzq_a39f958d-6d9b-4a4a-9ec9-cfb1f96b6f45/extract-utilities/0.log" Jan 
31 09:25:48 crc kubenswrapper[4732]: I0131 09:25:48.746240 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4pkzq_a39f958d-6d9b-4a4a-9ec9-cfb1f96b6f45/extract-utilities/0.log" Jan 31 09:25:48 crc kubenswrapper[4732]: I0131 09:25:48.762895 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4pkzq_a39f958d-6d9b-4a4a-9ec9-cfb1f96b6f45/extract-content/0.log" Jan 31 09:25:48 crc kubenswrapper[4732]: I0131 09:25:48.774778 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4pkzq_a39f958d-6d9b-4a4a-9ec9-cfb1f96b6f45/extract-content/0.log" Jan 31 09:25:48 crc kubenswrapper[4732]: I0131 09:25:48.849044 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-nntc6" Jan 31 09:25:48 crc kubenswrapper[4732]: I0131 09:25:48.889201 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nntc6"] Jan 31 09:25:48 crc kubenswrapper[4732]: I0131 09:25:48.936745 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4pkzq_a39f958d-6d9b-4a4a-9ec9-cfb1f96b6f45/extract-utilities/0.log" Jan 31 09:25:48 crc kubenswrapper[4732]: I0131 09:25:48.979521 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4pkzq_a39f958d-6d9b-4a4a-9ec9-cfb1f96b6f45/extract-content/0.log" Jan 31 09:25:49 crc kubenswrapper[4732]: I0131 09:25:49.328251 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4pkzq_a39f958d-6d9b-4a4a-9ec9-cfb1f96b6f45/registry-server/0.log" Jan 31 09:25:50 crc kubenswrapper[4732]: I0131 09:25:50.817106 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-nntc6" podUID="7e6555ab-3f72-4b8f-a2d6-56f918da2b5f" containerName="registry-server" containerID="cri-o://6cd3e6a33b07ecf405db08db769e081eb479c7bf7936a8343db50d45e05dd007" gracePeriod=2 Jan 31 09:25:51 crc kubenswrapper[4732]: I0131 09:25:51.305334 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nntc6" Jan 31 09:25:51 crc kubenswrapper[4732]: I0131 09:25:51.491632 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e6555ab-3f72-4b8f-a2d6-56f918da2b5f-catalog-content\") pod \"7e6555ab-3f72-4b8f-a2d6-56f918da2b5f\" (UID: \"7e6555ab-3f72-4b8f-a2d6-56f918da2b5f\") " Jan 31 09:25:51 crc kubenswrapper[4732]: I0131 09:25:51.498308 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7xppp\" (UniqueName: \"kubernetes.io/projected/7e6555ab-3f72-4b8f-a2d6-56f918da2b5f-kube-api-access-7xppp\") pod \"7e6555ab-3f72-4b8f-a2d6-56f918da2b5f\" (UID: \"7e6555ab-3f72-4b8f-a2d6-56f918da2b5f\") " Jan 31 09:25:51 crc kubenswrapper[4732]: I0131 09:25:51.498450 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e6555ab-3f72-4b8f-a2d6-56f918da2b5f-utilities\") pod \"7e6555ab-3f72-4b8f-a2d6-56f918da2b5f\" (UID: \"7e6555ab-3f72-4b8f-a2d6-56f918da2b5f\") " Jan 31 09:25:51 crc kubenswrapper[4732]: I0131 09:25:51.499889 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e6555ab-3f72-4b8f-a2d6-56f918da2b5f-utilities" (OuterVolumeSpecName: "utilities") pod "7e6555ab-3f72-4b8f-a2d6-56f918da2b5f" (UID: "7e6555ab-3f72-4b8f-a2d6-56f918da2b5f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:25:51 crc kubenswrapper[4732]: I0131 09:25:51.506789 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e6555ab-3f72-4b8f-a2d6-56f918da2b5f-kube-api-access-7xppp" (OuterVolumeSpecName: "kube-api-access-7xppp") pod "7e6555ab-3f72-4b8f-a2d6-56f918da2b5f" (UID: "7e6555ab-3f72-4b8f-a2d6-56f918da2b5f"). InnerVolumeSpecName "kube-api-access-7xppp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:25:51 crc kubenswrapper[4732]: I0131 09:25:51.518629 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e6555ab-3f72-4b8f-a2d6-56f918da2b5f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7e6555ab-3f72-4b8f-a2d6-56f918da2b5f" (UID: "7e6555ab-3f72-4b8f-a2d6-56f918da2b5f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:25:51 crc kubenswrapper[4732]: I0131 09:25:51.600367 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e6555ab-3f72-4b8f-a2d6-56f918da2b5f-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:51 crc kubenswrapper[4732]: I0131 09:25:51.600425 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e6555ab-3f72-4b8f-a2d6-56f918da2b5f-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:51 crc kubenswrapper[4732]: I0131 09:25:51.600441 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7xppp\" (UniqueName: \"kubernetes.io/projected/7e6555ab-3f72-4b8f-a2d6-56f918da2b5f-kube-api-access-7xppp\") on node \"crc\" DevicePath \"\"" Jan 31 09:25:51 crc kubenswrapper[4732]: I0131 09:25:51.829141 4732 generic.go:334] "Generic (PLEG): container finished" podID="7e6555ab-3f72-4b8f-a2d6-56f918da2b5f" containerID="6cd3e6a33b07ecf405db08db769e081eb479c7bf7936a8343db50d45e05dd007" exitCode=0 Jan 31 09:25:51 crc kubenswrapper[4732]: I0131 09:25:51.829196 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nntc6" event={"ID":"7e6555ab-3f72-4b8f-a2d6-56f918da2b5f","Type":"ContainerDied","Data":"6cd3e6a33b07ecf405db08db769e081eb479c7bf7936a8343db50d45e05dd007"} Jan 31 09:25:51 crc kubenswrapper[4732]: I0131 09:25:51.829225 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nntc6" event={"ID":"7e6555ab-3f72-4b8f-a2d6-56f918da2b5f","Type":"ContainerDied","Data":"59a992697fbb9e4ba41759d7f9425dc054a231d68556433c6fe3cf106dbc6b83"} Jan 31 09:25:51 crc kubenswrapper[4732]: I0131 09:25:51.829245 4732 scope.go:117] "RemoveContainer" containerID="6cd3e6a33b07ecf405db08db769e081eb479c7bf7936a8343db50d45e05dd007" Jan 31 09:25:51 crc kubenswrapper[4732]: I0131 09:25:51.829399 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nntc6" Jan 31 09:25:51 crc kubenswrapper[4732]: I0131 09:25:51.869759 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nntc6"] Jan 31 09:25:51 crc kubenswrapper[4732]: I0131 09:25:51.876652 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-nntc6"] Jan 31 09:25:51 crc kubenswrapper[4732]: I0131 09:25:51.892046 4732 scope.go:117] "RemoveContainer" containerID="1b61c8f45b6d0b67474b1d360231736bc021a78f046b98c44defe0796d04c620" Jan 31 09:25:51 crc kubenswrapper[4732]: I0131 09:25:51.913192 4732 scope.go:117] "RemoveContainer" containerID="18db5903c8c2eb2a0f15e145c4028d64509fbc52c15e7320b0f4849da6f6b657" Jan 31 09:25:51 crc kubenswrapper[4732]: I0131 09:25:51.939962 4732 scope.go:117] "RemoveContainer" containerID="6cd3e6a33b07ecf405db08db769e081eb479c7bf7936a8343db50d45e05dd007" Jan 31 09:25:51 crc kubenswrapper[4732]: E0131 09:25:51.940651 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6cd3e6a33b07ecf405db08db769e081eb479c7bf7936a8343db50d45e05dd007\": container with ID starting with 6cd3e6a33b07ecf405db08db769e081eb479c7bf7936a8343db50d45e05dd007 not found: ID does not exist" containerID="6cd3e6a33b07ecf405db08db769e081eb479c7bf7936a8343db50d45e05dd007" Jan 31 09:25:51 crc kubenswrapper[4732]: I0131 09:25:51.940728 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6cd3e6a33b07ecf405db08db769e081eb479c7bf7936a8343db50d45e05dd007"} err="failed to get container status \"6cd3e6a33b07ecf405db08db769e081eb479c7bf7936a8343db50d45e05dd007\": rpc error: code = NotFound desc = could not find container \"6cd3e6a33b07ecf405db08db769e081eb479c7bf7936a8343db50d45e05dd007\": container with ID starting with 6cd3e6a33b07ecf405db08db769e081eb479c7bf7936a8343db50d45e05dd007 not found: ID does not exist" Jan 31 09:25:51 crc kubenswrapper[4732]: I0131 09:25:51.940777 4732 scope.go:117] "RemoveContainer" containerID="1b61c8f45b6d0b67474b1d360231736bc021a78f046b98c44defe0796d04c620" Jan 31 09:25:51 crc kubenswrapper[4732]: E0131 09:25:51.942912 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b61c8f45b6d0b67474b1d360231736bc021a78f046b98c44defe0796d04c620\": container with ID starting with 1b61c8f45b6d0b67474b1d360231736bc021a78f046b98c44defe0796d04c620 not found: ID does not exist" containerID="1b61c8f45b6d0b67474b1d360231736bc021a78f046b98c44defe0796d04c620" Jan 31 09:25:51 crc kubenswrapper[4732]: I0131 09:25:51.942970 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b61c8f45b6d0b67474b1d360231736bc021a78f046b98c44defe0796d04c620"} err="failed to get container status \"1b61c8f45b6d0b67474b1d360231736bc021a78f046b98c44defe0796d04c620\": rpc error: code = NotFound desc = could not find container \"1b61c8f45b6d0b67474b1d360231736bc021a78f046b98c44defe0796d04c620\": container with ID starting with 1b61c8f45b6d0b67474b1d360231736bc021a78f046b98c44defe0796d04c620 not found: ID does not exist" Jan 31 09:25:51 crc kubenswrapper[4732]: I0131 09:25:51.943012 4732 scope.go:117] "RemoveContainer" containerID="18db5903c8c2eb2a0f15e145c4028d64509fbc52c15e7320b0f4849da6f6b657" Jan 31 09:25:51 crc kubenswrapper[4732]: E0131 09:25:51.943410 4732 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"18db5903c8c2eb2a0f15e145c4028d64509fbc52c15e7320b0f4849da6f6b657\": container with ID starting with 18db5903c8c2eb2a0f15e145c4028d64509fbc52c15e7320b0f4849da6f6b657 not found: ID does not exist" containerID="18db5903c8c2eb2a0f15e145c4028d64509fbc52c15e7320b0f4849da6f6b657" Jan 31 09:25:51 crc kubenswrapper[4732]: I0131 09:25:51.943442 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18db5903c8c2eb2a0f15e145c4028d64509fbc52c15e7320b0f4849da6f6b657"} err="failed to get container status \"18db5903c8c2eb2a0f15e145c4028d64509fbc52c15e7320b0f4849da6f6b657\": rpc error: code = NotFound desc = could not find container \"18db5903c8c2eb2a0f15e145c4028d64509fbc52c15e7320b0f4849da6f6b657\": container with ID starting with 18db5903c8c2eb2a0f15e145c4028d64509fbc52c15e7320b0f4849da6f6b657 not found: ID does not exist" Jan 31 09:25:52 crc kubenswrapper[4732]: I0131 09:25:52.551489 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e6555ab-3f72-4b8f-a2d6-56f918da2b5f" path="/var/lib/kubelet/pods/7e6555ab-3f72-4b8f-a2d6-56f918da2b5f/volumes" Jan 31 09:26:05 crc kubenswrapper[4732]: I0131 09:26:05.404937 4732 scope.go:117] "RemoveContainer" containerID="2bf5f760920241f10f11d255f66394a5366cad9af6f5cf262601f44403aa7939" Jan 31 09:26:07 crc kubenswrapper[4732]: I0131 09:26:07.816021 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-656dg"] Jan 31 09:26:07 crc kubenswrapper[4732]: E0131 09:26:07.816276 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e6555ab-3f72-4b8f-a2d6-56f918da2b5f" containerName="extract-utilities" Jan 31 09:26:07 crc kubenswrapper[4732]: I0131 09:26:07.816292 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e6555ab-3f72-4b8f-a2d6-56f918da2b5f" containerName="extract-utilities" Jan 31 09:26:07 crc kubenswrapper[4732]: E0131 09:26:07.816325 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e6555ab-3f72-4b8f-a2d6-56f918da2b5f" containerName="registry-server" Jan 31 09:26:07 crc kubenswrapper[4732]: I0131 09:26:07.816333 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e6555ab-3f72-4b8f-a2d6-56f918da2b5f" containerName="registry-server" Jan 31 09:26:07 crc kubenswrapper[4732]: E0131 09:26:07.816346 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e6555ab-3f72-4b8f-a2d6-56f918da2b5f" containerName="extract-content" Jan 31 09:26:07 crc kubenswrapper[4732]: I0131 09:26:07.816354 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e6555ab-3f72-4b8f-a2d6-56f918da2b5f" containerName="extract-content" Jan 31 09:26:07 crc kubenswrapper[4732]: I0131 09:26:07.816473 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e6555ab-3f72-4b8f-a2d6-56f918da2b5f" containerName="registry-server" Jan 31 09:26:07 crc kubenswrapper[4732]: I0131 09:26:07.817608 4732 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-656dg" Jan 31 09:26:07 crc kubenswrapper[4732]: I0131 09:26:07.830559 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-656dg"] Jan 31 09:26:07 crc kubenswrapper[4732]: I0131 09:26:07.909911 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e12f87a-681b-4268-a759-5a0043ce9b74-utilities\") pod \"redhat-operators-656dg\" (UID: \"2e12f87a-681b-4268-a759-5a0043ce9b74\") " pod="openshift-marketplace/redhat-operators-656dg" Jan 31 09:26:07 crc kubenswrapper[4732]: I0131 09:26:07.910027 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ll2cw\" (UniqueName: \"kubernetes.io/projected/2e12f87a-681b-4268-a759-5a0043ce9b74-kube-api-access-ll2cw\") pod \"redhat-operators-656dg\" (UID: \"2e12f87a-681b-4268-a759-5a0043ce9b74\") " pod="openshift-marketplace/redhat-operators-656dg" Jan 31 09:26:07 crc kubenswrapper[4732]: I0131 09:26:07.910122 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e12f87a-681b-4268-a759-5a0043ce9b74-catalog-content\") pod \"redhat-operators-656dg\" (UID: \"2e12f87a-681b-4268-a759-5a0043ce9b74\") " pod="openshift-marketplace/redhat-operators-656dg" Jan 31 09:26:08 crc kubenswrapper[4732]: I0131 09:26:08.010890 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e12f87a-681b-4268-a759-5a0043ce9b74-utilities\") pod \"redhat-operators-656dg\" (UID: \"2e12f87a-681b-4268-a759-5a0043ce9b74\") " pod="openshift-marketplace/redhat-operators-656dg" Jan 31 09:26:08 crc kubenswrapper[4732]: I0131 09:26:08.010946 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ll2cw\" (UniqueName: \"kubernetes.io/projected/2e12f87a-681b-4268-a759-5a0043ce9b74-kube-api-access-ll2cw\") pod \"redhat-operators-656dg\" (UID: \"2e12f87a-681b-4268-a759-5a0043ce9b74\") " pod="openshift-marketplace/redhat-operators-656dg" Jan 31 09:26:08 crc kubenswrapper[4732]: I0131 09:26:08.010984 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e12f87a-681b-4268-a759-5a0043ce9b74-catalog-content\") pod \"redhat-operators-656dg\" (UID: \"2e12f87a-681b-4268-a759-5a0043ce9b74\") " pod="openshift-marketplace/redhat-operators-656dg" Jan 31 09:26:08 crc kubenswrapper[4732]: I0131 09:26:08.011559 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e12f87a-681b-4268-a759-5a0043ce9b74-catalog-content\") pod \"redhat-operators-656dg\" (UID: \"2e12f87a-681b-4268-a759-5a0043ce9b74\") " pod="openshift-marketplace/redhat-operators-656dg" Jan 31 09:26:08 crc kubenswrapper[4732]: I0131 09:26:08.011571 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e12f87a-681b-4268-a759-5a0043ce9b74-utilities\") pod \"redhat-operators-656dg\" (UID: \"2e12f87a-681b-4268-a759-5a0043ce9b74\") " pod="openshift-marketplace/redhat-operators-656dg" Jan 31 09:26:08 crc kubenswrapper[4732]: I0131 09:26:08.036054 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-ll2cw\" (UniqueName: \"kubernetes.io/projected/2e12f87a-681b-4268-a759-5a0043ce9b74-kube-api-access-ll2cw\") pod \"redhat-operators-656dg\" (UID: \"2e12f87a-681b-4268-a759-5a0043ce9b74\") " pod="openshift-marketplace/redhat-operators-656dg" Jan 31 09:26:08 crc kubenswrapper[4732]: I0131 09:26:08.136117 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-656dg" Jan 31 09:26:08 crc kubenswrapper[4732]: I0131 09:26:08.328297 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-656dg"] Jan 31 09:26:08 crc kubenswrapper[4732]: I0131 09:26:08.937566 4732 generic.go:334] "Generic (PLEG): container finished" podID="2e12f87a-681b-4268-a759-5a0043ce9b74" containerID="c66b5574a75190dfa43ca1b2cdb9dfc8ae21408d31afedb9d3fc37979ae65b8e" exitCode=0 Jan 31 09:26:08 crc kubenswrapper[4732]: I0131 09:26:08.937616 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-656dg" event={"ID":"2e12f87a-681b-4268-a759-5a0043ce9b74","Type":"ContainerDied","Data":"c66b5574a75190dfa43ca1b2cdb9dfc8ae21408d31afedb9d3fc37979ae65b8e"} Jan 31 09:26:08 crc kubenswrapper[4732]: I0131 09:26:08.937644 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-656dg" event={"ID":"2e12f87a-681b-4268-a759-5a0043ce9b74","Type":"ContainerStarted","Data":"29b157830fbef40d781d40bed5e3826609c906a763e5b72b90cee5d6d8cea24c"} Jan 31 09:26:09 crc kubenswrapper[4732]: I0131 09:26:09.944838 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-656dg" event={"ID":"2e12f87a-681b-4268-a759-5a0043ce9b74","Type":"ContainerStarted","Data":"7bc43bc96f778289060427cb473897dee11bab3838f8edfd6f142f8bdc222057"} Jan 31 09:26:10 crc kubenswrapper[4732]: I0131 09:26:10.957102 4732 generic.go:334] "Generic (PLEG): container finished" podID="2e12f87a-681b-4268-a759-5a0043ce9b74" containerID="7bc43bc96f778289060427cb473897dee11bab3838f8edfd6f142f8bdc222057" exitCode=0 Jan 31 09:26:10 crc kubenswrapper[4732]: I0131 09:26:10.957555 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-656dg" event={"ID":"2e12f87a-681b-4268-a759-5a0043ce9b74","Type":"ContainerDied","Data":"7bc43bc96f778289060427cb473897dee11bab3838f8edfd6f142f8bdc222057"} Jan 31 09:26:11 crc kubenswrapper[4732]: I0131 09:26:11.968464 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-656dg" event={"ID":"2e12f87a-681b-4268-a759-5a0043ce9b74","Type":"ContainerStarted","Data":"d895ab26fd561f11dab8cd4d8001e029f55ee41384747dfd0582712824be9183"} Jan 31 09:26:11 crc kubenswrapper[4732]: I0131 09:26:11.996987 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-656dg" podStartSLOduration=2.4980721790000002 podStartE2EDuration="4.996968407s" podCreationTimestamp="2026-01-31 09:26:07 +0000 UTC" firstStartedPulling="2026-01-31 09:26:08.939425417 +0000 UTC m=+1507.245301631" lastFinishedPulling="2026-01-31 09:26:11.438321665 +0000 UTC m=+1509.744197859" observedRunningTime="2026-01-31 09:26:11.989679841 +0000 UTC m=+1510.295556045" watchObservedRunningTime="2026-01-31 09:26:11.996968407 +0000 UTC m=+1510.302844611" Jan 31 09:26:17 crc kubenswrapper[4732]: I0131 09:26:17.497784 4732 patch_prober.go:28] interesting pod/machine-config-daemon-jnbt8 container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 09:26:17 crc kubenswrapper[4732]: I0131 09:26:17.498187 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" podUID="7d790207-d357-4b47-87bf-5b505e061820" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 09:26:17 crc kubenswrapper[4732]: I0131 09:26:17.498255 4732 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" Jan 31 09:26:17 crc kubenswrapper[4732]: I0131 09:26:17.499106 4732 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3dcf08eedc212c9c2a071b251d94a53241ef18f3503ef5e5c6a6b83f842c1e77"} pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 31 09:26:17 crc kubenswrapper[4732]: I0131 09:26:17.499218 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" podUID="7d790207-d357-4b47-87bf-5b505e061820" containerName="machine-config-daemon" containerID="cri-o://3dcf08eedc212c9c2a071b251d94a53241ef18f3503ef5e5c6a6b83f842c1e77" gracePeriod=600 Jan 31 09:26:18 crc kubenswrapper[4732]: I0131 09:26:18.136643 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-656dg" Jan 31 09:26:18 crc kubenswrapper[4732]: I0131 09:26:18.136716 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-656dg" Jan 31 09:26:18 crc kubenswrapper[4732]: I0131 09:26:18.204890 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-656dg" Jan 31 09:26:18 crc kubenswrapper[4732]: E0131 09:26:18.867836 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnbt8_openshift-machine-config-operator(7d790207-d357-4b47-87bf-5b505e061820)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" podUID="7d790207-d357-4b47-87bf-5b505e061820" Jan 31 09:26:19 crc kubenswrapper[4732]: I0131 09:26:19.012630 4732 generic.go:334] "Generic (PLEG): container finished" podID="7d790207-d357-4b47-87bf-5b505e061820" containerID="3dcf08eedc212c9c2a071b251d94a53241ef18f3503ef5e5c6a6b83f842c1e77" exitCode=0 Jan 31 09:26:19 crc kubenswrapper[4732]: I0131 09:26:19.012733 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" event={"ID":"7d790207-d357-4b47-87bf-5b505e061820","Type":"ContainerDied","Data":"3dcf08eedc212c9c2a071b251d94a53241ef18f3503ef5e5c6a6b83f842c1e77"} Jan 31 09:26:19 crc kubenswrapper[4732]: I0131 09:26:19.012822 4732 scope.go:117] "RemoveContainer" containerID="777b6bb11b5556f90e1c2a08822928a50217112bcd9efce47de0d5e1a98e3392" Jan 31 09:26:19 crc kubenswrapper[4732]: I0131 09:26:19.013792 4732 scope.go:117] "RemoveContainer" 
containerID="3dcf08eedc212c9c2a071b251d94a53241ef18f3503ef5e5c6a6b83f842c1e77" Jan 31 09:26:19 crc kubenswrapper[4732]: E0131 09:26:19.014171 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnbt8_openshift-machine-config-operator(7d790207-d357-4b47-87bf-5b505e061820)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" podUID="7d790207-d357-4b47-87bf-5b505e061820" Jan 31 09:26:19 crc kubenswrapper[4732]: I0131 09:26:19.082430 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-656dg" Jan 31 09:26:19 crc kubenswrapper[4732]: I0131 09:26:19.122516 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-656dg"] Jan 31 09:26:21 crc kubenswrapper[4732]: I0131 09:26:21.025676 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-656dg" podUID="2e12f87a-681b-4268-a759-5a0043ce9b74" containerName="registry-server" containerID="cri-o://d895ab26fd561f11dab8cd4d8001e029f55ee41384747dfd0582712824be9183" gracePeriod=2 Jan 31 09:26:21 crc kubenswrapper[4732]: I0131 09:26:21.271513 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xt2x6"] Jan 31 09:26:21 crc kubenswrapper[4732]: I0131 09:26:21.272538 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xt2x6" Jan 31 09:26:21 crc kubenswrapper[4732]: I0131 09:26:21.293777 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xt2x6"] Jan 31 09:26:21 crc kubenswrapper[4732]: I0131 09:26:21.294763 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9264cbfe-6b17-491e-8999-3a70e0198ca1-utilities\") pod \"community-operators-xt2x6\" (UID: \"9264cbfe-6b17-491e-8999-3a70e0198ca1\") " pod="openshift-marketplace/community-operators-xt2x6" Jan 31 09:26:21 crc kubenswrapper[4732]: I0131 09:26:21.294855 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zq5d4\" (UniqueName: \"kubernetes.io/projected/9264cbfe-6b17-491e-8999-3a70e0198ca1-kube-api-access-zq5d4\") pod \"community-operators-xt2x6\" (UID: \"9264cbfe-6b17-491e-8999-3a70e0198ca1\") " pod="openshift-marketplace/community-operators-xt2x6" Jan 31 09:26:21 crc kubenswrapper[4732]: I0131 09:26:21.294886 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9264cbfe-6b17-491e-8999-3a70e0198ca1-catalog-content\") pod \"community-operators-xt2x6\" (UID: \"9264cbfe-6b17-491e-8999-3a70e0198ca1\") " pod="openshift-marketplace/community-operators-xt2x6" Jan 31 09:26:21 crc kubenswrapper[4732]: I0131 09:26:21.396202 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zq5d4\" (UniqueName: \"kubernetes.io/projected/9264cbfe-6b17-491e-8999-3a70e0198ca1-kube-api-access-zq5d4\") pod \"community-operators-xt2x6\" (UID: \"9264cbfe-6b17-491e-8999-3a70e0198ca1\") " pod="openshift-marketplace/community-operators-xt2x6" Jan 31 09:26:21 crc kubenswrapper[4732]: I0131 09:26:21.396256 
4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9264cbfe-6b17-491e-8999-3a70e0198ca1-catalog-content\") pod \"community-operators-xt2x6\" (UID: \"9264cbfe-6b17-491e-8999-3a70e0198ca1\") " pod="openshift-marketplace/community-operators-xt2x6" Jan 31 09:26:21 crc kubenswrapper[4732]: I0131 09:26:21.396510 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9264cbfe-6b17-491e-8999-3a70e0198ca1-utilities\") pod \"community-operators-xt2x6\" (UID: \"9264cbfe-6b17-491e-8999-3a70e0198ca1\") " pod="openshift-marketplace/community-operators-xt2x6" Jan 31 09:26:21 crc kubenswrapper[4732]: I0131 09:26:21.396640 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9264cbfe-6b17-491e-8999-3a70e0198ca1-catalog-content\") pod \"community-operators-xt2x6\" (UID: \"9264cbfe-6b17-491e-8999-3a70e0198ca1\") " pod="openshift-marketplace/community-operators-xt2x6" Jan 31 09:26:21 crc kubenswrapper[4732]: I0131 09:26:21.396984 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9264cbfe-6b17-491e-8999-3a70e0198ca1-utilities\") pod \"community-operators-xt2x6\" (UID: \"9264cbfe-6b17-491e-8999-3a70e0198ca1\") " pod="openshift-marketplace/community-operators-xt2x6" Jan 31 09:26:21 crc kubenswrapper[4732]: I0131 09:26:21.405736 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-656dg" Jan 31 09:26:21 crc kubenswrapper[4732]: I0131 09:26:21.420312 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zq5d4\" (UniqueName: \"kubernetes.io/projected/9264cbfe-6b17-491e-8999-3a70e0198ca1-kube-api-access-zq5d4\") pod \"community-operators-xt2x6\" (UID: \"9264cbfe-6b17-491e-8999-3a70e0198ca1\") " pod="openshift-marketplace/community-operators-xt2x6" Jan 31 09:26:21 crc kubenswrapper[4732]: I0131 09:26:21.598198 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e12f87a-681b-4268-a759-5a0043ce9b74-catalog-content\") pod \"2e12f87a-681b-4268-a759-5a0043ce9b74\" (UID: \"2e12f87a-681b-4268-a759-5a0043ce9b74\") " Jan 31 09:26:21 crc kubenswrapper[4732]: I0131 09:26:21.598711 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e12f87a-681b-4268-a759-5a0043ce9b74-utilities\") pod \"2e12f87a-681b-4268-a759-5a0043ce9b74\" (UID: \"2e12f87a-681b-4268-a759-5a0043ce9b74\") " Jan 31 09:26:21 crc kubenswrapper[4732]: I0131 09:26:21.598816 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ll2cw\" (UniqueName: \"kubernetes.io/projected/2e12f87a-681b-4268-a759-5a0043ce9b74-kube-api-access-ll2cw\") pod \"2e12f87a-681b-4268-a759-5a0043ce9b74\" (UID: \"2e12f87a-681b-4268-a759-5a0043ce9b74\") " Jan 31 09:26:21 crc kubenswrapper[4732]: I0131 09:26:21.599349 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e12f87a-681b-4268-a759-5a0043ce9b74-utilities" (OuterVolumeSpecName: "utilities") pod "2e12f87a-681b-4268-a759-5a0043ce9b74" (UID: "2e12f87a-681b-4268-a759-5a0043ce9b74"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:26:21 crc kubenswrapper[4732]: I0131 09:26:21.599956 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e12f87a-681b-4268-a759-5a0043ce9b74-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 09:26:21 crc kubenswrapper[4732]: I0131 09:26:21.601548 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e12f87a-681b-4268-a759-5a0043ce9b74-kube-api-access-ll2cw" (OuterVolumeSpecName: "kube-api-access-ll2cw") pod "2e12f87a-681b-4268-a759-5a0043ce9b74" (UID: "2e12f87a-681b-4268-a759-5a0043ce9b74"). InnerVolumeSpecName "kube-api-access-ll2cw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:26:21 crc kubenswrapper[4732]: I0131 09:26:21.616782 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xt2x6" Jan 31 09:26:21 crc kubenswrapper[4732]: I0131 09:26:21.702125 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ll2cw\" (UniqueName: \"kubernetes.io/projected/2e12f87a-681b-4268-a759-5a0043ce9b74-kube-api-access-ll2cw\") on node \"crc\" DevicePath \"\"" Jan 31 09:26:21 crc kubenswrapper[4732]: I0131 09:26:21.721141 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e12f87a-681b-4268-a759-5a0043ce9b74-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2e12f87a-681b-4268-a759-5a0043ce9b74" (UID: "2e12f87a-681b-4268-a759-5a0043ce9b74"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:26:21 crc kubenswrapper[4732]: I0131 09:26:21.806310 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e12f87a-681b-4268-a759-5a0043ce9b74-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 09:26:21 crc kubenswrapper[4732]: I0131 09:26:21.859473 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xt2x6"] Jan 31 09:26:22 crc kubenswrapper[4732]: I0131 09:26:22.035854 4732 generic.go:334] "Generic (PLEG): container finished" podID="2e12f87a-681b-4268-a759-5a0043ce9b74" containerID="d895ab26fd561f11dab8cd4d8001e029f55ee41384747dfd0582712824be9183" exitCode=0 Jan 31 09:26:22 crc kubenswrapper[4732]: I0131 09:26:22.035917 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-656dg" Jan 31 09:26:22 crc kubenswrapper[4732]: I0131 09:26:22.035906 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-656dg" event={"ID":"2e12f87a-681b-4268-a759-5a0043ce9b74","Type":"ContainerDied","Data":"d895ab26fd561f11dab8cd4d8001e029f55ee41384747dfd0582712824be9183"} Jan 31 09:26:22 crc kubenswrapper[4732]: I0131 09:26:22.036520 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-656dg" event={"ID":"2e12f87a-681b-4268-a759-5a0043ce9b74","Type":"ContainerDied","Data":"29b157830fbef40d781d40bed5e3826609c906a763e5b72b90cee5d6d8cea24c"} Jan 31 09:26:22 crc kubenswrapper[4732]: I0131 09:26:22.036546 4732 scope.go:117] "RemoveContainer" containerID="d895ab26fd561f11dab8cd4d8001e029f55ee41384747dfd0582712824be9183" Jan 31 09:26:22 crc kubenswrapper[4732]: I0131 09:26:22.041025 4732 generic.go:334] "Generic (PLEG): container finished" podID="9264cbfe-6b17-491e-8999-3a70e0198ca1" containerID="5b7a52f67c8b8f70cc049c374dda98d143759e342ba44ccaaf714b2ba48aa856" exitCode=0 Jan 31 09:26:22 crc kubenswrapper[4732]: I0131 09:26:22.041063 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xt2x6" event={"ID":"9264cbfe-6b17-491e-8999-3a70e0198ca1","Type":"ContainerDied","Data":"5b7a52f67c8b8f70cc049c374dda98d143759e342ba44ccaaf714b2ba48aa856"} Jan 31 09:26:22 crc kubenswrapper[4732]: I0131 09:26:22.041098 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xt2x6" event={"ID":"9264cbfe-6b17-491e-8999-3a70e0198ca1","Type":"ContainerStarted","Data":"3f6b4b05eaf571ca3a51b640a5f9c2f8c713991b9757e8306c7e848d7ac06420"} Jan 31 09:26:22 crc kubenswrapper[4732]: I0131 09:26:22.101371 4732 scope.go:117] "RemoveContainer" containerID="7bc43bc96f778289060427cb473897dee11bab3838f8edfd6f142f8bdc222057" Jan 31 09:26:22 crc kubenswrapper[4732]: I0131 09:26:22.104719 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-656dg"] Jan 31 09:26:22 crc kubenswrapper[4732]: I0131 09:26:22.108203 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-656dg"] Jan 31 09:26:22 crc kubenswrapper[4732]: I0131 09:26:22.129886 4732 scope.go:117] "RemoveContainer" containerID="c66b5574a75190dfa43ca1b2cdb9dfc8ae21408d31afedb9d3fc37979ae65b8e" Jan 31 09:26:22 crc kubenswrapper[4732]: I0131 09:26:22.161847 4732 scope.go:117] "RemoveContainer" containerID="d895ab26fd561f11dab8cd4d8001e029f55ee41384747dfd0582712824be9183" Jan 31 09:26:22 crc kubenswrapper[4732]: E0131 09:26:22.162235 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d895ab26fd561f11dab8cd4d8001e029f55ee41384747dfd0582712824be9183\": container with ID starting with d895ab26fd561f11dab8cd4d8001e029f55ee41384747dfd0582712824be9183 not found: ID does not exist" containerID="d895ab26fd561f11dab8cd4d8001e029f55ee41384747dfd0582712824be9183" Jan 31 09:26:22 crc kubenswrapper[4732]: I0131 09:26:22.162269 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d895ab26fd561f11dab8cd4d8001e029f55ee41384747dfd0582712824be9183"} err="failed to get container status \"d895ab26fd561f11dab8cd4d8001e029f55ee41384747dfd0582712824be9183\": rpc error: code = NotFound desc = could not find container 
\"d895ab26fd561f11dab8cd4d8001e029f55ee41384747dfd0582712824be9183\": container with ID starting with d895ab26fd561f11dab8cd4d8001e029f55ee41384747dfd0582712824be9183 not found: ID does not exist" Jan 31 09:26:22 crc kubenswrapper[4732]: I0131 09:26:22.162294 4732 scope.go:117] "RemoveContainer" containerID="7bc43bc96f778289060427cb473897dee11bab3838f8edfd6f142f8bdc222057" Jan 31 09:26:22 crc kubenswrapper[4732]: E0131 09:26:22.162481 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7bc43bc96f778289060427cb473897dee11bab3838f8edfd6f142f8bdc222057\": container with ID starting with 7bc43bc96f778289060427cb473897dee11bab3838f8edfd6f142f8bdc222057 not found: ID does not exist" containerID="7bc43bc96f778289060427cb473897dee11bab3838f8edfd6f142f8bdc222057" Jan 31 09:26:22 crc kubenswrapper[4732]: I0131 09:26:22.162509 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bc43bc96f778289060427cb473897dee11bab3838f8edfd6f142f8bdc222057"} err="failed to get container status \"7bc43bc96f778289060427cb473897dee11bab3838f8edfd6f142f8bdc222057\": rpc error: code = NotFound desc = could not find container \"7bc43bc96f778289060427cb473897dee11bab3838f8edfd6f142f8bdc222057\": container with ID starting with 7bc43bc96f778289060427cb473897dee11bab3838f8edfd6f142f8bdc222057 not found: ID does not exist" Jan 31 09:26:22 crc kubenswrapper[4732]: I0131 09:26:22.162525 4732 scope.go:117] "RemoveContainer" containerID="c66b5574a75190dfa43ca1b2cdb9dfc8ae21408d31afedb9d3fc37979ae65b8e" Jan 31 09:26:22 crc kubenswrapper[4732]: E0131 09:26:22.162826 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c66b5574a75190dfa43ca1b2cdb9dfc8ae21408d31afedb9d3fc37979ae65b8e\": container with ID starting with c66b5574a75190dfa43ca1b2cdb9dfc8ae21408d31afedb9d3fc37979ae65b8e not found: ID does not exist" containerID="c66b5574a75190dfa43ca1b2cdb9dfc8ae21408d31afedb9d3fc37979ae65b8e" Jan 31 09:26:22 crc kubenswrapper[4732]: I0131 09:26:22.162853 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c66b5574a75190dfa43ca1b2cdb9dfc8ae21408d31afedb9d3fc37979ae65b8e"} err="failed to get container status \"c66b5574a75190dfa43ca1b2cdb9dfc8ae21408d31afedb9d3fc37979ae65b8e\": rpc error: code = NotFound desc = could not find container \"c66b5574a75190dfa43ca1b2cdb9dfc8ae21408d31afedb9d3fc37979ae65b8e\": container with ID starting with c66b5574a75190dfa43ca1b2cdb9dfc8ae21408d31afedb9d3fc37979ae65b8e not found: ID does not exist" Jan 31 09:26:22 crc kubenswrapper[4732]: I0131 09:26:22.549823 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e12f87a-681b-4268-a759-5a0043ce9b74" path="/var/lib/kubelet/pods/2e12f87a-681b-4268-a759-5a0043ce9b74/volumes" Jan 31 09:26:23 crc kubenswrapper[4732]: I0131 09:26:23.051548 4732 generic.go:334] "Generic (PLEG): container finished" podID="9264cbfe-6b17-491e-8999-3a70e0198ca1" containerID="05fbbfd604512929fe0e74e29581f9d3ad75ec050effbbc8ae7081eb80ec46ca" exitCode=0 Jan 31 09:26:23 crc kubenswrapper[4732]: I0131 09:26:23.051932 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xt2x6" event={"ID":"9264cbfe-6b17-491e-8999-3a70e0198ca1","Type":"ContainerDied","Data":"05fbbfd604512929fe0e74e29581f9d3ad75ec050effbbc8ae7081eb80ec46ca"} Jan 31 09:26:24 crc kubenswrapper[4732]: I0131 
09:26:24.059677 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xt2x6" event={"ID":"9264cbfe-6b17-491e-8999-3a70e0198ca1","Type":"ContainerStarted","Data":"f97d2ae7ea12909284eb1b47f90cd95044a086c0e55c39c0f06f67810e91494e"} Jan 31 09:26:24 crc kubenswrapper[4732]: I0131 09:26:24.078744 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xt2x6" podStartSLOduration=1.702036823 podStartE2EDuration="3.078722764s" podCreationTimestamp="2026-01-31 09:26:21 +0000 UTC" firstStartedPulling="2026-01-31 09:26:22.042654752 +0000 UTC m=+1520.348530956" lastFinishedPulling="2026-01-31 09:26:23.419340693 +0000 UTC m=+1521.725216897" observedRunningTime="2026-01-31 09:26:24.07600767 +0000 UTC m=+1522.381883874" watchObservedRunningTime="2026-01-31 09:26:24.078722764 +0000 UTC m=+1522.384598968" Jan 31 09:26:31 crc kubenswrapper[4732]: I0131 09:26:31.618048 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xt2x6" Jan 31 09:26:31 crc kubenswrapper[4732]: I0131 09:26:31.619406 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xt2x6" Jan 31 09:26:31 crc kubenswrapper[4732]: I0131 09:26:31.659975 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xt2x6" Jan 31 09:26:32 crc kubenswrapper[4732]: I0131 09:26:32.134566 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xt2x6" Jan 31 09:26:32 crc kubenswrapper[4732]: I0131 09:26:32.173830 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xt2x6"] Jan 31 09:26:32 crc kubenswrapper[4732]: I0131 09:26:32.545084 4732 scope.go:117] "RemoveContainer" containerID="3dcf08eedc212c9c2a071b251d94a53241ef18f3503ef5e5c6a6b83f842c1e77" Jan 31 09:26:32 crc kubenswrapper[4732]: E0131 09:26:32.545313 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnbt8_openshift-machine-config-operator(7d790207-d357-4b47-87bf-5b505e061820)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" podUID="7d790207-d357-4b47-87bf-5b505e061820" Jan 31 09:26:34 crc kubenswrapper[4732]: I0131 09:26:34.112521 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xt2x6" podUID="9264cbfe-6b17-491e-8999-3a70e0198ca1" containerName="registry-server" containerID="cri-o://f97d2ae7ea12909284eb1b47f90cd95044a086c0e55c39c0f06f67810e91494e" gracePeriod=2 Jan 31 09:26:35 crc kubenswrapper[4732]: I0131 09:26:35.069992 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xt2x6" Jan 31 09:26:35 crc kubenswrapper[4732]: I0131 09:26:35.120203 4732 generic.go:334] "Generic (PLEG): container finished" podID="9264cbfe-6b17-491e-8999-3a70e0198ca1" containerID="f97d2ae7ea12909284eb1b47f90cd95044a086c0e55c39c0f06f67810e91494e" exitCode=0 Jan 31 09:26:35 crc kubenswrapper[4732]: I0131 09:26:35.120247 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xt2x6" event={"ID":"9264cbfe-6b17-491e-8999-3a70e0198ca1","Type":"ContainerDied","Data":"f97d2ae7ea12909284eb1b47f90cd95044a086c0e55c39c0f06f67810e91494e"} Jan 31 09:26:35 crc kubenswrapper[4732]: I0131 09:26:35.120276 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xt2x6" event={"ID":"9264cbfe-6b17-491e-8999-3a70e0198ca1","Type":"ContainerDied","Data":"3f6b4b05eaf571ca3a51b640a5f9c2f8c713991b9757e8306c7e848d7ac06420"} Jan 31 09:26:35 crc kubenswrapper[4732]: I0131 09:26:35.120297 4732 scope.go:117] "RemoveContainer" containerID="f97d2ae7ea12909284eb1b47f90cd95044a086c0e55c39c0f06f67810e91494e" Jan 31 09:26:35 crc kubenswrapper[4732]: I0131 09:26:35.120424 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xt2x6" Jan 31 09:26:35 crc kubenswrapper[4732]: I0131 09:26:35.142234 4732 scope.go:117] "RemoveContainer" containerID="05fbbfd604512929fe0e74e29581f9d3ad75ec050effbbc8ae7081eb80ec46ca" Jan 31 09:26:35 crc kubenswrapper[4732]: I0131 09:26:35.173941 4732 scope.go:117] "RemoveContainer" containerID="5b7a52f67c8b8f70cc049c374dda98d143759e342ba44ccaaf714b2ba48aa856" Jan 31 09:26:35 crc kubenswrapper[4732]: I0131 09:26:35.190927 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9264cbfe-6b17-491e-8999-3a70e0198ca1-catalog-content\") pod \"9264cbfe-6b17-491e-8999-3a70e0198ca1\" (UID: \"9264cbfe-6b17-491e-8999-3a70e0198ca1\") " Jan 31 09:26:35 crc kubenswrapper[4732]: I0131 09:26:35.191035 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zq5d4\" (UniqueName: \"kubernetes.io/projected/9264cbfe-6b17-491e-8999-3a70e0198ca1-kube-api-access-zq5d4\") pod \"9264cbfe-6b17-491e-8999-3a70e0198ca1\" (UID: \"9264cbfe-6b17-491e-8999-3a70e0198ca1\") " Jan 31 09:26:35 crc kubenswrapper[4732]: I0131 09:26:35.191064 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9264cbfe-6b17-491e-8999-3a70e0198ca1-utilities\") pod \"9264cbfe-6b17-491e-8999-3a70e0198ca1\" (UID: \"9264cbfe-6b17-491e-8999-3a70e0198ca1\") " Jan 31 09:26:35 crc kubenswrapper[4732]: I0131 09:26:35.194202 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9264cbfe-6b17-491e-8999-3a70e0198ca1-utilities" (OuterVolumeSpecName: "utilities") pod "9264cbfe-6b17-491e-8999-3a70e0198ca1" (UID: "9264cbfe-6b17-491e-8999-3a70e0198ca1"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:26:35 crc kubenswrapper[4732]: I0131 09:26:35.196227 4732 scope.go:117] "RemoveContainer" containerID="f97d2ae7ea12909284eb1b47f90cd95044a086c0e55c39c0f06f67810e91494e" Jan 31 09:26:35 crc kubenswrapper[4732]: I0131 09:26:35.197641 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9264cbfe-6b17-491e-8999-3a70e0198ca1-kube-api-access-zq5d4" (OuterVolumeSpecName: "kube-api-access-zq5d4") pod "9264cbfe-6b17-491e-8999-3a70e0198ca1" (UID: "9264cbfe-6b17-491e-8999-3a70e0198ca1"). InnerVolumeSpecName "kube-api-access-zq5d4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:26:35 crc kubenswrapper[4732]: E0131 09:26:35.200903 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f97d2ae7ea12909284eb1b47f90cd95044a086c0e55c39c0f06f67810e91494e\": container with ID starting with f97d2ae7ea12909284eb1b47f90cd95044a086c0e55c39c0f06f67810e91494e not found: ID does not exist" containerID="f97d2ae7ea12909284eb1b47f90cd95044a086c0e55c39c0f06f67810e91494e" Jan 31 09:26:35 crc kubenswrapper[4732]: I0131 09:26:35.200949 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f97d2ae7ea12909284eb1b47f90cd95044a086c0e55c39c0f06f67810e91494e"} err="failed to get container status \"f97d2ae7ea12909284eb1b47f90cd95044a086c0e55c39c0f06f67810e91494e\": rpc error: code = NotFound desc = could not find container \"f97d2ae7ea12909284eb1b47f90cd95044a086c0e55c39c0f06f67810e91494e\": container with ID starting with f97d2ae7ea12909284eb1b47f90cd95044a086c0e55c39c0f06f67810e91494e not found: ID does not exist" Jan 31 09:26:35 crc kubenswrapper[4732]: I0131 09:26:35.200973 4732 scope.go:117] "RemoveContainer" containerID="05fbbfd604512929fe0e74e29581f9d3ad75ec050effbbc8ae7081eb80ec46ca" Jan 31 09:26:35 crc kubenswrapper[4732]: E0131 09:26:35.201282 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05fbbfd604512929fe0e74e29581f9d3ad75ec050effbbc8ae7081eb80ec46ca\": container with ID starting with 05fbbfd604512929fe0e74e29581f9d3ad75ec050effbbc8ae7081eb80ec46ca not found: ID does not exist" containerID="05fbbfd604512929fe0e74e29581f9d3ad75ec050effbbc8ae7081eb80ec46ca" Jan 31 09:26:35 crc kubenswrapper[4732]: I0131 09:26:35.201300 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05fbbfd604512929fe0e74e29581f9d3ad75ec050effbbc8ae7081eb80ec46ca"} err="failed to get container status \"05fbbfd604512929fe0e74e29581f9d3ad75ec050effbbc8ae7081eb80ec46ca\": rpc error: code = NotFound desc = could not find container \"05fbbfd604512929fe0e74e29581f9d3ad75ec050effbbc8ae7081eb80ec46ca\": container with ID starting with 05fbbfd604512929fe0e74e29581f9d3ad75ec050effbbc8ae7081eb80ec46ca not found: ID does not exist" Jan 31 09:26:35 crc kubenswrapper[4732]: I0131 09:26:35.201312 4732 scope.go:117] "RemoveContainer" containerID="5b7a52f67c8b8f70cc049c374dda98d143759e342ba44ccaaf714b2ba48aa856" Jan 31 09:26:35 crc kubenswrapper[4732]: E0131 09:26:35.201515 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b7a52f67c8b8f70cc049c374dda98d143759e342ba44ccaaf714b2ba48aa856\": container with ID starting with 5b7a52f67c8b8f70cc049c374dda98d143759e342ba44ccaaf714b2ba48aa856 not found: ID does not 
exist" containerID="5b7a52f67c8b8f70cc049c374dda98d143759e342ba44ccaaf714b2ba48aa856" Jan 31 09:26:35 crc kubenswrapper[4732]: I0131 09:26:35.201531 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b7a52f67c8b8f70cc049c374dda98d143759e342ba44ccaaf714b2ba48aa856"} err="failed to get container status \"5b7a52f67c8b8f70cc049c374dda98d143759e342ba44ccaaf714b2ba48aa856\": rpc error: code = NotFound desc = could not find container \"5b7a52f67c8b8f70cc049c374dda98d143759e342ba44ccaaf714b2ba48aa856\": container with ID starting with 5b7a52f67c8b8f70cc049c374dda98d143759e342ba44ccaaf714b2ba48aa856 not found: ID does not exist" Jan 31 09:26:35 crc kubenswrapper[4732]: I0131 09:26:35.237899 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9264cbfe-6b17-491e-8999-3a70e0198ca1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9264cbfe-6b17-491e-8999-3a70e0198ca1" (UID: "9264cbfe-6b17-491e-8999-3a70e0198ca1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:26:35 crc kubenswrapper[4732]: I0131 09:26:35.291909 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zq5d4\" (UniqueName: \"kubernetes.io/projected/9264cbfe-6b17-491e-8999-3a70e0198ca1-kube-api-access-zq5d4\") on node \"crc\" DevicePath \"\"" Jan 31 09:26:35 crc kubenswrapper[4732]: I0131 09:26:35.291938 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9264cbfe-6b17-491e-8999-3a70e0198ca1-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 09:26:35 crc kubenswrapper[4732]: I0131 09:26:35.291947 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9264cbfe-6b17-491e-8999-3a70e0198ca1-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 09:26:35 crc kubenswrapper[4732]: I0131 09:26:35.459717 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xt2x6"] Jan 31 09:26:35 crc kubenswrapper[4732]: I0131 09:26:35.464307 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xt2x6"] Jan 31 09:26:36 crc kubenswrapper[4732]: I0131 09:26:36.552605 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9264cbfe-6b17-491e-8999-3a70e0198ca1" path="/var/lib/kubelet/pods/9264cbfe-6b17-491e-8999-3a70e0198ca1/volumes" Jan 31 09:26:47 crc kubenswrapper[4732]: I0131 09:26:47.543837 4732 scope.go:117] "RemoveContainer" containerID="3dcf08eedc212c9c2a071b251d94a53241ef18f3503ef5e5c6a6b83f842c1e77" Jan 31 09:26:47 crc kubenswrapper[4732]: E0131 09:26:47.544916 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnbt8_openshift-machine-config-operator(7d790207-d357-4b47-87bf-5b505e061820)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" podUID="7d790207-d357-4b47-87bf-5b505e061820" Jan 31 09:27:02 crc kubenswrapper[4732]: I0131 09:27:02.326555 4732 generic.go:334] "Generic (PLEG): container finished" podID="efcd56a7-a326-43ac-8d3e-c1a2fc2a464f" containerID="3c1329e6bf2dfb982396045a78d5b23bda418913a472d6b57856329382998b59" exitCode=0 Jan 31 09:27:02 crc kubenswrapper[4732]: I0131 09:27:02.326631 4732 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-must-gather-4v4x4/must-gather-8vtn6" event={"ID":"efcd56a7-a326-43ac-8d3e-c1a2fc2a464f","Type":"ContainerDied","Data":"3c1329e6bf2dfb982396045a78d5b23bda418913a472d6b57856329382998b59"} Jan 31 09:27:02 crc kubenswrapper[4732]: I0131 09:27:02.327927 4732 scope.go:117] "RemoveContainer" containerID="3c1329e6bf2dfb982396045a78d5b23bda418913a472d6b57856329382998b59" Jan 31 09:27:02 crc kubenswrapper[4732]: I0131 09:27:02.511533 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-4v4x4_must-gather-8vtn6_efcd56a7-a326-43ac-8d3e-c1a2fc2a464f/gather/0.log" Jan 31 09:27:02 crc kubenswrapper[4732]: I0131 09:27:02.553620 4732 scope.go:117] "RemoveContainer" containerID="3dcf08eedc212c9c2a071b251d94a53241ef18f3503ef5e5c6a6b83f842c1e77" Jan 31 09:27:02 crc kubenswrapper[4732]: E0131 09:27:02.554415 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnbt8_openshift-machine-config-operator(7d790207-d357-4b47-87bf-5b505e061820)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" podUID="7d790207-d357-4b47-87bf-5b505e061820" Jan 31 09:27:05 crc kubenswrapper[4732]: I0131 09:27:05.478534 4732 scope.go:117] "RemoveContainer" containerID="5e80e6297ed4cb6407583fc9a3cefb0d406579b8e762ae1e494844cf4e75d919" Jan 31 09:27:05 crc kubenswrapper[4732]: I0131 09:27:05.512192 4732 scope.go:117] "RemoveContainer" containerID="6e7852a660c0fcf3225ca272b8e2af7fc11735fb24d2d14460cde6438a7beb2d" Jan 31 09:27:05 crc kubenswrapper[4732]: I0131 09:27:05.526405 4732 scope.go:117] "RemoveContainer" containerID="1afc72dc67226efea88d83cce600c96a720e1c283b758b093373dafe9a1d70f3" Jan 31 09:27:05 crc kubenswrapper[4732]: I0131 09:27:05.562296 4732 scope.go:117] "RemoveContainer" containerID="3a8ebcfcb039bb39617fcee0e053ba202cf3bccf0891372d90f2647abbe5c1ea" Jan 31 09:27:05 crc kubenswrapper[4732]: I0131 09:27:05.582889 4732 scope.go:117] "RemoveContainer" containerID="fd00da0aa47cdd0410c40f7add08c30b0950cfafc21a202b2619d006f368871b" Jan 31 09:27:05 crc kubenswrapper[4732]: I0131 09:27:05.599281 4732 scope.go:117] "RemoveContainer" containerID="655a856811f2b949bc2c94495ba00a853818330f4ec09b9aeffd1751288aff79" Jan 31 09:27:05 crc kubenswrapper[4732]: I0131 09:27:05.613755 4732 scope.go:117] "RemoveContainer" containerID="039961fef86b21e927d5ebdf1e2eb67c58775d812f1269411f3f0411c895a43f" Jan 31 09:27:05 crc kubenswrapper[4732]: I0131 09:27:05.627009 4732 scope.go:117] "RemoveContainer" containerID="8f37816afb654d1e776d0d4eec1d440d5af5c0af6bb7163bf1822d00c7129008" Jan 31 09:27:05 crc kubenswrapper[4732]: I0131 09:27:05.639521 4732 scope.go:117] "RemoveContainer" containerID="78f74f08eb824105edcd5f1f9501b3fa915219d39b763fb391f67d7c1c054292" Jan 31 09:27:05 crc kubenswrapper[4732]: I0131 09:27:05.668743 4732 scope.go:117] "RemoveContainer" containerID="186ff48385c2286a698108d59672234436dfc7dbb5bac5e777070affa544b217" Jan 31 09:27:05 crc kubenswrapper[4732]: I0131 09:27:05.686882 4732 scope.go:117] "RemoveContainer" containerID="f7bb7853880d63c47b634d75defb148b4ed41cf77de6d39f02020df2ff5b03d3" Jan 31 09:27:05 crc kubenswrapper[4732]: I0131 09:27:05.711513 4732 scope.go:117] "RemoveContainer" containerID="61d667d399369be98c22e8ccc9f3b4cb18a2a2b489b34f036b2fc88c0e71e429" Jan 31 09:27:05 crc kubenswrapper[4732]: I0131 09:27:05.731451 4732 scope.go:117] "RemoveContainer" 
containerID="29c732c7d142dfca9f679c8ac3af3f06cbb08a8c2b215575b2a3b5e1d907c9bb" Jan 31 09:27:05 crc kubenswrapper[4732]: I0131 09:27:05.757555 4732 scope.go:117] "RemoveContainer" containerID="ada6f8ad69317a784118495c96beb36d701ce7b36209a86b9a936d7cc110e3c2" Jan 31 09:27:05 crc kubenswrapper[4732]: I0131 09:27:05.775820 4732 scope.go:117] "RemoveContainer" containerID="8d252efa39cfce425e89f4670019c9177c3b0c93daee21298b31748b98d341f0" Jan 31 09:27:05 crc kubenswrapper[4732]: I0131 09:27:05.798529 4732 scope.go:117] "RemoveContainer" containerID="d31d074c83b389c8db51d7fd48db465e4e9e513ed6084f0c730eaa69444cb6c5" Jan 31 09:27:05 crc kubenswrapper[4732]: I0131 09:27:05.826441 4732 scope.go:117] "RemoveContainer" containerID="b50b39b84c6087e4699184efc06a407c325204a64a826bce844fec4c5164858e" Jan 31 09:27:05 crc kubenswrapper[4732]: I0131 09:27:05.849210 4732 scope.go:117] "RemoveContainer" containerID="420022f2e68e290b989b3165de84484bcc24a80cd43dfb94fa6e26433aff9a55" Jan 31 09:27:09 crc kubenswrapper[4732]: I0131 09:27:09.671371 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-4v4x4/must-gather-8vtn6"] Jan 31 09:27:09 crc kubenswrapper[4732]: I0131 09:27:09.672616 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-4v4x4/must-gather-8vtn6" podUID="efcd56a7-a326-43ac-8d3e-c1a2fc2a464f" containerName="copy" containerID="cri-o://f88858a19d913954a48096ea1df3d5c79e72769bf4ad9e1829233f60113d0854" gracePeriod=2 Jan 31 09:27:09 crc kubenswrapper[4732]: I0131 09:27:09.675778 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-4v4x4/must-gather-8vtn6"] Jan 31 09:27:10 crc kubenswrapper[4732]: I0131 09:27:10.041915 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-4v4x4_must-gather-8vtn6_efcd56a7-a326-43ac-8d3e-c1a2fc2a464f/copy/0.log" Jan 31 09:27:10 crc kubenswrapper[4732]: I0131 09:27:10.042730 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4v4x4/must-gather-8vtn6" Jan 31 09:27:10 crc kubenswrapper[4732]: I0131 09:27:10.208331 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/efcd56a7-a326-43ac-8d3e-c1a2fc2a464f-must-gather-output\") pod \"efcd56a7-a326-43ac-8d3e-c1a2fc2a464f\" (UID: \"efcd56a7-a326-43ac-8d3e-c1a2fc2a464f\") " Jan 31 09:27:10 crc kubenswrapper[4732]: I0131 09:27:10.208502 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bkkgd\" (UniqueName: \"kubernetes.io/projected/efcd56a7-a326-43ac-8d3e-c1a2fc2a464f-kube-api-access-bkkgd\") pod \"efcd56a7-a326-43ac-8d3e-c1a2fc2a464f\" (UID: \"efcd56a7-a326-43ac-8d3e-c1a2fc2a464f\") " Jan 31 09:27:10 crc kubenswrapper[4732]: I0131 09:27:10.223597 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efcd56a7-a326-43ac-8d3e-c1a2fc2a464f-kube-api-access-bkkgd" (OuterVolumeSpecName: "kube-api-access-bkkgd") pod "efcd56a7-a326-43ac-8d3e-c1a2fc2a464f" (UID: "efcd56a7-a326-43ac-8d3e-c1a2fc2a464f"). InnerVolumeSpecName "kube-api-access-bkkgd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 09:27:10 crc kubenswrapper[4732]: I0131 09:27:10.290100 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/efcd56a7-a326-43ac-8d3e-c1a2fc2a464f-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "efcd56a7-a326-43ac-8d3e-c1a2fc2a464f" (UID: "efcd56a7-a326-43ac-8d3e-c1a2fc2a464f"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 09:27:10 crc kubenswrapper[4732]: I0131 09:27:10.309403 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bkkgd\" (UniqueName: \"kubernetes.io/projected/efcd56a7-a326-43ac-8d3e-c1a2fc2a464f-kube-api-access-bkkgd\") on node \"crc\" DevicePath \"\"" Jan 31 09:27:10 crc kubenswrapper[4732]: I0131 09:27:10.309757 4732 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/efcd56a7-a326-43ac-8d3e-c1a2fc2a464f-must-gather-output\") on node \"crc\" DevicePath \"\"" Jan 31 09:27:10 crc kubenswrapper[4732]: I0131 09:27:10.394278 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-4v4x4_must-gather-8vtn6_efcd56a7-a326-43ac-8d3e-c1a2fc2a464f/copy/0.log" Jan 31 09:27:10 crc kubenswrapper[4732]: I0131 09:27:10.395086 4732 generic.go:334] "Generic (PLEG): container finished" podID="efcd56a7-a326-43ac-8d3e-c1a2fc2a464f" containerID="f88858a19d913954a48096ea1df3d5c79e72769bf4ad9e1829233f60113d0854" exitCode=143 Jan 31 09:27:10 crc kubenswrapper[4732]: I0131 09:27:10.395142 4732 scope.go:117] "RemoveContainer" containerID="f88858a19d913954a48096ea1df3d5c79e72769bf4ad9e1829233f60113d0854" Jan 31 09:27:10 crc kubenswrapper[4732]: I0131 09:27:10.395145 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4v4x4/must-gather-8vtn6" Jan 31 09:27:10 crc kubenswrapper[4732]: I0131 09:27:10.415901 4732 scope.go:117] "RemoveContainer" containerID="3c1329e6bf2dfb982396045a78d5b23bda418913a472d6b57856329382998b59" Jan 31 09:27:10 crc kubenswrapper[4732]: I0131 09:27:10.454795 4732 scope.go:117] "RemoveContainer" containerID="f88858a19d913954a48096ea1df3d5c79e72769bf4ad9e1829233f60113d0854" Jan 31 09:27:10 crc kubenswrapper[4732]: E0131 09:27:10.455333 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f88858a19d913954a48096ea1df3d5c79e72769bf4ad9e1829233f60113d0854\": container with ID starting with f88858a19d913954a48096ea1df3d5c79e72769bf4ad9e1829233f60113d0854 not found: ID does not exist" containerID="f88858a19d913954a48096ea1df3d5c79e72769bf4ad9e1829233f60113d0854" Jan 31 09:27:10 crc kubenswrapper[4732]: I0131 09:27:10.455381 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f88858a19d913954a48096ea1df3d5c79e72769bf4ad9e1829233f60113d0854"} err="failed to get container status \"f88858a19d913954a48096ea1df3d5c79e72769bf4ad9e1829233f60113d0854\": rpc error: code = NotFound desc = could not find container \"f88858a19d913954a48096ea1df3d5c79e72769bf4ad9e1829233f60113d0854\": container with ID starting with f88858a19d913954a48096ea1df3d5c79e72769bf4ad9e1829233f60113d0854 not found: ID does not exist" Jan 31 09:27:10 crc kubenswrapper[4732]: I0131 09:27:10.455409 4732 scope.go:117] "RemoveContainer" containerID="3c1329e6bf2dfb982396045a78d5b23bda418913a472d6b57856329382998b59" Jan 31 09:27:10 crc kubenswrapper[4732]: E0131 09:27:10.455684 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c1329e6bf2dfb982396045a78d5b23bda418913a472d6b57856329382998b59\": container with ID starting with 3c1329e6bf2dfb982396045a78d5b23bda418913a472d6b57856329382998b59 not found: ID does not exist" containerID="3c1329e6bf2dfb982396045a78d5b23bda418913a472d6b57856329382998b59" Jan 31 09:27:10 crc kubenswrapper[4732]: I0131 09:27:10.455706 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c1329e6bf2dfb982396045a78d5b23bda418913a472d6b57856329382998b59"} err="failed to get container status \"3c1329e6bf2dfb982396045a78d5b23bda418913a472d6b57856329382998b59\": rpc error: code = NotFound desc = could not find container \"3c1329e6bf2dfb982396045a78d5b23bda418913a472d6b57856329382998b59\": container with ID starting with 3c1329e6bf2dfb982396045a78d5b23bda418913a472d6b57856329382998b59 not found: ID does not exist" Jan 31 09:27:10 crc kubenswrapper[4732]: I0131 09:27:10.555636 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efcd56a7-a326-43ac-8d3e-c1a2fc2a464f" path="/var/lib/kubelet/pods/efcd56a7-a326-43ac-8d3e-c1a2fc2a464f/volumes" Jan 31 09:27:17 crc kubenswrapper[4732]: I0131 09:27:17.543125 4732 scope.go:117] "RemoveContainer" containerID="3dcf08eedc212c9c2a071b251d94a53241ef18f3503ef5e5c6a6b83f842c1e77" Jan 31 09:27:17 crc kubenswrapper[4732]: E0131 09:27:17.543726 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnbt8_openshift-machine-config-operator(7d790207-d357-4b47-87bf-5b505e061820)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" podUID="7d790207-d357-4b47-87bf-5b505e061820" Jan 31 09:27:31 crc kubenswrapper[4732]: I0131 09:27:31.542580 4732 scope.go:117] "RemoveContainer" containerID="3dcf08eedc212c9c2a071b251d94a53241ef18f3503ef5e5c6a6b83f842c1e77" Jan 31 09:27:31 crc kubenswrapper[4732]: E0131 09:27:31.543384 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnbt8_openshift-machine-config-operator(7d790207-d357-4b47-87bf-5b505e061820)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" podUID="7d790207-d357-4b47-87bf-5b505e061820" Jan 31 09:27:44 crc kubenswrapper[4732]: I0131 09:27:44.543825 4732 scope.go:117] "RemoveContainer" containerID="3dcf08eedc212c9c2a071b251d94a53241ef18f3503ef5e5c6a6b83f842c1e77" Jan 31 09:27:44 crc kubenswrapper[4732]: E0131 09:27:44.545139 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnbt8_openshift-machine-config-operator(7d790207-d357-4b47-87bf-5b505e061820)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" podUID="7d790207-d357-4b47-87bf-5b505e061820" Jan 31 09:27:55 crc kubenswrapper[4732]: I0131 09:27:55.543217 4732 scope.go:117] "RemoveContainer" containerID="3dcf08eedc212c9c2a071b251d94a53241ef18f3503ef5e5c6a6b83f842c1e77" Jan 31 09:27:55 crc kubenswrapper[4732]: E0131 09:27:55.543985 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnbt8_openshift-machine-config-operator(7d790207-d357-4b47-87bf-5b505e061820)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" podUID="7d790207-d357-4b47-87bf-5b505e061820" Jan 31 09:28:05 crc kubenswrapper[4732]: I0131 09:28:05.942727 4732 scope.go:117] "RemoveContainer" containerID="85b861719f4e2c096ba302360733974fc49e5be1c8a6dc54dda1f149625db608" Jan 31 09:28:09 crc kubenswrapper[4732]: I0131 09:28:09.542733 4732 scope.go:117] "RemoveContainer" containerID="3dcf08eedc212c9c2a071b251d94a53241ef18f3503ef5e5c6a6b83f842c1e77" Jan 31 09:28:09 crc kubenswrapper[4732]: E0131 09:28:09.543445 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnbt8_openshift-machine-config-operator(7d790207-d357-4b47-87bf-5b505e061820)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" podUID="7d790207-d357-4b47-87bf-5b505e061820" Jan 31 09:28:20 crc kubenswrapper[4732]: I0131 09:28:20.543007 4732 scope.go:117] "RemoveContainer" containerID="3dcf08eedc212c9c2a071b251d94a53241ef18f3503ef5e5c6a6b83f842c1e77" Jan 31 09:28:20 crc kubenswrapper[4732]: E0131 09:28:20.544132 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnbt8_openshift-machine-config-operator(7d790207-d357-4b47-87bf-5b505e061820)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" podUID="7d790207-d357-4b47-87bf-5b505e061820" Jan 31 09:28:31 crc kubenswrapper[4732]: I0131 09:28:31.543225 4732 scope.go:117] "RemoveContainer" containerID="3dcf08eedc212c9c2a071b251d94a53241ef18f3503ef5e5c6a6b83f842c1e77" Jan 31 09:28:31 crc kubenswrapper[4732]: E0131 09:28:31.544324 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnbt8_openshift-machine-config-operator(7d790207-d357-4b47-87bf-5b505e061820)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" podUID="7d790207-d357-4b47-87bf-5b505e061820" Jan 31 09:28:43 crc kubenswrapper[4732]: I0131 09:28:43.542409 4732 scope.go:117] "RemoveContainer" containerID="3dcf08eedc212c9c2a071b251d94a53241ef18f3503ef5e5c6a6b83f842c1e77" Jan 31 09:28:43 crc kubenswrapper[4732]: E0131 09:28:43.543449 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnbt8_openshift-machine-config-operator(7d790207-d357-4b47-87bf-5b505e061820)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" podUID="7d790207-d357-4b47-87bf-5b505e061820" Jan 31 09:28:55 crc kubenswrapper[4732]: I0131 09:28:55.542260 4732 scope.go:117] "RemoveContainer" containerID="3dcf08eedc212c9c2a071b251d94a53241ef18f3503ef5e5c6a6b83f842c1e77" Jan 31 09:28:55 crc kubenswrapper[4732]: E0131 09:28:55.542969 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnbt8_openshift-machine-config-operator(7d790207-d357-4b47-87bf-5b505e061820)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" podUID="7d790207-d357-4b47-87bf-5b505e061820" Jan 31 09:29:06 crc kubenswrapper[4732]: I0131 09:29:05.999791 4732 scope.go:117] "RemoveContainer" containerID="e9a704782d501296be317dbceec99e3b7ae21d704b38e927a87174fe7c4bd3f1" Jan 31 09:29:08 crc kubenswrapper[4732]: I0131 09:29:08.543535 4732 scope.go:117] "RemoveContainer" containerID="3dcf08eedc212c9c2a071b251d94a53241ef18f3503ef5e5c6a6b83f842c1e77" Jan 31 09:29:08 crc kubenswrapper[4732]: E0131 09:29:08.544000 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnbt8_openshift-machine-config-operator(7d790207-d357-4b47-87bf-5b505e061820)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" podUID="7d790207-d357-4b47-87bf-5b505e061820" Jan 31 09:29:23 crc kubenswrapper[4732]: I0131 09:29:23.542305 4732 scope.go:117] "RemoveContainer" containerID="3dcf08eedc212c9c2a071b251d94a53241ef18f3503ef5e5c6a6b83f842c1e77" Jan 31 09:29:23 crc kubenswrapper[4732]: E0131 09:29:23.543328 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnbt8_openshift-machine-config-operator(7d790207-d357-4b47-87bf-5b505e061820)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" podUID="7d790207-d357-4b47-87bf-5b505e061820" Jan 31 09:29:35 crc kubenswrapper[4732]: I0131 09:29:35.543384 4732 scope.go:117] "RemoveContainer" containerID="3dcf08eedc212c9c2a071b251d94a53241ef18f3503ef5e5c6a6b83f842c1e77" Jan 31 09:29:35 crc kubenswrapper[4732]: E0131 09:29:35.544418 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnbt8_openshift-machine-config-operator(7d790207-d357-4b47-87bf-5b505e061820)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" podUID="7d790207-d357-4b47-87bf-5b505e061820" Jan 31 09:29:46 crc kubenswrapper[4732]: I0131 09:29:46.517557 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-qcgvm/must-gather-dmxpx"] Jan 31 09:29:46 crc kubenswrapper[4732]: E0131 09:29:46.518493 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e12f87a-681b-4268-a759-5a0043ce9b74" containerName="extract-utilities" Jan 31 09:29:46 crc kubenswrapper[4732]: I0131 09:29:46.518507 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e12f87a-681b-4268-a759-5a0043ce9b74" containerName="extract-utilities" Jan 31 09:29:46 crc kubenswrapper[4732]: E0131 09:29:46.518521 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e12f87a-681b-4268-a759-5a0043ce9b74" containerName="extract-content" Jan 31 09:29:46 crc kubenswrapper[4732]: I0131 09:29:46.518530 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e12f87a-681b-4268-a759-5a0043ce9b74" containerName="extract-content" Jan 31 09:29:46 crc kubenswrapper[4732]: E0131 09:29:46.518542 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efcd56a7-a326-43ac-8d3e-c1a2fc2a464f" containerName="gather" Jan 31 09:29:46 crc kubenswrapper[4732]: I0131 09:29:46.518550 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="efcd56a7-a326-43ac-8d3e-c1a2fc2a464f" containerName="gather" Jan 31 09:29:46 crc kubenswrapper[4732]: E0131 09:29:46.518561 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e12f87a-681b-4268-a759-5a0043ce9b74" containerName="registry-server" Jan 31 09:29:46 crc kubenswrapper[4732]: I0131 09:29:46.518568 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e12f87a-681b-4268-a759-5a0043ce9b74" containerName="registry-server" Jan 31 09:29:46 crc kubenswrapper[4732]: E0131 09:29:46.518585 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9264cbfe-6b17-491e-8999-3a70e0198ca1" containerName="extract-utilities" Jan 31 09:29:46 crc kubenswrapper[4732]: I0131 09:29:46.518592 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="9264cbfe-6b17-491e-8999-3a70e0198ca1" containerName="extract-utilities" Jan 31 09:29:46 crc kubenswrapper[4732]: E0131 09:29:46.518612 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9264cbfe-6b17-491e-8999-3a70e0198ca1" containerName="extract-content" Jan 31 09:29:46 crc kubenswrapper[4732]: I0131 09:29:46.518620 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="9264cbfe-6b17-491e-8999-3a70e0198ca1" containerName="extract-content" Jan 31 09:29:46 crc kubenswrapper[4732]: E0131 09:29:46.518635 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9264cbfe-6b17-491e-8999-3a70e0198ca1" containerName="registry-server" Jan 31 09:29:46 crc 
kubenswrapper[4732]: I0131 09:29:46.518643 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="9264cbfe-6b17-491e-8999-3a70e0198ca1" containerName="registry-server" Jan 31 09:29:46 crc kubenswrapper[4732]: E0131 09:29:46.518654 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efcd56a7-a326-43ac-8d3e-c1a2fc2a464f" containerName="copy" Jan 31 09:29:46 crc kubenswrapper[4732]: I0131 09:29:46.518686 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="efcd56a7-a326-43ac-8d3e-c1a2fc2a464f" containerName="copy" Jan 31 09:29:46 crc kubenswrapper[4732]: I0131 09:29:46.518805 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e12f87a-681b-4268-a759-5a0043ce9b74" containerName="registry-server" Jan 31 09:29:46 crc kubenswrapper[4732]: I0131 09:29:46.518822 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="efcd56a7-a326-43ac-8d3e-c1a2fc2a464f" containerName="gather" Jan 31 09:29:46 crc kubenswrapper[4732]: I0131 09:29:46.518840 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="efcd56a7-a326-43ac-8d3e-c1a2fc2a464f" containerName="copy" Jan 31 09:29:46 crc kubenswrapper[4732]: I0131 09:29:46.518851 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="9264cbfe-6b17-491e-8999-3a70e0198ca1" containerName="registry-server" Jan 31 09:29:46 crc kubenswrapper[4732]: I0131 09:29:46.519603 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qcgvm/must-gather-dmxpx" Jan 31 09:29:46 crc kubenswrapper[4732]: I0131 09:29:46.523196 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-qcgvm"/"kube-root-ca.crt" Jan 31 09:29:46 crc kubenswrapper[4732]: I0131 09:29:46.523428 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-qcgvm"/"openshift-service-ca.crt" Jan 31 09:29:46 crc kubenswrapper[4732]: I0131 09:29:46.523575 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-qcgvm"/"default-dockercfg-28lch" Jan 31 09:29:46 crc kubenswrapper[4732]: I0131 09:29:46.543386 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-qcgvm/must-gather-dmxpx"] Jan 31 09:29:46 crc kubenswrapper[4732]: I0131 09:29:46.614287 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b48d1b94-13fa-4c78-8391-6c54f00c4049-must-gather-output\") pod \"must-gather-dmxpx\" (UID: \"b48d1b94-13fa-4c78-8391-6c54f00c4049\") " pod="openshift-must-gather-qcgvm/must-gather-dmxpx" Jan 31 09:29:46 crc kubenswrapper[4732]: I0131 09:29:46.614347 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnsls\" (UniqueName: \"kubernetes.io/projected/b48d1b94-13fa-4c78-8391-6c54f00c4049-kube-api-access-qnsls\") pod \"must-gather-dmxpx\" (UID: \"b48d1b94-13fa-4c78-8391-6c54f00c4049\") " pod="openshift-must-gather-qcgvm/must-gather-dmxpx" Jan 31 09:29:46 crc kubenswrapper[4732]: I0131 09:29:46.715052 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnsls\" (UniqueName: \"kubernetes.io/projected/b48d1b94-13fa-4c78-8391-6c54f00c4049-kube-api-access-qnsls\") pod \"must-gather-dmxpx\" (UID: \"b48d1b94-13fa-4c78-8391-6c54f00c4049\") " pod="openshift-must-gather-qcgvm/must-gather-dmxpx" Jan 31 09:29:46 crc kubenswrapper[4732]: I0131 09:29:46.715164 
4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b48d1b94-13fa-4c78-8391-6c54f00c4049-must-gather-output\") pod \"must-gather-dmxpx\" (UID: \"b48d1b94-13fa-4c78-8391-6c54f00c4049\") " pod="openshift-must-gather-qcgvm/must-gather-dmxpx" Jan 31 09:29:46 crc kubenswrapper[4732]: I0131 09:29:46.715634 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b48d1b94-13fa-4c78-8391-6c54f00c4049-must-gather-output\") pod \"must-gather-dmxpx\" (UID: \"b48d1b94-13fa-4c78-8391-6c54f00c4049\") " pod="openshift-must-gather-qcgvm/must-gather-dmxpx" Jan 31 09:29:46 crc kubenswrapper[4732]: I0131 09:29:46.737300 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnsls\" (UniqueName: \"kubernetes.io/projected/b48d1b94-13fa-4c78-8391-6c54f00c4049-kube-api-access-qnsls\") pod \"must-gather-dmxpx\" (UID: \"b48d1b94-13fa-4c78-8391-6c54f00c4049\") " pod="openshift-must-gather-qcgvm/must-gather-dmxpx" Jan 31 09:29:46 crc kubenswrapper[4732]: I0131 09:29:46.844830 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qcgvm/must-gather-dmxpx" Jan 31 09:29:47 crc kubenswrapper[4732]: I0131 09:29:47.057242 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-qcgvm/must-gather-dmxpx"] Jan 31 09:29:47 crc kubenswrapper[4732]: I0131 09:29:47.413549 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qcgvm/must-gather-dmxpx" event={"ID":"b48d1b94-13fa-4c78-8391-6c54f00c4049","Type":"ContainerStarted","Data":"8bc8af864159428007839c4433582dfc653e8b215dbb29387accda4214427352"} Jan 31 09:29:48 crc kubenswrapper[4732]: I0131 09:29:48.421144 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qcgvm/must-gather-dmxpx" event={"ID":"b48d1b94-13fa-4c78-8391-6c54f00c4049","Type":"ContainerStarted","Data":"4a95f2552040e23cb82237e42fc7d1dd8d4b744cffad543d8632a98ef3b52dce"} Jan 31 09:29:48 crc kubenswrapper[4732]: I0131 09:29:48.421465 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qcgvm/must-gather-dmxpx" event={"ID":"b48d1b94-13fa-4c78-8391-6c54f00c4049","Type":"ContainerStarted","Data":"b6141ba3580c93eb3c741c0bc8fb44dce9869080708fa256445df445377a390f"} Jan 31 09:29:48 crc kubenswrapper[4732]: I0131 09:29:48.442978 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-qcgvm/must-gather-dmxpx" podStartSLOduration=2.442960176 podStartE2EDuration="2.442960176s" podCreationTimestamp="2026-01-31 09:29:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 09:29:48.43951127 +0000 UTC m=+1726.745387474" watchObservedRunningTime="2026-01-31 09:29:48.442960176 +0000 UTC m=+1726.748836380" Jan 31 09:29:50 crc kubenswrapper[4732]: I0131 09:29:50.545939 4732 scope.go:117] "RemoveContainer" containerID="3dcf08eedc212c9c2a071b251d94a53241ef18f3503ef5e5c6a6b83f842c1e77" Jan 31 09:29:50 crc kubenswrapper[4732]: E0131 09:29:50.546107 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-jnbt8_openshift-machine-config-operator(7d790207-d357-4b47-87bf-5b505e061820)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" podUID="7d790207-d357-4b47-87bf-5b505e061820" Jan 31 09:30:00 crc kubenswrapper[4732]: I0131 09:30:00.134735 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29497530-s5s7z"] Jan 31 09:30:00 crc kubenswrapper[4732]: I0131 09:30:00.136072 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29497530-s5s7z" Jan 31 09:30:00 crc kubenswrapper[4732]: I0131 09:30:00.138119 4732 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 31 09:30:00 crc kubenswrapper[4732]: I0131 09:30:00.138387 4732 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 31 09:30:00 crc kubenswrapper[4732]: I0131 09:30:00.141040 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29497530-s5s7z"] Jan 31 09:30:00 crc kubenswrapper[4732]: I0131 09:30:00.204457 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/892132d3-fb6d-4b47-b5ce-bfc23f479073-config-volume\") pod \"collect-profiles-29497530-s5s7z\" (UID: \"892132d3-fb6d-4b47-b5ce-bfc23f479073\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497530-s5s7z" Jan 31 09:30:00 crc kubenswrapper[4732]: I0131 09:30:00.204535 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sr78j\" (UniqueName: \"kubernetes.io/projected/892132d3-fb6d-4b47-b5ce-bfc23f479073-kube-api-access-sr78j\") pod \"collect-profiles-29497530-s5s7z\" (UID: \"892132d3-fb6d-4b47-b5ce-bfc23f479073\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497530-s5s7z" Jan 31 09:30:00 crc kubenswrapper[4732]: I0131 09:30:00.204652 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/892132d3-fb6d-4b47-b5ce-bfc23f479073-secret-volume\") pod \"collect-profiles-29497530-s5s7z\" (UID: \"892132d3-fb6d-4b47-b5ce-bfc23f479073\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497530-s5s7z" Jan 31 09:30:00 crc kubenswrapper[4732]: I0131 09:30:00.305887 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/892132d3-fb6d-4b47-b5ce-bfc23f479073-config-volume\") pod \"collect-profiles-29497530-s5s7z\" (UID: \"892132d3-fb6d-4b47-b5ce-bfc23f479073\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497530-s5s7z" Jan 31 09:30:00 crc kubenswrapper[4732]: I0131 09:30:00.305962 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sr78j\" (UniqueName: \"kubernetes.io/projected/892132d3-fb6d-4b47-b5ce-bfc23f479073-kube-api-access-sr78j\") pod \"collect-profiles-29497530-s5s7z\" (UID: \"892132d3-fb6d-4b47-b5ce-bfc23f479073\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497530-s5s7z" Jan 31 09:30:00 crc kubenswrapper[4732]: I0131 09:30:00.306011 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"secret-volume\" (UniqueName: \"kubernetes.io/secret/892132d3-fb6d-4b47-b5ce-bfc23f479073-secret-volume\") pod \"collect-profiles-29497530-s5s7z\" (UID: \"892132d3-fb6d-4b47-b5ce-bfc23f479073\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497530-s5s7z" Jan 31 09:30:00 crc kubenswrapper[4732]: I0131 09:30:00.306805 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/892132d3-fb6d-4b47-b5ce-bfc23f479073-config-volume\") pod \"collect-profiles-29497530-s5s7z\" (UID: \"892132d3-fb6d-4b47-b5ce-bfc23f479073\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497530-s5s7z" Jan 31 09:30:00 crc kubenswrapper[4732]: I0131 09:30:00.313229 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/892132d3-fb6d-4b47-b5ce-bfc23f479073-secret-volume\") pod \"collect-profiles-29497530-s5s7z\" (UID: \"892132d3-fb6d-4b47-b5ce-bfc23f479073\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497530-s5s7z" Jan 31 09:30:00 crc kubenswrapper[4732]: I0131 09:30:00.325258 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sr78j\" (UniqueName: \"kubernetes.io/projected/892132d3-fb6d-4b47-b5ce-bfc23f479073-kube-api-access-sr78j\") pod \"collect-profiles-29497530-s5s7z\" (UID: \"892132d3-fb6d-4b47-b5ce-bfc23f479073\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497530-s5s7z" Jan 31 09:30:00 crc kubenswrapper[4732]: I0131 09:30:00.454546 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29497530-s5s7z" Jan 31 09:30:00 crc kubenswrapper[4732]: I0131 09:30:00.634321 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29497530-s5s7z"] Jan 31 09:30:00 crc kubenswrapper[4732]: W0131 09:30:00.638888 4732 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod892132d3_fb6d_4b47_b5ce_bfc23f479073.slice/crio-361f692802a3d42aa6b410360102991144ad2a93a90ae30a7682dcca1952457c WatchSource:0}: Error finding container 361f692802a3d42aa6b410360102991144ad2a93a90ae30a7682dcca1952457c: Status 404 returned error can't find the container with id 361f692802a3d42aa6b410360102991144ad2a93a90ae30a7682dcca1952457c Jan 31 09:30:01 crc kubenswrapper[4732]: I0131 09:30:01.504979 4732 generic.go:334] "Generic (PLEG): container finished" podID="892132d3-fb6d-4b47-b5ce-bfc23f479073" containerID="69e5b29dc7561c0d73b6331e7d93ff19db8795725f1bad225b50b3477a51841a" exitCode=0 Jan 31 09:30:01 crc kubenswrapper[4732]: I0131 09:30:01.505054 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29497530-s5s7z" event={"ID":"892132d3-fb6d-4b47-b5ce-bfc23f479073","Type":"ContainerDied","Data":"69e5b29dc7561c0d73b6331e7d93ff19db8795725f1bad225b50b3477a51841a"} Jan 31 09:30:01 crc kubenswrapper[4732]: I0131 09:30:01.505328 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29497530-s5s7z" event={"ID":"892132d3-fb6d-4b47-b5ce-bfc23f479073","Type":"ContainerStarted","Data":"361f692802a3d42aa6b410360102991144ad2a93a90ae30a7682dcca1952457c"} Jan 31 09:30:02 crc kubenswrapper[4732]: I0131 09:30:02.760157 4732 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29497530-s5s7z"
Jan 31 09:30:02 crc kubenswrapper[4732]: I0131 09:30:02.934800 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sr78j\" (UniqueName: \"kubernetes.io/projected/892132d3-fb6d-4b47-b5ce-bfc23f479073-kube-api-access-sr78j\") pod \"892132d3-fb6d-4b47-b5ce-bfc23f479073\" (UID: \"892132d3-fb6d-4b47-b5ce-bfc23f479073\") "
Jan 31 09:30:02 crc kubenswrapper[4732]: I0131 09:30:02.935123 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/892132d3-fb6d-4b47-b5ce-bfc23f479073-config-volume\") pod \"892132d3-fb6d-4b47-b5ce-bfc23f479073\" (UID: \"892132d3-fb6d-4b47-b5ce-bfc23f479073\") "
Jan 31 09:30:02 crc kubenswrapper[4732]: I0131 09:30:02.935165 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/892132d3-fb6d-4b47-b5ce-bfc23f479073-secret-volume\") pod \"892132d3-fb6d-4b47-b5ce-bfc23f479073\" (UID: \"892132d3-fb6d-4b47-b5ce-bfc23f479073\") "
Jan 31 09:30:02 crc kubenswrapper[4732]: I0131 09:30:02.935761 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/892132d3-fb6d-4b47-b5ce-bfc23f479073-config-volume" (OuterVolumeSpecName: "config-volume") pod "892132d3-fb6d-4b47-b5ce-bfc23f479073" (UID: "892132d3-fb6d-4b47-b5ce-bfc23f479073"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 31 09:30:02 crc kubenswrapper[4732]: I0131 09:30:02.939685 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/892132d3-fb6d-4b47-b5ce-bfc23f479073-kube-api-access-sr78j" (OuterVolumeSpecName: "kube-api-access-sr78j") pod "892132d3-fb6d-4b47-b5ce-bfc23f479073" (UID: "892132d3-fb6d-4b47-b5ce-bfc23f479073"). InnerVolumeSpecName "kube-api-access-sr78j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 09:30:02 crc kubenswrapper[4732]: I0131 09:30:02.939919 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/892132d3-fb6d-4b47-b5ce-bfc23f479073-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "892132d3-fb6d-4b47-b5ce-bfc23f479073" (UID: "892132d3-fb6d-4b47-b5ce-bfc23f479073"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 09:30:03 crc kubenswrapper[4732]: I0131 09:30:03.036809 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sr78j\" (UniqueName: \"kubernetes.io/projected/892132d3-fb6d-4b47-b5ce-bfc23f479073-kube-api-access-sr78j\") on node \"crc\" DevicePath \"\""
Jan 31 09:30:03 crc kubenswrapper[4732]: I0131 09:30:03.036847 4732 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/892132d3-fb6d-4b47-b5ce-bfc23f479073-config-volume\") on node \"crc\" DevicePath \"\""
Jan 31 09:30:03 crc kubenswrapper[4732]: I0131 09:30:03.036858 4732 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/892132d3-fb6d-4b47-b5ce-bfc23f479073-secret-volume\") on node \"crc\" DevicePath \"\""
Jan 31 09:30:03 crc kubenswrapper[4732]: I0131 09:30:03.518621 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29497530-s5s7z" event={"ID":"892132d3-fb6d-4b47-b5ce-bfc23f479073","Type":"ContainerDied","Data":"361f692802a3d42aa6b410360102991144ad2a93a90ae30a7682dcca1952457c"}
Jan 31 09:30:03 crc kubenswrapper[4732]: I0131 09:30:03.518698 4732 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="361f692802a3d42aa6b410360102991144ad2a93a90ae30a7682dcca1952457c"
Jan 31 09:30:03 crc kubenswrapper[4732]: I0131 09:30:03.518766 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29497530-s5s7z"
Jan 31 09:30:04 crc kubenswrapper[4732]: I0131 09:30:04.543024 4732 scope.go:117] "RemoveContainer" containerID="3dcf08eedc212c9c2a071b251d94a53241ef18f3503ef5e5c6a6b83f842c1e77"
Jan 31 09:30:04 crc kubenswrapper[4732]: E0131 09:30:04.543235 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnbt8_openshift-machine-config-operator(7d790207-d357-4b47-87bf-5b505e061820)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" podUID="7d790207-d357-4b47-87bf-5b505e061820"
Jan 31 09:30:06 crc kubenswrapper[4732]: I0131 09:30:06.041700 4732 scope.go:117] "RemoveContainer" containerID="def72532b530c78803f0cccd8c5a2d65a616e430cf1d92043ad3623133222585"
Jan 31 09:30:15 crc kubenswrapper[4732]: I0131 09:30:15.543489 4732 scope.go:117] "RemoveContainer" containerID="3dcf08eedc212c9c2a071b251d94a53241ef18f3503ef5e5c6a6b83f842c1e77"
Jan 31 09:30:15 crc kubenswrapper[4732]: E0131 09:30:15.544164 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnbt8_openshift-machine-config-operator(7d790207-d357-4b47-87bf-5b505e061820)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" podUID="7d790207-d357-4b47-87bf-5b505e061820"
Jan 31 09:30:26 crc kubenswrapper[4732]: I0131 09:30:26.543364 4732 scope.go:117] "RemoveContainer" containerID="3dcf08eedc212c9c2a071b251d94a53241ef18f3503ef5e5c6a6b83f842c1e77"
Jan 31 09:30:26 crc kubenswrapper[4732]: E0131 09:30:26.544073 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnbt8_openshift-machine-config-operator(7d790207-d357-4b47-87bf-5b505e061820)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" podUID="7d790207-d357-4b47-87bf-5b505e061820"
Jan 31 09:30:35 crc kubenswrapper[4732]: I0131 09:30:35.373361 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-v69mc_498d64fc-0d0f-43c6-aaae-bd3c5f0d7873/control-plane-machine-set-operator/0.log"
Jan 31 09:30:35 crc kubenswrapper[4732]: I0131 09:30:35.557518 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-54nxd_e9ce5ab1-4da9-47a4-83b9-3c7d8af6d55a/kube-rbac-proxy/0.log"
Jan 31 09:30:35 crc kubenswrapper[4732]: I0131 09:30:35.593979 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-54nxd_e9ce5ab1-4da9-47a4-83b9-3c7d8af6d55a/machine-api-operator/0.log"
Jan 31 09:30:41 crc kubenswrapper[4732]: I0131 09:30:41.543118 4732 scope.go:117] "RemoveContainer" containerID="3dcf08eedc212c9c2a071b251d94a53241ef18f3503ef5e5c6a6b83f842c1e77"
Jan 31 09:30:41 crc kubenswrapper[4732]: E0131 09:30:41.543953 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnbt8_openshift-machine-config-operator(7d790207-d357-4b47-87bf-5b505e061820)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" podUID="7d790207-d357-4b47-87bf-5b505e061820"
Jan 31 09:30:54 crc kubenswrapper[4732]: I0131 09:30:54.543330 4732 scope.go:117] "RemoveContainer" containerID="3dcf08eedc212c9c2a071b251d94a53241ef18f3503ef5e5c6a6b83f842c1e77"
Jan 31 09:30:54 crc kubenswrapper[4732]: E0131 09:30:54.544020 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnbt8_openshift-machine-config-operator(7d790207-d357-4b47-87bf-5b505e061820)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" podUID="7d790207-d357-4b47-87bf-5b505e061820"
Jan 31 09:31:03 crc kubenswrapper[4732]: I0131 09:31:03.077582 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-jq8g8_53b5272f-ac5c-4616-a427-28fc830d7392/kube-rbac-proxy/0.log"
Jan 31 09:31:03 crc kubenswrapper[4732]: I0131 09:31:03.102237 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-jq8g8_53b5272f-ac5c-4616-a427-28fc830d7392/controller/0.log"
Jan 31 09:31:03 crc kubenswrapper[4732]: I0131 09:31:03.258619 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6xvqw_66e27417-1fb4-4ca9-b104-d3d335370f0d/cp-frr-files/0.log"
Jan 31 09:31:03 crc kubenswrapper[4732]: I0131 09:31:03.451800 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6xvqw_66e27417-1fb4-4ca9-b104-d3d335370f0d/cp-frr-files/0.log"
Jan 31 09:31:03 crc kubenswrapper[4732]: I0131 09:31:03.452018 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6xvqw_66e27417-1fb4-4ca9-b104-d3d335370f0d/cp-metrics/0.log"
Jan 31 09:31:03 crc kubenswrapper[4732]: I0131 09:31:03.473312 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6xvqw_66e27417-1fb4-4ca9-b104-d3d335370f0d/cp-reloader/0.log"
Jan 31 09:31:03 crc kubenswrapper[4732]: I0131 09:31:03.474564 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6xvqw_66e27417-1fb4-4ca9-b104-d3d335370f0d/cp-reloader/0.log"
Jan 31 09:31:03 crc kubenswrapper[4732]: I0131 09:31:03.653207 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6xvqw_66e27417-1fb4-4ca9-b104-d3d335370f0d/cp-frr-files/0.log"
Jan 31 09:31:03 crc kubenswrapper[4732]: I0131 09:31:03.674533 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6xvqw_66e27417-1fb4-4ca9-b104-d3d335370f0d/cp-metrics/0.log"
Jan 31 09:31:03 crc kubenswrapper[4732]: I0131 09:31:03.677537 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6xvqw_66e27417-1fb4-4ca9-b104-d3d335370f0d/cp-metrics/0.log"
Jan 31 09:31:03 crc kubenswrapper[4732]: I0131 09:31:03.721297 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6xvqw_66e27417-1fb4-4ca9-b104-d3d335370f0d/cp-reloader/0.log"
Jan 31 09:31:03 crc kubenswrapper[4732]: I0131 09:31:03.910785 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6xvqw_66e27417-1fb4-4ca9-b104-d3d335370f0d/cp-reloader/0.log"
Jan 31 09:31:03 crc kubenswrapper[4732]: I0131 09:31:03.910881 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6xvqw_66e27417-1fb4-4ca9-b104-d3d335370f0d/cp-frr-files/0.log"
Jan 31 09:31:03 crc kubenswrapper[4732]: I0131 09:31:03.910986 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6xvqw_66e27417-1fb4-4ca9-b104-d3d335370f0d/cp-metrics/0.log"
Jan 31 09:31:03 crc kubenswrapper[4732]: I0131 09:31:03.920194 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6xvqw_66e27417-1fb4-4ca9-b104-d3d335370f0d/controller/0.log"
Jan 31 09:31:04 crc kubenswrapper[4732]: I0131 09:31:04.088535 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6xvqw_66e27417-1fb4-4ca9-b104-d3d335370f0d/frr-metrics/0.log"
Jan 31 09:31:04 crc kubenswrapper[4732]: I0131 09:31:04.104261 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6xvqw_66e27417-1fb4-4ca9-b104-d3d335370f0d/kube-rbac-proxy/0.log"
Jan 31 09:31:04 crc kubenswrapper[4732]: I0131 09:31:04.114895 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6xvqw_66e27417-1fb4-4ca9-b104-d3d335370f0d/kube-rbac-proxy-frr/0.log"
Jan 31 09:31:04 crc kubenswrapper[4732]: I0131 09:31:04.297096 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6xvqw_66e27417-1fb4-4ca9-b104-d3d335370f0d/reloader/0.log"
Jan 31 09:31:04 crc kubenswrapper[4732]: I0131 09:31:04.322112 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7df86c4f6c-5l2kt_4b09b4ac-95c1-4c31-99a0-12b38c3412ae/frr-k8s-webhook-server/0.log"
Jan 31 09:31:04 crc kubenswrapper[4732]: I0131 09:31:04.569335 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6xvqw_66e27417-1fb4-4ca9-b104-d3d335370f0d/frr/0.log"
Jan 31 09:31:04 crc kubenswrapper[4732]: I0131 09:31:04.642764 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-6d8dc66c8b-8p2mn_8ca218dd-0d42-45c8-b4e4-ca638781c915/manager/0.log"
Jan 31 09:31:04 crc kubenswrapper[4732]: I0131 09:31:04.773200 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-c98699f55-6bzdf_62f950f6-2a18-4ca6-8cdb-75f47437053a/webhook-server/0.log"
Jan 31 09:31:04 crc kubenswrapper[4732]: I0131 09:31:04.839835 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-gcmq2_3fbb0c82-6b72-4313-94e2-3e71d27cf75f/kube-rbac-proxy/0.log"
Jan 31 09:31:05 crc kubenswrapper[4732]: I0131 09:31:05.040621 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-gcmq2_3fbb0c82-6b72-4313-94e2-3e71d27cf75f/speaker/0.log"
Jan 31 09:31:07 crc kubenswrapper[4732]: I0131 09:31:07.543258 4732 scope.go:117] "RemoveContainer" containerID="3dcf08eedc212c9c2a071b251d94a53241ef18f3503ef5e5c6a6b83f842c1e77"
Jan 31 09:31:07 crc kubenswrapper[4732]: E0131 09:31:07.544309 4732 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-jnbt8_openshift-machine-config-operator(7d790207-d357-4b47-87bf-5b505e061820)\"" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" podUID="7d790207-d357-4b47-87bf-5b505e061820"
Jan 31 09:31:20 crc kubenswrapper[4732]: I0131 09:31:20.542110 4732 scope.go:117] "RemoveContainer" containerID="3dcf08eedc212c9c2a071b251d94a53241ef18f3503ef5e5c6a6b83f842c1e77"
Jan 31 09:31:20 crc kubenswrapper[4732]: I0131 09:31:20.927953 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" event={"ID":"7d790207-d357-4b47-87bf-5b505e061820","Type":"ContainerStarted","Data":"9ab32081e481444933bd2595cf8dbdf7ca3690cb1b45450fb70f36da385fe1c0"}
Jan 31 09:31:28 crc kubenswrapper[4732]: I0131 09:31:28.694008 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcrl29v_76f99e73-f72c-4026-b43f-dcb9f20b554f/util/0.log"
Jan 31 09:31:28 crc kubenswrapper[4732]: I0131 09:31:28.921435 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcrl29v_76f99e73-f72c-4026-b43f-dcb9f20b554f/util/0.log"
Jan 31 09:31:28 crc kubenswrapper[4732]: I0131 09:31:28.921877 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcrl29v_76f99e73-f72c-4026-b43f-dcb9f20b554f/pull/0.log"
Jan 31 09:31:28 crc kubenswrapper[4732]: I0131 09:31:28.921896 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcrl29v_76f99e73-f72c-4026-b43f-dcb9f20b554f/pull/0.log"
Jan 31 09:31:29 crc kubenswrapper[4732]: I0131 09:31:29.105715 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcrl29v_76f99e73-f72c-4026-b43f-dcb9f20b554f/util/0.log"
Jan 31 09:31:29 crc kubenswrapper[4732]: I0131 09:31:29.125321 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcrl29v_76f99e73-f72c-4026-b43f-dcb9f20b554f/extract/0.log"
Jan 31 09:31:29 crc kubenswrapper[4732]: I0131 09:31:29.152873 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcrl29v_76f99e73-f72c-4026-b43f-dcb9f20b554f/pull/0.log"
Jan 31 09:31:29 crc kubenswrapper[4732]: I0131 09:31:29.261987 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2h57x_9039963e-96e4-4b4d-abdd-79f0429da944/extract-utilities/0.log"
Jan 31 09:31:29 crc kubenswrapper[4732]: I0131 09:31:29.425209 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2h57x_9039963e-96e4-4b4d-abdd-79f0429da944/extract-utilities/0.log"
Jan 31 09:31:29 crc kubenswrapper[4732]: I0131 09:31:29.444970 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2h57x_9039963e-96e4-4b4d-abdd-79f0429da944/extract-content/0.log"
Jan 31 09:31:29 crc kubenswrapper[4732]: I0131 09:31:29.445999 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2h57x_9039963e-96e4-4b4d-abdd-79f0429da944/extract-content/0.log"
Jan 31 09:31:29 crc kubenswrapper[4732]: I0131 09:31:29.590784 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2h57x_9039963e-96e4-4b4d-abdd-79f0429da944/extract-utilities/0.log"
Jan 31 09:31:29 crc kubenswrapper[4732]: I0131 09:31:29.615955 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2h57x_9039963e-96e4-4b4d-abdd-79f0429da944/extract-content/0.log"
Jan 31 09:31:29 crc kubenswrapper[4732]: I0131 09:31:29.791266 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-d7jxg_a7533049-a0d8-4488-bed6-2a9b28212061/extract-utilities/0.log"
Jan 31 09:31:29 crc kubenswrapper[4732]: I0131 09:31:29.950463 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-d7jxg_a7533049-a0d8-4488-bed6-2a9b28212061/extract-utilities/0.log"
Jan 31 09:31:29 crc kubenswrapper[4732]: I0131 09:31:29.952572 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2h57x_9039963e-96e4-4b4d-abdd-79f0429da944/registry-server/0.log"
Jan 31 09:31:29 crc kubenswrapper[4732]: I0131 09:31:29.988737 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-d7jxg_a7533049-a0d8-4488-bed6-2a9b28212061/extract-content/0.log"
Jan 31 09:31:30 crc kubenswrapper[4732]: I0131 09:31:30.013928 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-d7jxg_a7533049-a0d8-4488-bed6-2a9b28212061/extract-content/0.log"
Jan 31 09:31:30 crc kubenswrapper[4732]: I0131 09:31:30.177529 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-d7jxg_a7533049-a0d8-4488-bed6-2a9b28212061/extract-content/0.log"
Jan 31 09:31:30 crc kubenswrapper[4732]: I0131 09:31:30.218054 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-d7jxg_a7533049-a0d8-4488-bed6-2a9b28212061/extract-utilities/0.log"
Jan 31 09:31:30 crc kubenswrapper[4732]: I0131 09:31:30.517285 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-8bchs_d0dbfc52-f4e9-462a-a253-2bb950c04e7b/marketplace-operator/0.log"
Jan 31 09:31:30 crc kubenswrapper[4732]: I0131 09:31:30.530002 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-krjtb_17e07aee-c4b1-4011-8442-c6dcfc4f415c/extract-utilities/0.log"
Jan 31 09:31:30 crc kubenswrapper[4732]: I0131 09:31:30.563175 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-d7jxg_a7533049-a0d8-4488-bed6-2a9b28212061/registry-server/0.log"
Jan 31 09:31:30 crc kubenswrapper[4732]: I0131 09:31:30.683841 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-krjtb_17e07aee-c4b1-4011-8442-c6dcfc4f415c/extract-utilities/0.log"
Jan 31 09:31:30 crc kubenswrapper[4732]: I0131 09:31:30.700648 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-krjtb_17e07aee-c4b1-4011-8442-c6dcfc4f415c/extract-content/0.log"
Jan 31 09:31:30 crc kubenswrapper[4732]: I0131 09:31:30.709120 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-krjtb_17e07aee-c4b1-4011-8442-c6dcfc4f415c/extract-content/0.log"
Jan 31 09:31:30 crc kubenswrapper[4732]: I0131 09:31:30.843947 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-krjtb_17e07aee-c4b1-4011-8442-c6dcfc4f415c/extract-utilities/0.log"
Jan 31 09:31:30 crc kubenswrapper[4732]: I0131 09:31:30.938312 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-krjtb_17e07aee-c4b1-4011-8442-c6dcfc4f415c/extract-content/0.log"
Jan 31 09:31:30 crc kubenswrapper[4732]: I0131 09:31:30.943523 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-krjtb_17e07aee-c4b1-4011-8442-c6dcfc4f415c/registry-server/0.log"
Jan 31 09:31:31 crc kubenswrapper[4732]: I0131 09:31:31.031132 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4pkzq_a39f958d-6d9b-4a4a-9ec9-cfb1f96b6f45/extract-utilities/0.log"
Jan 31 09:31:31 crc kubenswrapper[4732]: I0131 09:31:31.220529 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4pkzq_a39f958d-6d9b-4a4a-9ec9-cfb1f96b6f45/extract-utilities/0.log"
Jan 31 09:31:31 crc kubenswrapper[4732]: I0131 09:31:31.221865 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4pkzq_a39f958d-6d9b-4a4a-9ec9-cfb1f96b6f45/extract-content/0.log"
Jan 31 09:31:31 crc kubenswrapper[4732]: I0131 09:31:31.249577 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4pkzq_a39f958d-6d9b-4a4a-9ec9-cfb1f96b6f45/extract-content/0.log"
Jan 31 09:31:31 crc kubenswrapper[4732]: I0131 09:31:31.435349 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4pkzq_a39f958d-6d9b-4a4a-9ec9-cfb1f96b6f45/extract-content/0.log"
Jan 31 09:31:31 crc kubenswrapper[4732]: I0131 09:31:31.493528 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4pkzq_a39f958d-6d9b-4a4a-9ec9-cfb1f96b6f45/extract-utilities/0.log"
Jan 31 09:31:31 crc kubenswrapper[4732]: I0131 09:31:31.785861 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4pkzq_a39f958d-6d9b-4a4a-9ec9-cfb1f96b6f45/registry-server/0.log"
Jan 31 09:32:41 crc kubenswrapper[4732]: I0131 09:32:41.432898 4732 generic.go:334] "Generic (PLEG): container finished" podID="b48d1b94-13fa-4c78-8391-6c54f00c4049" containerID="b6141ba3580c93eb3c741c0bc8fb44dce9869080708fa256445df445377a390f" exitCode=0
Jan 31 09:32:41 crc kubenswrapper[4732]: I0131 09:32:41.432995 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qcgvm/must-gather-dmxpx" event={"ID":"b48d1b94-13fa-4c78-8391-6c54f00c4049","Type":"ContainerDied","Data":"b6141ba3580c93eb3c741c0bc8fb44dce9869080708fa256445df445377a390f"}
Jan 31 09:32:41 crc kubenswrapper[4732]: I0131 09:32:41.434272 4732 scope.go:117] "RemoveContainer" containerID="b6141ba3580c93eb3c741c0bc8fb44dce9869080708fa256445df445377a390f"
Jan 31 09:32:42 crc kubenswrapper[4732]: I0131 09:32:42.394006 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-qcgvm_must-gather-dmxpx_b48d1b94-13fa-4c78-8391-6c54f00c4049/gather/0.log"
Jan 31 09:32:51 crc kubenswrapper[4732]: I0131 09:32:51.988593 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-qcgvm/must-gather-dmxpx"]
Jan 31 09:32:51 crc kubenswrapper[4732]: I0131 09:32:51.989975 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-qcgvm/must-gather-dmxpx" podUID="b48d1b94-13fa-4c78-8391-6c54f00c4049" containerName="copy" containerID="cri-o://4a95f2552040e23cb82237e42fc7d1dd8d4b744cffad543d8632a98ef3b52dce" gracePeriod=2
Jan 31 09:32:51 crc kubenswrapper[4732]: I0131 09:32:51.992893 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-qcgvm/must-gather-dmxpx"]
Jan 31 09:32:52 crc kubenswrapper[4732]: I0131 09:32:52.317389 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-qcgvm_must-gather-dmxpx_b48d1b94-13fa-4c78-8391-6c54f00c4049/copy/0.log"
Jan 31 09:32:52 crc kubenswrapper[4732]: I0131 09:32:52.318095 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qcgvm/must-gather-dmxpx"
Jan 31 09:32:52 crc kubenswrapper[4732]: I0131 09:32:52.424462 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qnsls\" (UniqueName: \"kubernetes.io/projected/b48d1b94-13fa-4c78-8391-6c54f00c4049-kube-api-access-qnsls\") pod \"b48d1b94-13fa-4c78-8391-6c54f00c4049\" (UID: \"b48d1b94-13fa-4c78-8391-6c54f00c4049\") "
Jan 31 09:32:52 crc kubenswrapper[4732]: I0131 09:32:52.424554 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b48d1b94-13fa-4c78-8391-6c54f00c4049-must-gather-output\") pod \"b48d1b94-13fa-4c78-8391-6c54f00c4049\" (UID: \"b48d1b94-13fa-4c78-8391-6c54f00c4049\") "
Jan 31 09:32:52 crc kubenswrapper[4732]: I0131 09:32:52.430808 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b48d1b94-13fa-4c78-8391-6c54f00c4049-kube-api-access-qnsls" (OuterVolumeSpecName: "kube-api-access-qnsls") pod "b48d1b94-13fa-4c78-8391-6c54f00c4049" (UID: "b48d1b94-13fa-4c78-8391-6c54f00c4049"). InnerVolumeSpecName "kube-api-access-qnsls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 09:32:52 crc kubenswrapper[4732]: I0131 09:32:52.491763 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b48d1b94-13fa-4c78-8391-6c54f00c4049-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "b48d1b94-13fa-4c78-8391-6c54f00c4049" (UID: "b48d1b94-13fa-4c78-8391-6c54f00c4049"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 31 09:32:52 crc kubenswrapper[4732]: I0131 09:32:52.505182 4732 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-qcgvm_must-gather-dmxpx_b48d1b94-13fa-4c78-8391-6c54f00c4049/copy/0.log"
Jan 31 09:32:52 crc kubenswrapper[4732]: I0131 09:32:52.505757 4732 generic.go:334] "Generic (PLEG): container finished" podID="b48d1b94-13fa-4c78-8391-6c54f00c4049" containerID="4a95f2552040e23cb82237e42fc7d1dd8d4b744cffad543d8632a98ef3b52dce" exitCode=143
Jan 31 09:32:52 crc kubenswrapper[4732]: I0131 09:32:52.505806 4732 scope.go:117] "RemoveContainer" containerID="4a95f2552040e23cb82237e42fc7d1dd8d4b744cffad543d8632a98ef3b52dce"
Jan 31 09:32:52 crc kubenswrapper[4732]: I0131 09:32:52.505904 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qcgvm/must-gather-dmxpx"
Jan 31 09:32:52 crc kubenswrapper[4732]: I0131 09:32:52.524683 4732 scope.go:117] "RemoveContainer" containerID="b6141ba3580c93eb3c741c0bc8fb44dce9869080708fa256445df445377a390f"
Jan 31 09:32:52 crc kubenswrapper[4732]: I0131 09:32:52.525967 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qnsls\" (UniqueName: \"kubernetes.io/projected/b48d1b94-13fa-4c78-8391-6c54f00c4049-kube-api-access-qnsls\") on node \"crc\" DevicePath \"\""
Jan 31 09:32:52 crc kubenswrapper[4732]: I0131 09:32:52.526014 4732 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b48d1b94-13fa-4c78-8391-6c54f00c4049-must-gather-output\") on node \"crc\" DevicePath \"\""
Jan 31 09:32:52 crc kubenswrapper[4732]: I0131 09:32:52.554503 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b48d1b94-13fa-4c78-8391-6c54f00c4049" path="/var/lib/kubelet/pods/b48d1b94-13fa-4c78-8391-6c54f00c4049/volumes"
Jan 31 09:32:52 crc kubenswrapper[4732]: I0131 09:32:52.582684 4732 scope.go:117] "RemoveContainer" containerID="4a95f2552040e23cb82237e42fc7d1dd8d4b744cffad543d8632a98ef3b52dce"
Jan 31 09:32:52 crc kubenswrapper[4732]: E0131 09:32:52.584018 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a95f2552040e23cb82237e42fc7d1dd8d4b744cffad543d8632a98ef3b52dce\": container with ID starting with 4a95f2552040e23cb82237e42fc7d1dd8d4b744cffad543d8632a98ef3b52dce not found: ID does not exist" containerID="4a95f2552040e23cb82237e42fc7d1dd8d4b744cffad543d8632a98ef3b52dce"
Jan 31 09:32:52 crc kubenswrapper[4732]: I0131 09:32:52.584067 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a95f2552040e23cb82237e42fc7d1dd8d4b744cffad543d8632a98ef3b52dce"} err="failed to get container status \"4a95f2552040e23cb82237e42fc7d1dd8d4b744cffad543d8632a98ef3b52dce\": rpc error: code = NotFound desc = could not find container \"4a95f2552040e23cb82237e42fc7d1dd8d4b744cffad543d8632a98ef3b52dce\": container with ID starting with 4a95f2552040e23cb82237e42fc7d1dd8d4b744cffad543d8632a98ef3b52dce not found: ID does not exist"
Jan 31 09:32:52 crc kubenswrapper[4732]: I0131 09:32:52.584101 4732 scope.go:117] "RemoveContainer" containerID="b6141ba3580c93eb3c741c0bc8fb44dce9869080708fa256445df445377a390f"
Jan 31 09:32:52 crc kubenswrapper[4732]: E0131 09:32:52.584430 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6141ba3580c93eb3c741c0bc8fb44dce9869080708fa256445df445377a390f\": container with ID starting with b6141ba3580c93eb3c741c0bc8fb44dce9869080708fa256445df445377a390f not found: ID does not exist" containerID="b6141ba3580c93eb3c741c0bc8fb44dce9869080708fa256445df445377a390f"
Jan 31 09:32:52 crc kubenswrapper[4732]: I0131 09:32:52.584472 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6141ba3580c93eb3c741c0bc8fb44dce9869080708fa256445df445377a390f"} err="failed to get container status \"b6141ba3580c93eb3c741c0bc8fb44dce9869080708fa256445df445377a390f\": rpc error: code = NotFound desc = could not find container \"b6141ba3580c93eb3c741c0bc8fb44dce9869080708fa256445df445377a390f\": container with ID starting with b6141ba3580c93eb3c741c0bc8fb44dce9869080708fa256445df445377a390f not found: ID does not exist"
Jan 31 09:32:54 crc kubenswrapper[4732]: I0131 09:32:54.802216 4732 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wlbwz"]
Jan 31 09:32:54 crc kubenswrapper[4732]: E0131 09:32:54.802651 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b48d1b94-13fa-4c78-8391-6c54f00c4049" containerName="copy"
Jan 31 09:32:54 crc kubenswrapper[4732]: I0131 09:32:54.802690 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="b48d1b94-13fa-4c78-8391-6c54f00c4049" containerName="copy"
Jan 31 09:32:54 crc kubenswrapper[4732]: E0131 09:32:54.802728 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="892132d3-fb6d-4b47-b5ce-bfc23f479073" containerName="collect-profiles"
Jan 31 09:32:54 crc kubenswrapper[4732]: I0131 09:32:54.802749 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="892132d3-fb6d-4b47-b5ce-bfc23f479073" containerName="collect-profiles"
Jan 31 09:32:54 crc kubenswrapper[4732]: E0131 09:32:54.802768 4732 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b48d1b94-13fa-4c78-8391-6c54f00c4049" containerName="gather"
Jan 31 09:32:54 crc kubenswrapper[4732]: I0131 09:32:54.802782 4732 state_mem.go:107] "Deleted CPUSet assignment" podUID="b48d1b94-13fa-4c78-8391-6c54f00c4049" containerName="gather"
Jan 31 09:32:54 crc kubenswrapper[4732]: I0131 09:32:54.802954 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="b48d1b94-13fa-4c78-8391-6c54f00c4049" containerName="copy"
Jan 31 09:32:54 crc kubenswrapper[4732]: I0131 09:32:54.802968 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="b48d1b94-13fa-4c78-8391-6c54f00c4049" containerName="gather"
Jan 31 09:32:54 crc kubenswrapper[4732]: I0131 09:32:54.802983 4732 memory_manager.go:354] "RemoveStaleState removing state" podUID="892132d3-fb6d-4b47-b5ce-bfc23f479073" containerName="collect-profiles"
Jan 31 09:32:54 crc kubenswrapper[4732]: I0131 09:32:54.805429 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wlbwz"
Jan 31 09:32:54 crc kubenswrapper[4732]: I0131 09:32:54.823449 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wlbwz"]
Jan 31 09:32:54 crc kubenswrapper[4732]: I0131 09:32:54.966153 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0101775-0b8f-47b2-acaf-c422b1a1188f-utilities\") pod \"certified-operators-wlbwz\" (UID: \"d0101775-0b8f-47b2-acaf-c422b1a1188f\") " pod="openshift-marketplace/certified-operators-wlbwz"
Jan 31 09:32:54 crc kubenswrapper[4732]: I0131 09:32:54.966509 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0101775-0b8f-47b2-acaf-c422b1a1188f-catalog-content\") pod \"certified-operators-wlbwz\" (UID: \"d0101775-0b8f-47b2-acaf-c422b1a1188f\") " pod="openshift-marketplace/certified-operators-wlbwz"
Jan 31 09:32:54 crc kubenswrapper[4732]: I0131 09:32:54.966530 4732 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j228q\" (UniqueName: \"kubernetes.io/projected/d0101775-0b8f-47b2-acaf-c422b1a1188f-kube-api-access-j228q\") pod \"certified-operators-wlbwz\" (UID: \"d0101775-0b8f-47b2-acaf-c422b1a1188f\") " pod="openshift-marketplace/certified-operators-wlbwz"
Jan 31 09:32:55 crc kubenswrapper[4732]: I0131 09:32:55.067740 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0101775-0b8f-47b2-acaf-c422b1a1188f-utilities\") pod \"certified-operators-wlbwz\" (UID: \"d0101775-0b8f-47b2-acaf-c422b1a1188f\") " pod="openshift-marketplace/certified-operators-wlbwz"
Jan 31 09:32:55 crc kubenswrapper[4732]: I0131 09:32:55.067883 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0101775-0b8f-47b2-acaf-c422b1a1188f-catalog-content\") pod \"certified-operators-wlbwz\" (UID: \"d0101775-0b8f-47b2-acaf-c422b1a1188f\") " pod="openshift-marketplace/certified-operators-wlbwz"
Jan 31 09:32:55 crc kubenswrapper[4732]: I0131 09:32:55.067915 4732 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j228q\" (UniqueName: \"kubernetes.io/projected/d0101775-0b8f-47b2-acaf-c422b1a1188f-kube-api-access-j228q\") pod \"certified-operators-wlbwz\" (UID: \"d0101775-0b8f-47b2-acaf-c422b1a1188f\") " pod="openshift-marketplace/certified-operators-wlbwz"
Jan 31 09:32:55 crc kubenswrapper[4732]: I0131 09:32:55.068367 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0101775-0b8f-47b2-acaf-c422b1a1188f-utilities\") pod \"certified-operators-wlbwz\" (UID: \"d0101775-0b8f-47b2-acaf-c422b1a1188f\") " pod="openshift-marketplace/certified-operators-wlbwz"
Jan 31 09:32:55 crc kubenswrapper[4732]: I0131 09:32:55.068367 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0101775-0b8f-47b2-acaf-c422b1a1188f-catalog-content\") pod \"certified-operators-wlbwz\" (UID: \"d0101775-0b8f-47b2-acaf-c422b1a1188f\") " pod="openshift-marketplace/certified-operators-wlbwz"
Jan 31 09:32:55 crc kubenswrapper[4732]: I0131 09:32:55.092613 4732 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j228q\" (UniqueName: \"kubernetes.io/projected/d0101775-0b8f-47b2-acaf-c422b1a1188f-kube-api-access-j228q\") pod \"certified-operators-wlbwz\" (UID: \"d0101775-0b8f-47b2-acaf-c422b1a1188f\") " pod="openshift-marketplace/certified-operators-wlbwz"
Jan 31 09:32:55 crc kubenswrapper[4732]: I0131 09:32:55.125510 4732 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wlbwz"
Jan 31 09:32:55 crc kubenswrapper[4732]: I0131 09:32:55.382336 4732 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wlbwz"]
Jan 31 09:32:55 crc kubenswrapper[4732]: I0131 09:32:55.530161 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wlbwz" event={"ID":"d0101775-0b8f-47b2-acaf-c422b1a1188f","Type":"ContainerStarted","Data":"bc5ae176ccd3492572307abad461c21211d142fcaf11491ffb2592e0ae5c8294"}
Jan 31 09:32:56 crc kubenswrapper[4732]: I0131 09:32:56.536426 4732 generic.go:334] "Generic (PLEG): container finished" podID="d0101775-0b8f-47b2-acaf-c422b1a1188f" containerID="3f03626663f3253bfec58f7ab74ad941afd3c3098e2de1f44f24082a354aa84c" exitCode=0
Jan 31 09:32:56 crc kubenswrapper[4732]: I0131 09:32:56.536504 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wlbwz" event={"ID":"d0101775-0b8f-47b2-acaf-c422b1a1188f","Type":"ContainerDied","Data":"3f03626663f3253bfec58f7ab74ad941afd3c3098e2de1f44f24082a354aa84c"}
Jan 31 09:32:56 crc kubenswrapper[4732]: I0131 09:32:56.538189 4732 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Jan 31 09:32:57 crc kubenswrapper[4732]: I0131 09:32:57.547016 4732 generic.go:334] "Generic (PLEG): container finished" podID="d0101775-0b8f-47b2-acaf-c422b1a1188f" containerID="813d4fb6a7493690373fc86d275ae4556c0741965b81814d717d6f7d58fd1800" exitCode=0
Jan 31 09:32:57 crc kubenswrapper[4732]: I0131 09:32:57.547081 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wlbwz" event={"ID":"d0101775-0b8f-47b2-acaf-c422b1a1188f","Type":"ContainerDied","Data":"813d4fb6a7493690373fc86d275ae4556c0741965b81814d717d6f7d58fd1800"}
Jan 31 09:32:58 crc kubenswrapper[4732]: I0131 09:32:58.557154 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wlbwz" event={"ID":"d0101775-0b8f-47b2-acaf-c422b1a1188f","Type":"ContainerStarted","Data":"376a8a4ee9333d2dddafc2640b859ee72406f5f9c7eff661ffc6c4b286d9733c"}
Jan 31 09:32:58 crc kubenswrapper[4732]: I0131 09:32:58.580312 4732 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wlbwz" podStartSLOduration=3.156916318 podStartE2EDuration="4.5802903s" podCreationTimestamp="2026-01-31 09:32:54 +0000 UTC" firstStartedPulling="2026-01-31 09:32:56.53798164 +0000 UTC m=+1914.843857854" lastFinishedPulling="2026-01-31 09:32:57.961355622 +0000 UTC m=+1916.267231836" observedRunningTime="2026-01-31 09:32:58.577372161 +0000 UTC m=+1916.883248385" watchObservedRunningTime="2026-01-31 09:32:58.5802903 +0000 UTC m=+1916.886166515"
Jan 31 09:33:05 crc kubenswrapper[4732]: I0131 09:33:05.126173 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wlbwz"
Jan 31 09:33:05 crc kubenswrapper[4732]: I0131 09:33:05.127337 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-wlbwz"
Jan 31 09:33:05 crc kubenswrapper[4732]: I0131 09:33:05.185858 4732 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wlbwz"
Jan 31 09:33:05 crc kubenswrapper[4732]: I0131 09:33:05.651786 4732 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wlbwz"
Jan 31 09:33:05 crc kubenswrapper[4732]: I0131 09:33:05.695959 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wlbwz"]
Jan 31 09:33:07 crc kubenswrapper[4732]: I0131 09:33:07.624707 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-wlbwz" podUID="d0101775-0b8f-47b2-acaf-c422b1a1188f" containerName="registry-server" containerID="cri-o://376a8a4ee9333d2dddafc2640b859ee72406f5f9c7eff661ffc6c4b286d9733c" gracePeriod=2
Jan 31 09:33:08 crc kubenswrapper[4732]: I0131 09:33:08.040174 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wlbwz"
Jan 31 09:33:08 crc kubenswrapper[4732]: I0131 09:33:08.178134 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0101775-0b8f-47b2-acaf-c422b1a1188f-utilities\") pod \"d0101775-0b8f-47b2-acaf-c422b1a1188f\" (UID: \"d0101775-0b8f-47b2-acaf-c422b1a1188f\") "
Jan 31 09:33:08 crc kubenswrapper[4732]: I0131 09:33:08.178750 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0101775-0b8f-47b2-acaf-c422b1a1188f-catalog-content\") pod \"d0101775-0b8f-47b2-acaf-c422b1a1188f\" (UID: \"d0101775-0b8f-47b2-acaf-c422b1a1188f\") "
Jan 31 09:33:08 crc kubenswrapper[4732]: I0131 09:33:08.178816 4732 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j228q\" (UniqueName: \"kubernetes.io/projected/d0101775-0b8f-47b2-acaf-c422b1a1188f-kube-api-access-j228q\") pod \"d0101775-0b8f-47b2-acaf-c422b1a1188f\" (UID: \"d0101775-0b8f-47b2-acaf-c422b1a1188f\") "
Jan 31 09:33:08 crc kubenswrapper[4732]: I0131 09:33:08.179241 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0101775-0b8f-47b2-acaf-c422b1a1188f-utilities" (OuterVolumeSpecName: "utilities") pod "d0101775-0b8f-47b2-acaf-c422b1a1188f" (UID: "d0101775-0b8f-47b2-acaf-c422b1a1188f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 31 09:33:08 crc kubenswrapper[4732]: I0131 09:33:08.192980 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0101775-0b8f-47b2-acaf-c422b1a1188f-kube-api-access-j228q" (OuterVolumeSpecName: "kube-api-access-j228q") pod "d0101775-0b8f-47b2-acaf-c422b1a1188f" (UID: "d0101775-0b8f-47b2-acaf-c422b1a1188f"). InnerVolumeSpecName "kube-api-access-j228q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 09:33:08 crc kubenswrapper[4732]: I0131 09:33:08.280476 4732 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0101775-0b8f-47b2-acaf-c422b1a1188f-utilities\") on node \"crc\" DevicePath \"\""
Jan 31 09:33:08 crc kubenswrapper[4732]: I0131 09:33:08.280536 4732 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j228q\" (UniqueName: \"kubernetes.io/projected/d0101775-0b8f-47b2-acaf-c422b1a1188f-kube-api-access-j228q\") on node \"crc\" DevicePath \"\""
Jan 31 09:33:08 crc kubenswrapper[4732]: I0131 09:33:08.636920 4732 generic.go:334] "Generic (PLEG): container finished" podID="d0101775-0b8f-47b2-acaf-c422b1a1188f" containerID="376a8a4ee9333d2dddafc2640b859ee72406f5f9c7eff661ffc6c4b286d9733c" exitCode=0
Jan 31 09:33:08 crc kubenswrapper[4732]: I0131 09:33:08.637021 4732 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wlbwz"
Jan 31 09:33:08 crc kubenswrapper[4732]: I0131 09:33:08.637014 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wlbwz" event={"ID":"d0101775-0b8f-47b2-acaf-c422b1a1188f","Type":"ContainerDied","Data":"376a8a4ee9333d2dddafc2640b859ee72406f5f9c7eff661ffc6c4b286d9733c"}
Jan 31 09:33:08 crc kubenswrapper[4732]: I0131 09:33:08.637115 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wlbwz" event={"ID":"d0101775-0b8f-47b2-acaf-c422b1a1188f","Type":"ContainerDied","Data":"bc5ae176ccd3492572307abad461c21211d142fcaf11491ffb2592e0ae5c8294"}
Jan 31 09:33:08 crc kubenswrapper[4732]: I0131 09:33:08.637168 4732 scope.go:117] "RemoveContainer" containerID="376a8a4ee9333d2dddafc2640b859ee72406f5f9c7eff661ffc6c4b286d9733c"
Jan 31 09:33:08 crc kubenswrapper[4732]: I0131 09:33:08.664309 4732 scope.go:117] "RemoveContainer" containerID="813d4fb6a7493690373fc86d275ae4556c0741965b81814d717d6f7d58fd1800"
Jan 31 09:33:08 crc kubenswrapper[4732]: I0131 09:33:08.699432 4732 scope.go:117] "RemoveContainer" containerID="3f03626663f3253bfec58f7ab74ad941afd3c3098e2de1f44f24082a354aa84c"
Jan 31 09:33:08 crc kubenswrapper[4732]: I0131 09:33:08.725439 4732 scope.go:117] "RemoveContainer" containerID="376a8a4ee9333d2dddafc2640b859ee72406f5f9c7eff661ffc6c4b286d9733c"
Jan 31 09:33:08 crc kubenswrapper[4732]: E0131 09:33:08.726170 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"376a8a4ee9333d2dddafc2640b859ee72406f5f9c7eff661ffc6c4b286d9733c\": container with ID starting with 376a8a4ee9333d2dddafc2640b859ee72406f5f9c7eff661ffc6c4b286d9733c not found: ID does not exist" containerID="376a8a4ee9333d2dddafc2640b859ee72406f5f9c7eff661ffc6c4b286d9733c"
Jan 31 09:33:08 crc kubenswrapper[4732]: I0131 09:33:08.726259 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"376a8a4ee9333d2dddafc2640b859ee72406f5f9c7eff661ffc6c4b286d9733c"} err="failed to get container status \"376a8a4ee9333d2dddafc2640b859ee72406f5f9c7eff661ffc6c4b286d9733c\": rpc error: code = NotFound desc = could not find container \"376a8a4ee9333d2dddafc2640b859ee72406f5f9c7eff661ffc6c4b286d9733c\": container with ID starting with 376a8a4ee9333d2dddafc2640b859ee72406f5f9c7eff661ffc6c4b286d9733c not found: ID does not exist"
Jan 31 09:33:08 crc kubenswrapper[4732]: I0131 09:33:08.726310 4732 scope.go:117] "RemoveContainer" containerID="813d4fb6a7493690373fc86d275ae4556c0741965b81814d717d6f7d58fd1800"
Jan 31 09:33:08 crc kubenswrapper[4732]: E0131 09:33:08.727198 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"813d4fb6a7493690373fc86d275ae4556c0741965b81814d717d6f7d58fd1800\": container with ID starting with 813d4fb6a7493690373fc86d275ae4556c0741965b81814d717d6f7d58fd1800 not found: ID does not exist" containerID="813d4fb6a7493690373fc86d275ae4556c0741965b81814d717d6f7d58fd1800"
Jan 31 09:33:08 crc kubenswrapper[4732]: I0131 09:33:08.727238 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"813d4fb6a7493690373fc86d275ae4556c0741965b81814d717d6f7d58fd1800"} err="failed to get container status \"813d4fb6a7493690373fc86d275ae4556c0741965b81814d717d6f7d58fd1800\": rpc error: code = NotFound desc = could not find container \"813d4fb6a7493690373fc86d275ae4556c0741965b81814d717d6f7d58fd1800\": container with ID starting with 813d4fb6a7493690373fc86d275ae4556c0741965b81814d717d6f7d58fd1800 not found: ID does not exist"
Jan 31 09:33:08 crc kubenswrapper[4732]: I0131 09:33:08.727258 4732 scope.go:117] "RemoveContainer" containerID="3f03626663f3253bfec58f7ab74ad941afd3c3098e2de1f44f24082a354aa84c"
Jan 31 09:33:08 crc kubenswrapper[4732]: E0131 09:33:08.727593 4732 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f03626663f3253bfec58f7ab74ad941afd3c3098e2de1f44f24082a354aa84c\": container with ID starting with 3f03626663f3253bfec58f7ab74ad941afd3c3098e2de1f44f24082a354aa84c not found: ID does not exist" containerID="3f03626663f3253bfec58f7ab74ad941afd3c3098e2de1f44f24082a354aa84c"
Jan 31 09:33:08 crc kubenswrapper[4732]: I0131 09:33:08.727682 4732 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f03626663f3253bfec58f7ab74ad941afd3c3098e2de1f44f24082a354aa84c"} err="failed to get container status \"3f03626663f3253bfec58f7ab74ad941afd3c3098e2de1f44f24082a354aa84c\": rpc error: code = NotFound desc = could not find container \"3f03626663f3253bfec58f7ab74ad941afd3c3098e2de1f44f24082a354aa84c\": container with ID starting with 3f03626663f3253bfec58f7ab74ad941afd3c3098e2de1f44f24082a354aa84c not found: ID does not exist"
Jan 31 09:33:08 crc kubenswrapper[4732]: I0131 09:33:08.921064 4732 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0101775-0b8f-47b2-acaf-c422b1a1188f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d0101775-0b8f-47b2-acaf-c422b1a1188f" (UID: "d0101775-0b8f-47b2-acaf-c422b1a1188f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 31 09:33:08 crc kubenswrapper[4732]: I0131 09:33:08.982272 4732 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wlbwz"]
Jan 31 09:33:08 crc kubenswrapper[4732]: I0131 09:33:08.989543 4732 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-wlbwz"]
Jan 31 09:33:08 crc kubenswrapper[4732]: I0131 09:33:08.990827 4732 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0101775-0b8f-47b2-acaf-c422b1a1188f-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 31 09:33:10 crc kubenswrapper[4732]: I0131 09:33:10.553891 4732 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0101775-0b8f-47b2-acaf-c422b1a1188f" path="/var/lib/kubelet/pods/d0101775-0b8f-47b2-acaf-c422b1a1188f/volumes"
Jan 31 09:33:47 crc kubenswrapper[4732]: I0131 09:33:47.498474 4732 patch_prober.go:28] interesting pod/machine-config-daemon-jnbt8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 31 09:33:47 crc kubenswrapper[4732]: I0131 09:33:47.499405 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" podUID="7d790207-d357-4b47-87bf-5b505e061820" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 31 09:34:17 crc kubenswrapper[4732]: I0131 09:34:17.498247 4732 patch_prober.go:28] interesting pod/machine-config-daemon-jnbt8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 31 09:34:17 crc kubenswrapper[4732]: I0131 09:34:17.498943 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" podUID="7d790207-d357-4b47-87bf-5b505e061820" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 31 09:34:47 crc kubenswrapper[4732]: I0131 09:34:47.498173 4732 patch_prober.go:28] interesting pod/machine-config-daemon-jnbt8 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 31 09:34:47 crc kubenswrapper[4732]: I0131 09:34:47.498839 4732 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" podUID="7d790207-d357-4b47-87bf-5b505e061820" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 31 09:34:47 crc kubenswrapper[4732]: I0131 09:34:47.498914 4732 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8"
Jan 31 09:34:47 crc kubenswrapper[4732]: I0131 09:34:47.499954 4732 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9ab32081e481444933bd2595cf8dbdf7ca3690cb1b45450fb70f36da385fe1c0"} pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Jan 31 09:34:47 crc kubenswrapper[4732]: I0131 09:34:47.500055 4732 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" podUID="7d790207-d357-4b47-87bf-5b505e061820" containerName="machine-config-daemon" containerID="cri-o://9ab32081e481444933bd2595cf8dbdf7ca3690cb1b45450fb70f36da385fe1c0" gracePeriod=600
Jan 31 09:34:48 crc kubenswrapper[4732]: I0131 09:34:48.492652 4732 generic.go:334] "Generic (PLEG): container finished" podID="7d790207-d357-4b47-87bf-5b505e061820" containerID="9ab32081e481444933bd2595cf8dbdf7ca3690cb1b45450fb70f36da385fe1c0" exitCode=0
Jan 31 09:34:48 crc kubenswrapper[4732]: I0131 09:34:48.492707 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" event={"ID":"7d790207-d357-4b47-87bf-5b505e061820","Type":"ContainerDied","Data":"9ab32081e481444933bd2595cf8dbdf7ca3690cb1b45450fb70f36da385fe1c0"}
Jan 31 09:34:48 crc kubenswrapper[4732]: I0131 09:34:48.493187 4732 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-jnbt8" event={"ID":"7d790207-d357-4b47-87bf-5b505e061820","Type":"ContainerStarted","Data":"07e2ea90317216d5e6b666282e12273c6b54878cbc22c9597027546751c09788"}
Jan 31 09:34:48 crc kubenswrapper[4732]: I0131 09:34:48.493224 4732 scope.go:117] "RemoveContainer" containerID="3dcf08eedc212c9c2a071b251d94a53241ef18f3503ef5e5c6a6b83f842c1e77"